Dec 05 12:29:17.873957 master-0 systemd[1]: Starting Kubernetes Kubelet...
Dec 05 12:29:18.108420 master-0 kubenswrapper[4780]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 05 12:29:18.109767 master-0 kubenswrapper[4780]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Dec 05 12:29:18.109767 master-0 kubenswrapper[4780]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 05 12:29:18.109767 master-0 kubenswrapper[4780]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 05 12:29:18.109767 master-0 kubenswrapper[4780]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Dec 05 12:29:18.109767 master-0 kubenswrapper[4780]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 05 12:29:18.111022 master-0 kubenswrapper[4780]: I1205 12:29:18.110815 4780 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 05 12:29:18.116962 master-0 kubenswrapper[4780]: W1205 12:29:18.116906 4780 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 05 12:29:18.116962 master-0 kubenswrapper[4780]: W1205 12:29:18.116949 4780 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 05 12:29:18.116962 master-0 kubenswrapper[4780]: W1205 12:29:18.116964 4780 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.116978 4780 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.116987 4780 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.116996 4780 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117005 4780 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117013 4780 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117021 4780 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117029 4780 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117036 4780 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117044 4780 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117052 4780 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117059 4780 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117067 4780 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117075 4780 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117083 4780 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117090 4780 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117098 4780 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117106 4780 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117114 4780 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117124 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 12:29:18.117168 master-0 kubenswrapper[4780]: W1205 12:29:18.117132 4780 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117140 4780 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117151 4780 feature_gate.go:353] Setting GA 
feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117160 4780 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117170 4780 feature_gate.go:330] unrecognized feature gate: Example Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117208 4780 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117217 4780 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117227 4780 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117235 4780 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117244 4780 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117256 4780 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117267 4780 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117275 4780 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117285 4780 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117293 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117301 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117309 4780 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117318 4780 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117326 4780 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 12:29:18.118279 master-0 kubenswrapper[4780]: W1205 12:29:18.117336 4780 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117344 4780 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117351 4780 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117359 4780 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117367 4780 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117375 4780 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117383 4780 feature_gate.go:330] unrecognized feature gate: 
CSIDriverSharedResource Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117393 4780 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117402 4780 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117410 4780 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117418 4780 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117426 4780 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117434 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117442 4780 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117450 4780 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117460 4780 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117468 4780 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117475 4780 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117483 4780 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 12:29:18.119336 master-0 kubenswrapper[4780]: W1205 12:29:18.117491 4780 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: W1205 12:29:18.117499 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: W1205 12:29:18.117507 4780 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: W1205 12:29:18.117515 4780 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: W1205 12:29:18.117522 4780 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: W1205 12:29:18.117530 4780 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: W1205 12:29:18.117538 4780 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: W1205 12:29:18.117546 4780 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: W1205 12:29:18.117553 4780 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: W1205 12:29:18.117561 4780 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: W1205 12:29:18.117568 4780 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 12:29:18.120315 master-0 
kubenswrapper[4780]: W1205 12:29:18.117577 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: I1205 12:29:18.118521 4780 flags.go:64] FLAG: --address="0.0.0.0" Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: I1205 12:29:18.118552 4780 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: I1205 12:29:18.118577 4780 flags.go:64] FLAG: --anonymous-auth="true" Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: I1205 12:29:18.118589 4780 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: I1205 12:29:18.118602 4780 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: I1205 12:29:18.118613 4780 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: I1205 12:29:18.118640 4780 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: I1205 12:29:18.118656 4780 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: I1205 12:29:18.118666 4780 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 05 12:29:18.120315 master-0 kubenswrapper[4780]: I1205 12:29:18.118675 4780 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118685 4780 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118694 4780 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118703 4780 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118715 4780 flags.go:64] FLAG: --cgroup-root="" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118724 4780 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118734 4780 flags.go:64] FLAG: --client-ca-file="" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118744 4780 flags.go:64] FLAG: --cloud-config="" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118753 4780 flags.go:64] FLAG: --cloud-provider="" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118761 4780 flags.go:64] FLAG: --cluster-dns="[]" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118781 4780 flags.go:64] FLAG: --cluster-domain="" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118790 4780 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118800 4780 flags.go:64] FLAG: --config-dir="" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118809 4780 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118819 4780 flags.go:64] FLAG: --container-log-max-files="5" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118837 4780 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118846 4780 flags.go:64] FLAG: 
--container-runtime-endpoint="/var/run/crio/crio.sock" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118856 4780 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118866 4780 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118875 4780 flags.go:64] FLAG: --contention-profiling="false" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118884 4780 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118893 4780 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118904 4780 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118912 4780 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118924 4780 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 05 12:29:18.121376 master-0 kubenswrapper[4780]: I1205 12:29:18.118933 4780 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.118942 4780 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.118952 4780 flags.go:64] FLAG: --enable-load-reader="false" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.118962 4780 flags.go:64] FLAG: --enable-server="true" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.118971 4780 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.118988 4780 flags.go:64] FLAG: --event-burst="100" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.118998 4780 flags.go:64] FLAG: --event-qps="50" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119007 4780 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119030 4780 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119039 4780 flags.go:64] FLAG: --eviction-hard="" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119050 4780 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119060 4780 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119069 4780 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119081 4780 flags.go:64] FLAG: --eviction-soft="" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119090 4780 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119099 4780 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119108 4780 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119117 4780 flags.go:64] FLAG: --experimental-mounter-path="" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119126 4780 
flags.go:64] FLAG: --fail-cgroupv1="false" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119135 4780 flags.go:64] FLAG: --fail-swap-on="true" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119144 4780 flags.go:64] FLAG: --feature-gates="" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119156 4780 flags.go:64] FLAG: --file-check-frequency="20s" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119165 4780 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119210 4780 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119224 4780 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119236 4780 flags.go:64] FLAG: --healthz-port="10248" Dec 05 12:29:18.122565 master-0 kubenswrapper[4780]: I1205 12:29:18.119248 4780 flags.go:64] FLAG: --help="false" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119260 4780 flags.go:64] FLAG: --hostname-override="" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119271 4780 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119284 4780 flags.go:64] FLAG: --http-check-frequency="20s" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119296 4780 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119307 4780 flags.go:64] FLAG: --image-credential-provider-config="" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119319 4780 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119331 4780 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119343 4780 flags.go:64] FLAG: --image-service-endpoint="" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119355 4780 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119368 4780 flags.go:64] FLAG: --kube-api-burst="100" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119380 4780 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119394 4780 flags.go:64] FLAG: --kube-api-qps="50" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119407 4780 flags.go:64] FLAG: --kube-reserved="" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119419 4780 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119430 4780 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119442 4780 flags.go:64] FLAG: --kubelet-cgroups="" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119452 4780 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119486 4780 flags.go:64] FLAG: --lock-file="" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119496 4780 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 05 12:29:18.124021 master-0 
kubenswrapper[4780]: I1205 12:29:18.119505 4780 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119515 4780 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119529 4780 flags.go:64] FLAG: --log-json-split-stream="false" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119538 4780 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119547 4780 flags.go:64] FLAG: --log-text-split-stream="false" Dec 05 12:29:18.124021 master-0 kubenswrapper[4780]: I1205 12:29:18.119556 4780 flags.go:64] FLAG: --logging-format="text" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119565 4780 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119575 4780 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119583 4780 flags.go:64] FLAG: --manifest-url="" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119593 4780 flags.go:64] FLAG: --manifest-url-header="" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119606 4780 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119615 4780 flags.go:64] FLAG: --max-open-files="1000000" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119626 4780 flags.go:64] FLAG: --max-pods="110" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119635 4780 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119645 4780 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119654 4780 flags.go:64] FLAG: --memory-manager-policy="None" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119663 4780 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119672 4780 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119682 4780 flags.go:64] FLAG: --node-ip="192.168.32.10" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119691 4780 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119712 4780 flags.go:64] FLAG: --node-status-max-images="50" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119721 4780 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119731 4780 flags.go:64] FLAG: --oom-score-adj="-999" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119740 4780 flags.go:64] FLAG: --pod-cidr="" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119749 4780 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a70b2a95140d1e90978f36cc9889013ae34bd232662c5424002274385669ed9" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119769 4780 flags.go:64] FLAG: --pod-manifest-path="" Dec 05 
12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119778 4780 flags.go:64] FLAG: --pod-max-pids="-1" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119787 4780 flags.go:64] FLAG: --pods-per-core="0" Dec 05 12:29:18.125743 master-0 kubenswrapper[4780]: I1205 12:29:18.119796 4780 flags.go:64] FLAG: --port="10250" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119805 4780 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119815 4780 flags.go:64] FLAG: --provider-id="" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119824 4780 flags.go:64] FLAG: --qos-reserved="" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119834 4780 flags.go:64] FLAG: --read-only-port="10255" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119843 4780 flags.go:64] FLAG: --register-node="true" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119856 4780 flags.go:64] FLAG: --register-schedulable="true" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119866 4780 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119884 4780 flags.go:64] FLAG: --registry-burst="10" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119894 4780 flags.go:64] FLAG: --registry-qps="5" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119903 4780 flags.go:64] FLAG: --reserved-cpus="" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119912 4780 flags.go:64] FLAG: --reserved-memory="" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119924 4780 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119934 4780 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119943 4780 flags.go:64] FLAG: --rotate-certificates="false" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119952 4780 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119961 4780 flags.go:64] FLAG: --runonce="false" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119970 4780 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119979 4780 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119989 4780 flags.go:64] FLAG: --seccomp-default="false" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.119998 4780 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.120007 4780 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.120017 4780 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.120026 4780 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.120035 4780 flags.go:64] FLAG: --storage-driver-password="root" Dec 05 12:29:18.127263 master-0 kubenswrapper[4780]: I1205 12:29:18.120044 4780 flags.go:64] FLAG: 
--storage-driver-secure="false" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120053 4780 flags.go:64] FLAG: --storage-driver-table="stats" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120063 4780 flags.go:64] FLAG: --storage-driver-user="root" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120072 4780 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120082 4780 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120091 4780 flags.go:64] FLAG: --system-cgroups="" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120101 4780 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120116 4780 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120124 4780 flags.go:64] FLAG: --tls-cert-file="" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120135 4780 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120149 4780 flags.go:64] FLAG: --tls-min-version="" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120158 4780 flags.go:64] FLAG: --tls-private-key-file="" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120167 4780 flags.go:64] FLAG: --topology-manager-policy="none" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120205 4780 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120215 4780 flags.go:64] FLAG: --topology-manager-scope="container" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120224 4780 flags.go:64] FLAG: --v="2" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120237 4780 flags.go:64] FLAG: --version="false" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120250 4780 flags.go:64] FLAG: --vmodule="" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120262 4780 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: I1205 12:29:18.120272 4780 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: W1205 12:29:18.120503 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: W1205 12:29:18.120520 4780 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: W1205 12:29:18.120574 4780 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: W1205 12:29:18.120589 4780 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 12:29:18.128636 master-0 kubenswrapper[4780]: W1205 12:29:18.120600 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120612 4780 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120623 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120634 4780 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120645 4780 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120657 4780 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120668 4780 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120681 4780 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120694 4780 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120705 4780 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120713 4780 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120721 4780 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120729 4780 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120737 4780 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120746 4780 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120754 4780 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120762 4780 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120774 4780 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120783 4780 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120791 4780 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 12:29:18.130019 master-0 kubenswrapper[4780]: W1205 12:29:18.120800 4780 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120807 
4780 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120815 4780 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120826 4780 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120836 4780 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120846 4780 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120855 4780 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120864 4780 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120872 4780 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120883 4780 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120892 4780 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120900 4780 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120908 4780 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120916 4780 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120924 4780 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120932 4780 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120941 4780 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120948 4780 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120956 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 12:29:18.131536 master-0 kubenswrapper[4780]: W1205 12:29:18.120967 4780 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.120977 4780 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.120986 4780 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.120995 4780 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121004 4780 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121012 4780 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121021 4780 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121032 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121041 4780 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121048 4780 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121059 4780 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121067 4780 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121075 4780 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121083 4780 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121115 4780 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121123 4780 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121130 4780 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121139 4780 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121147 4780 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121155 4780 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 12:29:18.132673 master-0 kubenswrapper[4780]: W1205 12:29:18.121162 4780 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 12:29:18.133640 master-0 kubenswrapper[4780]: W1205 12:29:18.121170 4780 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 12:29:18.133640 master-0 kubenswrapper[4780]: W1205 12:29:18.121205 4780 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 12:29:18.133640 master-0 kubenswrapper[4780]: W1205 12:29:18.121213 4780 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 12:29:18.133640 master-0 kubenswrapper[4780]: W1205 12:29:18.121221 4780 feature_gate.go:330] 
unrecognized feature gate: Example Dec 05 12:29:18.133640 master-0 kubenswrapper[4780]: W1205 12:29:18.121229 4780 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 12:29:18.133640 master-0 kubenswrapper[4780]: W1205 12:29:18.121255 4780 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 12:29:18.133640 master-0 kubenswrapper[4780]: W1205 12:29:18.121263 4780 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 12:29:18.133640 master-0 kubenswrapper[4780]: W1205 12:29:18.121272 4780 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 12:29:18.133640 master-0 kubenswrapper[4780]: I1205 12:29:18.121298 4780 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 12:29:18.133640 master-0 kubenswrapper[4780]: I1205 12:29:18.131488 4780 server.go:491] "Kubelet version" kubeletVersion="v1.31.13" Dec 05 12:29:18.133640 master-0 kubenswrapper[4780]: I1205 12:29:18.131523 4780 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 05 12:29:18.133640 master-0 kubenswrapper[4780]: W1205 12:29:18.131665 4780 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 05 12:29:18.133640 master-0 kubenswrapper[4780]: W1205 12:29:18.131680 4780 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 12:29:18.133640 master-0 kubenswrapper[4780]: W1205 12:29:18.131692 4780 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 12:29:18.133640 master-0 kubenswrapper[4780]: W1205 12:29:18.131705 4780 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131717 4780 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131729 4780 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131740 4780 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131751 4780 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131761 4780 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131772 4780 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131781 4780 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131789 4780 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131796 4780 feature_gate.go:330] unrecognized feature gate: 
SigstoreImageVerification Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131805 4780 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131813 4780 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131821 4780 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131829 4780 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131836 4780 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131844 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131852 4780 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131860 4780 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131867 4780 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131875 4780 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 12:29:18.134643 master-0 kubenswrapper[4780]: W1205 12:29:18.131883 4780 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.131891 4780 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.131898 4780 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.131908 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.131916 4780 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.131923 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.131932 4780 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.131940 4780 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.131948 4780 feature_gate.go:330] unrecognized feature gate: Example Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.131956 4780 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.131965 4780 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.131974 4780 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.131985 4780 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.131994 4780 feature_gate.go:330] 
unrecognized feature gate: VolumeGroupSnapshot Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.132004 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.132012 4780 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.132023 4780 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.132032 4780 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.132041 4780 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 12:29:18.135574 master-0 kubenswrapper[4780]: W1205 12:29:18.132051 4780 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132061 4780 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132071 4780 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132080 4780 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132088 4780 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132098 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132108 4780 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132117 4780 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132128 4780 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132138 4780 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132148 4780 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132161 4780 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132174 4780 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132219 4780 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132229 4780 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132238 4780 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132248 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132256 4780 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132264 4780 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 12:29:18.136679 master-0 kubenswrapper[4780]: W1205 12:29:18.132272 4780 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 12:29:18.138029 master-0 kubenswrapper[4780]: W1205 12:29:18.132280 4780 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 12:29:18.138029 master-0 kubenswrapper[4780]: W1205 12:29:18.132288 4780 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 12:29:18.138029 master-0 kubenswrapper[4780]: W1205 12:29:18.132298 4780 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 12:29:18.138029 master-0 kubenswrapper[4780]: W1205 12:29:18.132308 4780 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 12:29:18.138029 master-0 kubenswrapper[4780]: W1205 12:29:18.132318 4780 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 12:29:18.138029 master-0 kubenswrapper[4780]: W1205 12:29:18.132328 4780 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 12:29:18.138029 master-0 kubenswrapper[4780]: W1205 12:29:18.132340 4780 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 12:29:18.138029 master-0 kubenswrapper[4780]: W1205 12:29:18.132351 4780 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 12:29:18.138029 master-0 kubenswrapper[4780]: W1205 12:29:18.132365 4780 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 12:29:18.138029 master-0 kubenswrapper[4780]: W1205 12:29:18.132376 4780 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 12:29:18.138029 master-0 kubenswrapper[4780]: I1205 12:29:18.132390 4780 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 12:29:18.138029 master-0 kubenswrapper[4780]: W1205 12:29:18.132750 4780 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 12:29:18.138029 master-0 kubenswrapper[4780]: W1205 12:29:18.132769 4780 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 12:29:18.138029 master-0 kubenswrapper[4780]: W1205 12:29:18.132779 4780 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 12:29:18.138029 master-0 kubenswrapper[4780]: W1205 12:29:18.132788 4780 
feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 12:29:18.138029 master-0 kubenswrapper[4780]: W1205 12:29:18.132797 4780 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132805 4780 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132812 4780 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132820 4780 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132829 4780 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132836 4780 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132844 4780 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132852 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132862 4780 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132872 4780 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132881 4780 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132891 4780 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132900 4780 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132912 4780 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132922 4780 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132931 4780 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132940 4780 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132950 4780 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132959 4780 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132968 4780 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 12:29:18.139448 master-0 kubenswrapper[4780]: W1205 12:29:18.132976 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.132986 4780 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.132998 4780 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133007 4780 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133015 4780 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133024 4780 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133032 4780 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133041 4780 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133049 4780 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133064 4780 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133077 4780 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133089 4780 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133102 4780 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133113 4780 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133126 4780 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133141 4780 feature_gate.go:330] unrecognized feature gate: Example Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133152 4780 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133162 4780 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133170 4780 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 12:29:18.140738 master-0 kubenswrapper[4780]: W1205 12:29:18.133208 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133217 4780 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133225 4780 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133234 4780 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133243 4780 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133253 4780 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133264 4780 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133275 4780 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133285 4780 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133293 4780 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133301 4780 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133309 4780 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133318 4780 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133326 4780 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133334 4780 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133342 4780 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133349 4780 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133360 4780 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
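The flood of warnings above is the kubelet registering and then overriding its feature-gate map: names this binary was never compiled with are reported as "unrecognized feature gate", while gates that are already GA or deprecated can still be set but are logged with a removal notice, and the effective map is then dumped at feature_gate.go:386. A minimal Go sketch of that register-then-override flow, assuming the upstream k8s.io/component-base/featuregate API (gate names and values are copied from the logged map; this is not the kubelet's or OpenShift's actual wiring, and it needs the k8s.io/component-base module on the build path):

```go
package main

import (
	"fmt"

	"k8s.io/component-base/featuregate"
)

func main() {
	// Fresh gate registry, as a component would create at startup.
	gates := featuregate.NewFeatureGate()

	// Register a few gates with their defaults; names and values mirror
	// the map logged at feature_gate.go:386 above.
	if err := gates.Add(map[featuregate.Feature]featuregate.FeatureSpec{
		"KMSv1":                                  {Default: false, PreRelease: featuregate.Deprecated},
		"CloudDualStackNodeIPs":                  {Default: true, PreRelease: featuregate.GA},
		"DisableKubeletCloudCredentialProviders": {Default: true, PreRelease: featuregate.GA},
	}); err != nil {
		panic(err)
	}

	// Apply an override map, as the kubelet does from its configuration.
	// Overriding a deprecated gate succeeds but logs a removal warning.
	if err := gates.SetFromMap(map[string]bool{"KMSv1": true}); err != nil {
		fmt.Println("override failed:", err)
		return
	}

	fmt.Println("KMSv1 enabled:", gates.Enabled("KMSv1"))
}
```

Under those assumptions, the override should emit the same kind of "Setting deprecated feature gate KMSv1=true" warning seen at feature_gate.go:351 above, and a name that was never passed to Add would come back from SetFromMap as unrecognized.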
Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133369 4780 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133378 4780 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 12:29:18.141776 master-0 kubenswrapper[4780]: W1205 12:29:18.133387 4780 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 12:29:18.142876 master-0 kubenswrapper[4780]: W1205 12:29:18.133395 4780 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 12:29:18.142876 master-0 kubenswrapper[4780]: W1205 12:29:18.133404 4780 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 12:29:18.142876 master-0 kubenswrapper[4780]: W1205 12:29:18.133412 4780 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 12:29:18.142876 master-0 kubenswrapper[4780]: W1205 12:29:18.133421 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 12:29:18.142876 master-0 kubenswrapper[4780]: W1205 12:29:18.133433 4780 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 12:29:18.142876 master-0 kubenswrapper[4780]: W1205 12:29:18.133445 4780 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 12:29:18.142876 master-0 kubenswrapper[4780]: W1205 12:29:18.133455 4780 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 12:29:18.142876 master-0 kubenswrapper[4780]: W1205 12:29:18.133463 4780 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 12:29:18.142876 master-0 kubenswrapper[4780]: I1205 12:29:18.133475 4780 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 12:29:18.142876 master-0 kubenswrapper[4780]: I1205 12:29:18.134134 4780 server.go:940] "Client rotation is on, will bootstrap in background" Dec 05 12:29:18.142876 master-0 kubenswrapper[4780]: I1205 12:29:18.137667 4780 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Dec 05 12:29:18.142876 master-0 kubenswrapper[4780]: I1205 12:29:18.138895 4780 server.go:997] "Starting client certificate rotation" Dec 05 12:29:18.142876 master-0 kubenswrapper[4780]: I1205 12:29:18.138942 4780 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 05 12:29:18.142876 master-0 kubenswrapper[4780]: I1205 12:29:18.139167 4780 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 05 12:29:18.147244 master-0 kubenswrapper[4780]: I1205 12:29:18.147165 4780 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 12:29:18.151035 master-0 kubenswrapper[4780]: I1205 12:29:18.150966 4780 dynamic_cafile_content.go:161] "Starting controller" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 12:29:18.151495 master-0 kubenswrapper[4780]: E1205 12:29:18.151413 4780 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:18.161758 master-0 kubenswrapper[4780]: I1205 12:29:18.161704 4780 log.go:25] "Validated CRI v1 runtime API" Dec 05 12:29:18.165053 master-0 kubenswrapper[4780]: I1205 12:29:18.165005 4780 log.go:25] "Validated CRI v1 image API" Dec 05 12:29:18.167402 master-0 kubenswrapper[4780]: I1205 12:29:18.167287 4780 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 05 12:29:18.170545 master-0 kubenswrapper[4780]: I1205 12:29:18.170480 4780 fs.go:135] Filesystem UUIDs: map[4623d87d-4611-48ee-a0ce-68b00f5d84bd:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Dec 05 12:29:18.170545 master-0 kubenswrapper[4780]: I1205 12:29:18.170522 4780 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}] Dec 05 12:29:18.188079 master-0 kubenswrapper[4780]: I1205 12:29:18.187730 4780 manager.go:217] Machine: {Timestamp:2025-12-05 12:29:18.186397227 +0000 UTC m=+0.209137973 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2799998 MemoryCapacity:50514145280 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:7ed1cb80ed224980aa762c96e2471f55 SystemUUID:7ed1cb80-ed22-4980-aa76-2c96e2471f55 BootID:195a1d65-51c2-44ad-9194-26630da59f9f Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257070592 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102829056 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257074688 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:27:b3:a6 Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:d3:8e:e6 Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:8a:28:cb:43:ed:9b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514145280 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 
NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} 
{Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 05 12:29:18.188079 master-0 kubenswrapper[4780]: I1205 12:29:18.188029 4780 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 05 12:29:18.188286 master-0 kubenswrapper[4780]: I1205 12:29:18.188260 4780 manager.go:233] Version: {KernelVersion:5.14.0-427.100.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202511170715-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 05 12:29:18.188887 master-0 kubenswrapper[4780]: I1205 12:29:18.188842 4780 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 05 12:29:18.189098 master-0 kubenswrapper[4780]: I1205 12:29:18.189042 4780 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 05 12:29:18.189358 master-0 kubenswrapper[4780]: I1205 12:29:18.189082 4780 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 05 12:29:18.189418 master-0 kubenswrapper[4780]: I1205 12:29:18.189375 4780 topology_manager.go:138] "Creating topology manager with none policy" Dec 05 12:29:18.189418 master-0 kubenswrapper[4780]: I1205 12:29:18.189385 4780 container_manager_linux.go:303] "Creating device plugin manager" Dec 05 12:29:18.189702 master-0 kubenswrapper[4780]: I1205 12:29:18.189612 4780 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 05 12:29:18.189702 master-0 kubenswrapper[4780]: I1205 12:29:18.189641 4780 server.go:66] "Creating device plugin 
registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 05 12:29:18.189911 master-0 kubenswrapper[4780]: I1205 12:29:18.189892 4780 state_mem.go:36] "Initialized new in-memory state store" Dec 05 12:29:18.190973 master-0 kubenswrapper[4780]: I1205 12:29:18.190001 4780 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 05 12:29:18.190973 master-0 kubenswrapper[4780]: I1205 12:29:18.190896 4780 kubelet.go:418] "Attempting to sync node with API server" Dec 05 12:29:18.190973 master-0 kubenswrapper[4780]: I1205 12:29:18.190914 4780 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 05 12:29:18.190973 master-0 kubenswrapper[4780]: I1205 12:29:18.190935 4780 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 05 12:29:18.190973 master-0 kubenswrapper[4780]: I1205 12:29:18.190948 4780 kubelet.go:324] "Adding apiserver pod source" Dec 05 12:29:18.190973 master-0 kubenswrapper[4780]: I1205 12:29:18.190961 4780 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 05 12:29:18.193239 master-0 kubenswrapper[4780]: I1205 12:29:18.193163 4780 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-2.rhaos4.18.git15789b8.el9" apiVersion="v1" Dec 05 12:29:18.194292 master-0 kubenswrapper[4780]: I1205 12:29:18.194110 4780 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 05 12:29:18.194667 master-0 kubenswrapper[4780]: I1205 12:29:18.194620 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 05 12:29:18.194667 master-0 kubenswrapper[4780]: I1205 12:29:18.194643 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 05 12:29:18.194780 master-0 kubenswrapper[4780]: I1205 12:29:18.194675 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 05 12:29:18.194780 master-0 kubenswrapper[4780]: I1205 12:29:18.194684 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 05 12:29:18.194780 master-0 kubenswrapper[4780]: I1205 12:29:18.194691 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 05 12:29:18.194780 master-0 kubenswrapper[4780]: I1205 12:29:18.194698 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 05 12:29:18.194780 master-0 kubenswrapper[4780]: I1205 12:29:18.194706 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 05 12:29:18.194780 master-0 kubenswrapper[4780]: I1205 12:29:18.194713 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 05 12:29:18.194780 master-0 kubenswrapper[4780]: I1205 12:29:18.194722 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 05 12:29:18.194780 master-0 kubenswrapper[4780]: I1205 12:29:18.194729 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 05 12:29:18.194780 master-0 kubenswrapper[4780]: I1205 12:29:18.194761 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 05 12:29:18.194780 master-0 kubenswrapper[4780]: I1205 12:29:18.194774 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 05 12:29:18.194780 master-0 kubenswrapper[4780]: W1205 12:29:18.194728 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:18.195234 master-0 kubenswrapper[4780]: E1205 12:29:18.194815 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:18.195234 master-0 kubenswrapper[4780]: I1205 12:29:18.195009 4780 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 05 12:29:18.195552 master-0 kubenswrapper[4780]: I1205 12:29:18.195525 4780 server.go:1280] "Started kubelet" Dec 05 12:29:18.195714 master-0 kubenswrapper[4780]: W1205 12:29:18.195618 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:18.195803 master-0 kubenswrapper[4780]: E1205 12:29:18.195754 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:18.196339 master-0 kubenswrapper[4780]: I1205 12:29:18.196232 4780 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 05 12:29:18.196421 master-0 kubenswrapper[4780]: I1205 12:29:18.196387 4780 server_v1.go:47] "podresources" method="list" useActivePods=true Dec 05 12:29:18.196450 master-0 kubenswrapper[4780]: I1205 12:29:18.196406 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:18.196617 master-0 kubenswrapper[4780]: I1205 12:29:18.196289 4780 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 05 12:29:18.197447 master-0 systemd[1]: Started Kubernetes Kubelet. 
Dec 05 12:29:18.198678 master-0 kubenswrapper[4780]: I1205 12:29:18.198641 4780 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 05 12:29:18.201540 master-0 kubenswrapper[4780]: I1205 12:29:18.201479 4780 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 05 12:29:18.201641 master-0 kubenswrapper[4780]: I1205 12:29:18.201568 4780 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 05 12:29:18.202131 master-0 kubenswrapper[4780]: I1205 12:29:18.202101 4780 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 05 12:29:18.202245 master-0 kubenswrapper[4780]: I1205 12:29:18.202135 4780 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 05 12:29:18.202245 master-0 kubenswrapper[4780]: E1205 12:29:18.202081 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:18.202299 master-0 kubenswrapper[4780]: I1205 12:29:18.202249 4780 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Dec 05 12:29:18.204241 master-0 kubenswrapper[4780]: W1205 12:29:18.204132 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:18.204314 master-0 kubenswrapper[4780]: E1205 12:29:18.204272 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:18.204965 master-0 kubenswrapper[4780]: I1205 12:29:18.204769 4780 reconstruct.go:97] "Volume reconstruction finished" Dec 05 12:29:18.206326 master-0 kubenswrapper[4780]: I1205 12:29:18.206300 4780 reconciler.go:26] "Reconciler: start to sync state" Dec 05 12:29:18.206694 master-0 kubenswrapper[4780]: I1205 12:29:18.206668 4780 factory.go:55] Registering systemd factory Dec 05 12:29:18.206751 master-0 kubenswrapper[4780]: I1205 12:29:18.206697 4780 factory.go:221] Registration of the systemd container factory successfully Dec 05 12:29:18.206751 master-0 kubenswrapper[4780]: E1205 12:29:18.206711 4780 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Dec 05 12:29:18.206993 master-0 kubenswrapper[4780]: I1205 12:29:18.206964 4780 server.go:449] "Adding debug handlers to kubelet server" Dec 05 12:29:18.207851 master-0 kubenswrapper[4780]: E1205 12:29:18.207371 4780 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.187e518a3feabbc6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.195497926 +0000 UTC m=+0.218238652,LastTimestamp:2025-12-05 12:29:18.195497926 +0000 UTC m=+0.218238652,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:18.208363 master-0 kubenswrapper[4780]: I1205 12:29:18.208337 4780 factory.go:153] Registering CRI-O factory Dec 05 12:29:18.208363 master-0 kubenswrapper[4780]: I1205 12:29:18.208364 4780 factory.go:221] Registration of the crio container factory successfully Dec 05 12:29:18.208447 master-0 kubenswrapper[4780]: I1205 12:29:18.208425 4780 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 05 12:29:18.208501 master-0 kubenswrapper[4780]: I1205 12:29:18.208452 4780 factory.go:103] Registering Raw factory Dec 05 12:29:18.208550 master-0 kubenswrapper[4780]: I1205 12:29:18.208505 4780 manager.go:1196] Started watching for new ooms in manager Dec 05 12:29:18.209193 master-0 kubenswrapper[4780]: I1205 12:29:18.209152 4780 manager.go:319] Starting recovery of all containers Dec 05 12:29:18.221104 master-0 kubenswrapper[4780]: E1205 12:29:18.221045 4780 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Dec 05 12:29:18.224884 master-0 kubenswrapper[4780]: I1205 12:29:18.224818 4780 manager.go:324] Recovery completed Dec 05 12:29:18.236238 master-0 kubenswrapper[4780]: I1205 12:29:18.236198 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:18.237580 master-0 kubenswrapper[4780]: I1205 12:29:18.237536 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:18.237580 master-0 kubenswrapper[4780]: I1205 12:29:18.237579 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:18.237671 master-0 kubenswrapper[4780]: I1205 12:29:18.237590 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:18.238328 master-0 kubenswrapper[4780]: I1205 12:29:18.238295 4780 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 05 12:29:18.238328 master-0 kubenswrapper[4780]: I1205 12:29:18.238313 4780 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 05 12:29:18.238410 master-0 kubenswrapper[4780]: I1205 12:29:18.238332 4780 state_mem.go:36] "Initialized new in-memory state store" Dec 05 12:29:18.303399 master-0 kubenswrapper[4780]: E1205 12:29:18.303349 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:18.404460 master-0 kubenswrapper[4780]: E1205 12:29:18.404251 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:18.408804 master-0 kubenswrapper[4780]: E1205 12:29:18.408746 4780 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: 
connection refused" interval="400ms" Dec 05 12:29:18.461841 master-0 kubenswrapper[4780]: I1205 12:29:18.461312 4780 policy_none.go:49] "None policy: Start" Dec 05 12:29:18.463279 master-0 kubenswrapper[4780]: I1205 12:29:18.463245 4780 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 05 12:29:18.463393 master-0 kubenswrapper[4780]: I1205 12:29:18.463289 4780 state_mem.go:35] "Initializing new in-memory state store" Dec 05 12:29:18.504536 master-0 kubenswrapper[4780]: E1205 12:29:18.504465 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:18.527088 master-0 kubenswrapper[4780]: I1205 12:29:18.527015 4780 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 05 12:29:18.529226 master-0 kubenswrapper[4780]: I1205 12:29:18.529104 4780 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 05 12:29:18.529347 master-0 kubenswrapper[4780]: I1205 12:29:18.529236 4780 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 05 12:29:18.529347 master-0 kubenswrapper[4780]: I1205 12:29:18.529294 4780 kubelet.go:2335] "Starting kubelet main sync loop" Dec 05 12:29:18.529575 master-0 kubenswrapper[4780]: E1205 12:29:18.529517 4780 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 05 12:29:18.531251 master-0 kubenswrapper[4780]: W1205 12:29:18.531057 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:18.531362 master-0 kubenswrapper[4780]: E1205 12:29:18.531284 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:18.542304 master-0 kubenswrapper[4780]: I1205 12:29:18.542260 4780 manager.go:334] "Starting Device Plugin manager" Dec 05 12:29:18.542708 master-0 kubenswrapper[4780]: I1205 12:29:18.542336 4780 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 05 12:29:18.542708 master-0 kubenswrapper[4780]: I1205 12:29:18.542361 4780 server.go:79] "Starting device plugin registration server" Dec 05 12:29:18.543041 master-0 kubenswrapper[4780]: I1205 12:29:18.543016 4780 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 05 12:29:18.543135 master-0 kubenswrapper[4780]: I1205 12:29:18.543047 4780 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 05 12:29:18.543763 master-0 kubenswrapper[4780]: I1205 12:29:18.543348 4780 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 05 12:29:18.543763 master-0 kubenswrapper[4780]: I1205 12:29:18.543647 4780 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 05 12:29:18.543763 master-0 kubenswrapper[4780]: I1205 12:29:18.543668 4780 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 05 12:29:18.545234 master-0 kubenswrapper[4780]: E1205 
12:29:18.545171 4780 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 05 12:29:18.630509 master-0 kubenswrapper[4780]: I1205 12:29:18.630321 4780 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"] Dec 05 12:29:18.630734 master-0 kubenswrapper[4780]: I1205 12:29:18.630578 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:18.632618 master-0 kubenswrapper[4780]: I1205 12:29:18.632587 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:18.632686 master-0 kubenswrapper[4780]: I1205 12:29:18.632632 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:18.632686 master-0 kubenswrapper[4780]: I1205 12:29:18.632648 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:18.632846 master-0 kubenswrapper[4780]: I1205 12:29:18.632821 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:18.633489 master-0 kubenswrapper[4780]: I1205 12:29:18.633424 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:29:18.633654 master-0 kubenswrapper[4780]: I1205 12:29:18.633579 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:18.633985 master-0 kubenswrapper[4780]: I1205 12:29:18.633938 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:18.634037 master-0 kubenswrapper[4780]: I1205 12:29:18.633988 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:18.634037 master-0 kubenswrapper[4780]: I1205 12:29:18.634006 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:18.634280 master-0 kubenswrapper[4780]: I1205 12:29:18.634246 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:18.634509 master-0 kubenswrapper[4780]: I1205 12:29:18.634452 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.634569 master-0 kubenswrapper[4780]: I1205 12:29:18.634538 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:18.635078 master-0 kubenswrapper[4780]: I1205 12:29:18.635029 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:18.635146 master-0 kubenswrapper[4780]: I1205 12:29:18.635097 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:18.635146 master-0 kubenswrapper[4780]: I1205 12:29:18.635116 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:18.635769 master-0 kubenswrapper[4780]: I1205 12:29:18.635734 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:18.635830 master-0 kubenswrapper[4780]: I1205 12:29:18.635783 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:18.635883 master-0 kubenswrapper[4780]: I1205 12:29:18.635848 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:18.635927 master-0 kubenswrapper[4780]: I1205 12:29:18.635881 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:18.635965 master-0 kubenswrapper[4780]: I1205 12:29:18.635923 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:18.635965 master-0 kubenswrapper[4780]: I1205 12:29:18.635953 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:18.636032 master-0 kubenswrapper[4780]: I1205 12:29:18.636001 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:18.636446 master-0 kubenswrapper[4780]: I1205 12:29:18.636403 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:18.636507 master-0 kubenswrapper[4780]: I1205 12:29:18.636460 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:18.637008 master-0 kubenswrapper[4780]: I1205 12:29:18.636981 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:18.637050 master-0 kubenswrapper[4780]: I1205 12:29:18.637007 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:18.637050 master-0 kubenswrapper[4780]: I1205 12:29:18.637021 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:18.637237 master-0 kubenswrapper[4780]: I1205 12:29:18.637212 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:18.637285 master-0 kubenswrapper[4780]: I1205 12:29:18.637248 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:18.637324 master-0 kubenswrapper[4780]: I1205 12:29:18.637287 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:18.637324 master-0 kubenswrapper[4780]: I1205 12:29:18.637309 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:18.637397 master-0 kubenswrapper[4780]: I1205 12:29:18.637322 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:29:18.637397 master-0 kubenswrapper[4780]: I1205 12:29:18.637371 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:18.638088 master-0 kubenswrapper[4780]: I1205 12:29:18.638038 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:18.638135 master-0 kubenswrapper[4780]: I1205 12:29:18.638099 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:18.638135 master-0 kubenswrapper[4780]: I1205 12:29:18.638125 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:18.638312 master-0 kubenswrapper[4780]: I1205 12:29:18.638270 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:18.638356 master-0 kubenswrapper[4780]: I1205 12:29:18.638316 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:18.638356 master-0 kubenswrapper[4780]: I1205 12:29:18.638336 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:18.638500 master-0 kubenswrapper[4780]: I1205 12:29:18.638452 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:29:18.638548 master-0 kubenswrapper[4780]: I1205 12:29:18.638520 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:18.639392 master-0 kubenswrapper[4780]: I1205 12:29:18.639352 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:18.639444 master-0 kubenswrapper[4780]: I1205 12:29:18.639397 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:18.639444 master-0 kubenswrapper[4780]: I1205 12:29:18.639415 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:18.644095 master-0 kubenswrapper[4780]: I1205 12:29:18.644047 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:18.645578 master-0 kubenswrapper[4780]: I1205 12:29:18.645544 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:18.645578 master-0 kubenswrapper[4780]: I1205 12:29:18.645575 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:18.645677 master-0 kubenswrapper[4780]: I1205 12:29:18.645587 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:18.645677 master-0 kubenswrapper[4780]: I1205 12:29:18.645632 4780 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 05 12:29:18.646659 master-0 kubenswrapper[4780]: E1205 12:29:18.646617 4780 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 05 12:29:18.709623 master-0 kubenswrapper[4780]: I1205 12:29:18.709528 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:18.709623 master-0 kubenswrapper[4780]: I1205 12:29:18.709606 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:18.709851 master-0 kubenswrapper[4780]: I1205 12:29:18.709644 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:18.709851 master-0 kubenswrapper[4780]: I1205 12:29:18.709772 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: 
\"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:29:18.709920 master-0 kubenswrapper[4780]: I1205 12:29:18.709858 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:29:18.709920 master-0 kubenswrapper[4780]: I1205 12:29:18.709891 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.709920 master-0 kubenswrapper[4780]: I1205 12:29:18.709916 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.710000 master-0 kubenswrapper[4780]: I1205 12:29:18.709937 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.710000 master-0 kubenswrapper[4780]: I1205 12:29:18.709961 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:18.710000 master-0 kubenswrapper[4780]: I1205 12:29:18.709986 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:29:18.710085 master-0 kubenswrapper[4780]: I1205 12:29:18.710012 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:29:18.710085 master-0 kubenswrapper[4780]: I1205 12:29:18.710034 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 
05 12:29:18.710085 master-0 kubenswrapper[4780]: I1205 12:29:18.710060 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.710160 master-0 kubenswrapper[4780]: I1205 12:29:18.710091 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:18.710160 master-0 kubenswrapper[4780]: I1205 12:29:18.710115 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.710238 master-0 kubenswrapper[4780]: I1205 12:29:18.710142 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.710238 master-0 kubenswrapper[4780]: I1205 12:29:18.710202 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:29:18.810300 master-0 kubenswrapper[4780]: E1205 12:29:18.810159 4780 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Dec 05 12:29:18.810620 master-0 kubenswrapper[4780]: I1205 12:29:18.810432 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.810620 master-0 kubenswrapper[4780]: I1205 12:29:18.810477 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.810620 master-0 kubenswrapper[4780]: I1205 12:29:18.810509 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:29:18.810793 master-0 kubenswrapper[4780]: I1205 12:29:18.810632 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.810793 master-0 kubenswrapper[4780]: I1205 12:29:18.810678 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.810793 master-0 kubenswrapper[4780]: I1205 12:29:18.810763 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:29:18.810960 master-0 kubenswrapper[4780]: I1205 12:29:18.810809 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:29:18.810960 master-0 kubenswrapper[4780]: I1205 12:29:18.810767 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:29:18.810960 master-0 kubenswrapper[4780]: I1205 12:29:18.810902 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.811146 master-0 kubenswrapper[4780]: I1205 12:29:18.810925 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.811146 master-0 kubenswrapper[4780]: I1205 12:29:18.810986 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:18.811146 master-0 kubenswrapper[4780]: I1205 12:29:18.811010 4780 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.811146 master-0 kubenswrapper[4780]: I1205 12:29:18.811006 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:18.811146 master-0 kubenswrapper[4780]: I1205 12:29:18.811050 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:18.811146 master-0 kubenswrapper[4780]: I1205 12:29:18.811053 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:18.811146 master-0 kubenswrapper[4780]: I1205 12:29:18.811069 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:29:18.811146 master-0 kubenswrapper[4780]: I1205 12:29:18.811081 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:18.811146 master-0 kubenswrapper[4780]: I1205 12:29:18.811087 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.811146 master-0 kubenswrapper[4780]: I1205 12:29:18.811100 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.811146 master-0 kubenswrapper[4780]: I1205 12:29:18.811117 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 
12:29:18.811756 master-0 kubenswrapper[4780]: I1205 12:29:18.811215 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:18.811756 master-0 kubenswrapper[4780]: I1205 12:29:18.811206 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:18.811756 master-0 kubenswrapper[4780]: I1205 12:29:18.811282 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:29:18.811756 master-0 kubenswrapper[4780]: I1205 12:29:18.811247 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:29:18.811756 master-0 kubenswrapper[4780]: I1205 12:29:18.811315 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.811756 master-0 kubenswrapper[4780]: I1205 12:29:18.811333 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:18.811756 master-0 kubenswrapper[4780]: I1205 12:29:18.811350 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:29:18.811756 master-0 kubenswrapper[4780]: I1205 12:29:18.811360 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.811756 master-0 kubenswrapper[4780]: I1205 12:29:18.811372 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 
12:29:18.811756 master-0 kubenswrapper[4780]: I1205 12:29:18.811270 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:18.811756 master-0 kubenswrapper[4780]: I1205 12:29:18.811398 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:29:18.811756 master-0 kubenswrapper[4780]: I1205 12:29:18.811414 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:29:18.811756 master-0 kubenswrapper[4780]: I1205 12:29:18.811237 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:18.811756 master-0 kubenswrapper[4780]: I1205 12:29:18.811421 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:18.847687 master-0 kubenswrapper[4780]: I1205 12:29:18.847534 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:18.849302 master-0 kubenswrapper[4780]: I1205 12:29:18.849248 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:18.849302 master-0 kubenswrapper[4780]: I1205 12:29:18.849305 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:18.849485 master-0 kubenswrapper[4780]: I1205 12:29:18.849323 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:18.849485 master-0 kubenswrapper[4780]: I1205 12:29:18.849392 4780 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 05 12:29:18.850569 master-0 kubenswrapper[4780]: E1205 12:29:18.850488 4780 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 05 12:29:18.976144 master-0 kubenswrapper[4780]: I1205 12:29:18.975864 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:29:18.988827 master-0 kubenswrapper[4780]: I1205 12:29:18.988772 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:19.013709 master-0 kubenswrapper[4780]: I1205 12:29:19.013631 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:19.048138 master-0 kubenswrapper[4780]: I1205 12:29:19.047726 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:29:19.059096 master-0 kubenswrapper[4780]: I1205 12:29:19.059035 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:29:19.114415 master-0 kubenswrapper[4780]: W1205 12:29:19.114323 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:19.115052 master-0 kubenswrapper[4780]: E1205 12:29:19.114420 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:19.197618 master-0 kubenswrapper[4780]: I1205 12:29:19.197559 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:19.233990 master-0 kubenswrapper[4780]: W1205 12:29:19.233809 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:19.233990 master-0 kubenswrapper[4780]: E1205 12:29:19.233898 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:19.251325 master-0 kubenswrapper[4780]: I1205 12:29:19.251274 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:19.252708 master-0 kubenswrapper[4780]: I1205 12:29:19.252640 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:19.252708 master-0 kubenswrapper[4780]: I1205 12:29:19.252691 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:19.252708 master-0 kubenswrapper[4780]: I1205 12:29:19.252708 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:19.252987 master-0 kubenswrapper[4780]: I1205 12:29:19.252773 4780 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 05 12:29:19.253537 master-0 kubenswrapper[4780]: E1205 12:29:19.253499 
4780 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 05 12:29:19.533153 master-0 kubenswrapper[4780]: W1205 12:29:19.533071 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75143d9bc4a2dc15781dc51ccff632a.slice/crio-e5863ee594cde86aeedce8416be9b249f569b2f49267eb70875c7f8a2e451e4e WatchSource:0}: Error finding container e5863ee594cde86aeedce8416be9b249f569b2f49267eb70875c7f8a2e451e4e: Status 404 returned error can't find the container with id e5863ee594cde86aeedce8416be9b249f569b2f49267eb70875c7f8a2e451e4e Dec 05 12:29:19.538340 master-0 kubenswrapper[4780]: I1205 12:29:19.537994 4780 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 12:29:19.546995 master-0 kubenswrapper[4780]: W1205 12:29:19.546894 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:19.547106 master-0 kubenswrapper[4780]: E1205 12:29:19.547014 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:19.552396 master-0 kubenswrapper[4780]: W1205 12:29:19.552303 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b47694fcc32464ab24d09c23d6efb57.slice/crio-34418050489e8b48781fa5128a0548228f5bdb58f7e6a5f88226bbd7dacf7bb5 WatchSource:0}: Error finding container 34418050489e8b48781fa5128a0548228f5bdb58f7e6a5f88226bbd7dacf7bb5: Status 404 returned error can't find the container with id 34418050489e8b48781fa5128a0548228f5bdb58f7e6a5f88226bbd7dacf7bb5 Dec 05 12:29:19.611361 master-0 kubenswrapper[4780]: E1205 12:29:19.611269 4780 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Dec 05 12:29:19.615064 master-0 kubenswrapper[4780]: W1205 12:29:19.614968 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3169f44496ed8a28c6d6a15511ab0eec.slice/crio-5a89fdcb31a57b509eb73373840f305ff5d3039dc4adac822b9b40350179af76 WatchSource:0}: Error finding container 5a89fdcb31a57b509eb73373840f305ff5d3039dc4adac822b9b40350179af76: Status 404 returned error can't find the container with id 5a89fdcb31a57b509eb73373840f305ff5d3039dc4adac822b9b40350179af76 Dec 05 12:29:19.622992 master-0 kubenswrapper[4780]: W1205 12:29:19.622946 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc0396a9a2689b3e8c132c12640cbe83.slice/crio-e0816ccdefc3d19a555c704cf7914804a097b5a95e2655805ebd92880ab7a03f WatchSource:0}: Error finding container 
e0816ccdefc3d19a555c704cf7914804a097b5a95e2655805ebd92880ab7a03f: Status 404 returned error can't find the container with id e0816ccdefc3d19a555c704cf7914804a097b5a95e2655805ebd92880ab7a03f Dec 05 12:29:19.960338 master-0 kubenswrapper[4780]: W1205 12:29:19.960068 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:19.960338 master-0 kubenswrapper[4780]: E1205 12:29:19.960217 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:20.053733 master-0 kubenswrapper[4780]: I1205 12:29:20.053621 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:20.055479 master-0 kubenswrapper[4780]: I1205 12:29:20.055428 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:20.055479 master-0 kubenswrapper[4780]: I1205 12:29:20.055484 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:20.055688 master-0 kubenswrapper[4780]: I1205 12:29:20.055493 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:20.055688 master-0 kubenswrapper[4780]: I1205 12:29:20.055637 4780 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 05 12:29:20.056870 master-0 kubenswrapper[4780]: E1205 12:29:20.056800 4780 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 05 12:29:20.177866 master-0 kubenswrapper[4780]: I1205 12:29:20.177789 4780 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 05 12:29:20.179648 master-0 kubenswrapper[4780]: E1205 12:29:20.179601 4780 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:20.198015 master-0 kubenswrapper[4780]: I1205 12:29:20.197954 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:20.537063 master-0 kubenswrapper[4780]: I1205 12:29:20.536828 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerStarted","Data":"5a89fdcb31a57b509eb73373840f305ff5d3039dc4adac822b9b40350179af76"} Dec 05 12:29:20.537982 master-0 kubenswrapper[4780]: I1205 12:29:20.537876 4780 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerStarted","Data":"1bd06826dd54922214ff0bdf4dd49e3e4fb5917fe2431fd30da1ce39eb71cae2"} Dec 05 12:29:20.539083 master-0 kubenswrapper[4780]: I1205 12:29:20.539053 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"34418050489e8b48781fa5128a0548228f5bdb58f7e6a5f88226bbd7dacf7bb5"} Dec 05 12:29:20.540711 master-0 kubenswrapper[4780]: I1205 12:29:20.540590 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"d75143d9bc4a2dc15781dc51ccff632a","Type":"ContainerStarted","Data":"e5863ee594cde86aeedce8416be9b249f569b2f49267eb70875c7f8a2e451e4e"} Dec 05 12:29:20.541981 master-0 kubenswrapper[4780]: I1205 12:29:20.541876 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"cc0396a9a2689b3e8c132c12640cbe83","Type":"ContainerStarted","Data":"e0816ccdefc3d19a555c704cf7914804a097b5a95e2655805ebd92880ab7a03f"} Dec 05 12:29:21.198377 master-0 kubenswrapper[4780]: I1205 12:29:21.198260 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:21.213087 master-0 kubenswrapper[4780]: E1205 12:29:21.212972 4780 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Dec 05 12:29:21.441825 master-0 kubenswrapper[4780]: W1205 12:29:21.441615 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:21.441825 master-0 kubenswrapper[4780]: E1205 12:29:21.441742 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:21.546938 master-0 kubenswrapper[4780]: I1205 12:29:21.546862 4780 generic.go:334] "Generic (PLEG): container finished" podID="3169f44496ed8a28c6d6a15511ab0eec" containerID="419ef08e96de1310d58b89d9dc91be12123ef06b7cf2f7b293589e349077e04c" exitCode=0 Dec 05 12:29:21.546938 master-0 kubenswrapper[4780]: I1205 12:29:21.546944 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerDied","Data":"419ef08e96de1310d58b89d9dc91be12123ef06b7cf2f7b293589e349077e04c"} Dec 05 12:29:21.547248 master-0 kubenswrapper[4780]: I1205 12:29:21.547008 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:21.548339 master-0 kubenswrapper[4780]: I1205 12:29:21.548309 4780 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:21.548389 master-0 kubenswrapper[4780]: I1205 12:29:21.548355 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:21.548389 master-0 kubenswrapper[4780]: I1205 12:29:21.548368 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:21.560668 master-0 kubenswrapper[4780]: W1205 12:29:21.560585 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:21.560774 master-0 kubenswrapper[4780]: E1205 12:29:21.560681 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:21.657224 master-0 kubenswrapper[4780]: I1205 12:29:21.657142 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:21.658533 master-0 kubenswrapper[4780]: I1205 12:29:21.658505 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:21.658602 master-0 kubenswrapper[4780]: I1205 12:29:21.658552 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:21.658602 master-0 kubenswrapper[4780]: I1205 12:29:21.658561 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:21.658654 master-0 kubenswrapper[4780]: I1205 12:29:21.658628 4780 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 05 12:29:21.659845 master-0 kubenswrapper[4780]: E1205 12:29:21.659776 4780 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 05 12:29:21.678734 master-0 kubenswrapper[4780]: W1205 12:29:21.678671 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:21.678734 master-0 kubenswrapper[4780]: E1205 12:29:21.678737 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:22.198618 master-0 kubenswrapper[4780]: I1205 12:29:22.198567 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: 
connection refused Dec 05 12:29:22.551605 master-0 kubenswrapper[4780]: I1205 12:29:22.551428 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"cc0396a9a2689b3e8c132c12640cbe83","Type":"ContainerStarted","Data":"296f9752095436403474f93df276faa705635dd48e13c86d863312c7d94b3954"} Dec 05 12:29:22.551605 master-0 kubenswrapper[4780]: I1205 12:29:22.551490 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"cc0396a9a2689b3e8c132c12640cbe83","Type":"ContainerStarted","Data":"bc0f8f75cee3cab2f35245110c53e2d7aee426e9d1f8fd832cda99c84f270715"} Dec 05 12:29:22.551605 master-0 kubenswrapper[4780]: I1205 12:29:22.551578 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:22.552615 master-0 kubenswrapper[4780]: I1205 12:29:22.552581 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:22.552683 master-0 kubenswrapper[4780]: I1205 12:29:22.552625 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:22.552683 master-0 kubenswrapper[4780]: I1205 12:29:22.552640 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:22.567899 master-0 kubenswrapper[4780]: I1205 12:29:22.567848 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/0.log" Dec 05 12:29:22.568685 master-0 kubenswrapper[4780]: I1205 12:29:22.568645 4780 generic.go:334] "Generic (PLEG): container finished" podID="3169f44496ed8a28c6d6a15511ab0eec" containerID="dc9ed24e389f1992b6050a7087f3f28f5704a10c8708c9215a6eb2f4f91d30bf" exitCode=1 Dec 05 12:29:22.568726 master-0 kubenswrapper[4780]: I1205 12:29:22.568700 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerDied","Data":"dc9ed24e389f1992b6050a7087f3f28f5704a10c8708c9215a6eb2f4f91d30bf"} Dec 05 12:29:22.568844 master-0 kubenswrapper[4780]: I1205 12:29:22.568805 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:22.569863 master-0 kubenswrapper[4780]: I1205 12:29:22.569819 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:22.569929 master-0 kubenswrapper[4780]: I1205 12:29:22.569909 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:22.569966 master-0 kubenswrapper[4780]: I1205 12:29:22.569942 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:22.570723 master-0 kubenswrapper[4780]: I1205 12:29:22.570691 4780 scope.go:117] "RemoveContainer" containerID="dc9ed24e389f1992b6050a7087f3f28f5704a10c8708c9215a6eb2f4f91d30bf" Dec 05 12:29:22.742091 master-0 kubenswrapper[4780]: W1205 12:29:22.742006 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused 
Dec 05 12:29:22.742230 master-0 kubenswrapper[4780]: E1205 12:29:22.742096 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:23.197868 master-0 kubenswrapper[4780]: I1205 12:29:23.197720 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:23.528855 master-0 kubenswrapper[4780]: E1205 12:29:23.528608 4780 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.187e518a3feabbc6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.195497926 +0000 UTC m=+0.218238652,LastTimestamp:2025-12-05 12:29:18.195497926 +0000 UTC m=+0.218238652,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:23.572856 master-0 kubenswrapper[4780]: I1205 12:29:23.572808 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/1.log" Dec 05 12:29:23.573632 master-0 kubenswrapper[4780]: I1205 12:29:23.573580 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/0.log" Dec 05 12:29:23.574609 master-0 kubenswrapper[4780]: I1205 12:29:23.574355 4780 generic.go:334] "Generic (PLEG): container finished" podID="3169f44496ed8a28c6d6a15511ab0eec" containerID="cfb62573630f2f15a19f4e137dc5aead8f45b739f70fb38341cbbd40d372a318" exitCode=1 Dec 05 12:29:23.574609 master-0 kubenswrapper[4780]: I1205 12:29:23.574412 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerDied","Data":"cfb62573630f2f15a19f4e137dc5aead8f45b739f70fb38341cbbd40d372a318"} Dec 05 12:29:23.574609 master-0 kubenswrapper[4780]: I1205 12:29:23.574469 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:23.574609 master-0 kubenswrapper[4780]: I1205 12:29:23.574485 4780 scope.go:117] "RemoveContainer" containerID="dc9ed24e389f1992b6050a7087f3f28f5704a10c8708c9215a6eb2f4f91d30bf" Dec 05 12:29:23.574609 master-0 kubenswrapper[4780]: I1205 12:29:23.574491 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:23.575298 master-0 kubenswrapper[4780]: I1205 12:29:23.575254 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:23.575342 
master-0 kubenswrapper[4780]: I1205 12:29:23.575320 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:23.575342 master-0 kubenswrapper[4780]: I1205 12:29:23.575340 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:23.575488 master-0 kubenswrapper[4780]: I1205 12:29:23.575448 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:23.575527 master-0 kubenswrapper[4780]: I1205 12:29:23.575498 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:23.575527 master-0 kubenswrapper[4780]: I1205 12:29:23.575510 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:23.575939 master-0 kubenswrapper[4780]: I1205 12:29:23.575902 4780 scope.go:117] "RemoveContainer" containerID="cfb62573630f2f15a19f4e137dc5aead8f45b739f70fb38341cbbd40d372a318" Dec 05 12:29:23.576145 master-0 kubenswrapper[4780]: E1205 12:29:23.576108 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(3169f44496ed8a28c6d6a15511ab0eec)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="3169f44496ed8a28c6d6a15511ab0eec" Dec 05 12:29:24.199028 master-0 kubenswrapper[4780]: I1205 12:29:24.198880 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:24.295106 master-0 kubenswrapper[4780]: I1205 12:29:24.294910 4780 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 05 12:29:24.298912 master-0 kubenswrapper[4780]: E1205 12:29:24.298121 4780 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:24.415699 master-0 kubenswrapper[4780]: E1205 12:29:24.415576 4780 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Dec 05 12:29:24.576779 master-0 kubenswrapper[4780]: I1205 12:29:24.576714 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:24.577944 master-0 kubenswrapper[4780]: I1205 12:29:24.577909 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:24.577999 master-0 kubenswrapper[4780]: I1205 12:29:24.577965 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:24.577999 master-0 kubenswrapper[4780]: I1205 
12:29:24.577977 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:24.578678 master-0 kubenswrapper[4780]: I1205 12:29:24.578646 4780 scope.go:117] "RemoveContainer" containerID="cfb62573630f2f15a19f4e137dc5aead8f45b739f70fb38341cbbd40d372a318" Dec 05 12:29:24.578864 master-0 kubenswrapper[4780]: E1205 12:29:24.578832 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(3169f44496ed8a28c6d6a15511ab0eec)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="3169f44496ed8a28c6d6a15511ab0eec" Dec 05 12:29:24.861049 master-0 kubenswrapper[4780]: I1205 12:29:24.860876 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:24.862966 master-0 kubenswrapper[4780]: I1205 12:29:24.862907 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:24.863081 master-0 kubenswrapper[4780]: I1205 12:29:24.862979 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:24.863081 master-0 kubenswrapper[4780]: I1205 12:29:24.862995 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:24.863081 master-0 kubenswrapper[4780]: I1205 12:29:24.863072 4780 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 05 12:29:24.864543 master-0 kubenswrapper[4780]: E1205 12:29:24.864465 4780 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 05 12:29:24.867399 master-0 kubenswrapper[4780]: W1205 12:29:24.867256 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:24.867491 master-0 kubenswrapper[4780]: E1205 12:29:24.867433 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:25.198827 master-0 kubenswrapper[4780]: I1205 12:29:25.198752 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:25.581627 master-0 kubenswrapper[4780]: I1205 12:29:25.581422 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/1.log" Dec 05 12:29:25.813275 master-0 kubenswrapper[4780]: W1205 12:29:25.813161 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:25.813275 master-0 kubenswrapper[4780]: E1205 12:29:25.813281 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:26.198196 master-0 kubenswrapper[4780]: I1205 12:29:26.198098 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:26.497339 master-0 kubenswrapper[4780]: W1205 12:29:26.497120 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:26.497339 master-0 kubenswrapper[4780]: E1205 12:29:26.497234 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:26.676865 master-0 kubenswrapper[4780]: W1205 12:29:26.676700 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:26.676865 master-0 kubenswrapper[4780]: E1205 12:29:26.676827 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 05 12:29:27.197880 master-0 kubenswrapper[4780]: I1205 12:29:27.197797 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:28.198625 master-0 kubenswrapper[4780]: I1205 12:29:28.198528 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:28.545677 master-0 kubenswrapper[4780]: E1205 12:29:28.545596 4780 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 05 12:29:29.198866 master-0 kubenswrapper[4780]: I1205 12:29:29.198775 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 05 12:29:29.591892 master-0 kubenswrapper[4780]: I1205 12:29:29.591802 4780 generic.go:334] "Generic (PLEG): container finished" podID="d75143d9bc4a2dc15781dc51ccff632a" containerID="697d3c24c504f4edabe923e2993cba7e7017b70ed34b4cb71d455e86377b9334" exitCode=0 Dec 05 12:29:29.591892 master-0 kubenswrapper[4780]: I1205 12:29:29.591888 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"d75143d9bc4a2dc15781dc51ccff632a","Type":"ContainerDied","Data":"697d3c24c504f4edabe923e2993cba7e7017b70ed34b4cb71d455e86377b9334"} Dec 05 12:29:29.592229 master-0 kubenswrapper[4780]: I1205 12:29:29.591952 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:29.593103 master-0 kubenswrapper[4780]: I1205 12:29:29.593061 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:29.593153 master-0 kubenswrapper[4780]: I1205 12:29:29.593108 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:29.593153 master-0 kubenswrapper[4780]: I1205 12:29:29.593123 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:29.594454 master-0 kubenswrapper[4780]: I1205 12:29:29.594406 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerStarted","Data":"8c7e83119fdbf7fba596a8756e22362ec175fbd883171a7a50b5c673c4302ba8"} Dec 05 12:29:29.594454 master-0 kubenswrapper[4780]: I1205 12:29:29.594441 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:29.595142 master-0 kubenswrapper[4780]: I1205 12:29:29.595101 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:29.595220 master-0 kubenswrapper[4780]: I1205 12:29:29.595146 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:29.595220 master-0 kubenswrapper[4780]: I1205 12:29:29.595162 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:29.595709 master-0 kubenswrapper[4780]: I1205 12:29:29.595684 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:29.596328 master-0 kubenswrapper[4780]: I1205 12:29:29.596281 4780 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="123ca114b6002ab3cd24848ba210c8015d871a3bf5c2f6653a7daa022e0dea48" exitCode=1 Dec 05 12:29:29.596383 master-0 kubenswrapper[4780]: I1205 12:29:29.596359 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerDied","Data":"123ca114b6002ab3cd24848ba210c8015d871a3bf5c2f6653a7daa022e0dea48"} Dec 05 12:29:29.596608 master-0 kubenswrapper[4780]: I1205 12:29:29.596572 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 
12:29:29.596643 master-0 kubenswrapper[4780]: I1205 12:29:29.596614 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:29.596643 master-0 kubenswrapper[4780]: I1205 12:29:29.596629 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:30.608493 master-0 kubenswrapper[4780]: I1205 12:29:30.608292 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"878914476f342bbe09935d11750836541a3cd256e73418d2dbee280993c5f191"} Dec 05 12:29:30.608493 master-0 kubenswrapper[4780]: I1205 12:29:30.608412 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:30.610452 master-0 kubenswrapper[4780]: I1205 12:29:30.609618 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:30.610452 master-0 kubenswrapper[4780]: I1205 12:29:30.609651 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:30.610452 master-0 kubenswrapper[4780]: I1205 12:29:30.609662 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:30.610452 master-0 kubenswrapper[4780]: I1205 12:29:30.609942 4780 scope.go:117] "RemoveContainer" containerID="123ca114b6002ab3cd24848ba210c8015d871a3bf5c2f6653a7daa022e0dea48" Dec 05 12:29:30.611727 master-0 kubenswrapper[4780]: I1205 12:29:30.611678 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:30.612008 master-0 kubenswrapper[4780]: I1205 12:29:30.611961 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"d75143d9bc4a2dc15781dc51ccff632a","Type":"ContainerStarted","Data":"8afe0da63d99f2297054afe39b61890ca549453e8d197ef5a9c1c3976a1f2afc"} Dec 05 12:29:30.612611 master-0 kubenswrapper[4780]: I1205 12:29:30.612583 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:30.612669 master-0 kubenswrapper[4780]: I1205 12:29:30.612616 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:30.612669 master-0 kubenswrapper[4780]: I1205 12:29:30.612640 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:31.019488 master-0 kubenswrapper[4780]: I1205 12:29:31.019399 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:31.019488 master-0 kubenswrapper[4780]: I1205 12:29:31.019488 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:31.206610 master-0 kubenswrapper[4780]: I1205 12:29:31.205359 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:31.206610 master-0 kubenswrapper[4780]: E1205 12:29:31.205450 4780 
controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Dec 05 12:29:31.265331 master-0 kubenswrapper[4780]: I1205 12:29:31.265268 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:31.266499 master-0 kubenswrapper[4780]: I1205 12:29:31.266458 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:31.266590 master-0 kubenswrapper[4780]: I1205 12:29:31.266512 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:31.266590 master-0 kubenswrapper[4780]: I1205 12:29:31.266524 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:31.266671 master-0 kubenswrapper[4780]: I1205 12:29:31.266594 4780 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 05 12:29:31.271870 master-0 kubenswrapper[4780]: E1205 12:29:31.271750 4780 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Dec 05 12:29:31.618149 master-0 kubenswrapper[4780]: I1205 12:29:31.617952 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18"} Dec 05 12:29:31.618621 master-0 kubenswrapper[4780]: I1205 12:29:31.618141 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:31.619696 master-0 kubenswrapper[4780]: I1205 12:29:31.619149 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:31.619696 master-0 kubenswrapper[4780]: I1205 12:29:31.619204 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:31.619696 master-0 kubenswrapper[4780]: I1205 12:29:31.619214 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:32.208002 master-0 kubenswrapper[4780]: I1205 12:29:32.207932 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:32.605237 master-0 kubenswrapper[4780]: I1205 12:29:32.605024 4780 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 05 12:29:32.620195 master-0 kubenswrapper[4780]: I1205 12:29:32.620142 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:32.621156 master-0 kubenswrapper[4780]: I1205 12:29:32.621106 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:32.621230 master-0 kubenswrapper[4780]: I1205 12:29:32.621203 4780 kubelet_node_status.go:724] "Recording event message for node" 
node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:32.621230 master-0 kubenswrapper[4780]: I1205 12:29:32.621224 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:33.070219 master-0 kubenswrapper[4780]: I1205 12:29:33.070158 4780 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 05 12:29:33.204617 master-0 kubenswrapper[4780]: I1205 12:29:33.204553 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:33.535933 master-0 kubenswrapper[4780]: E1205 12:29:33.535728 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a3feabbc6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.195497926 +0000 UTC m=+0.218238652,LastTimestamp:2025-12-05 12:29:18.195497926 +0000 UTC m=+0.218238652,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.540725 master-0 kubenswrapper[4780]: E1205 12:29:33.540598 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426cb371 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237569905 +0000 UTC m=+0.260310631,LastTimestamp:2025-12-05 12:29:18.237569905 +0000 UTC m=+0.260310631,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.545258 master-0 kubenswrapper[4780]: E1205 12:29:33.544910 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426cf242 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237585986 +0000 UTC m=+0.260326712,LastTimestamp:2025-12-05 12:29:18.237585986 +0000 UTC m=+0.260326712,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.549597 master-0 
kubenswrapper[4780]: E1205 12:29:33.549300 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426d166e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237595246 +0000 UTC m=+0.260335972,LastTimestamp:2025-12-05 12:29:18.237595246 +0000 UTC m=+0.260335972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.553815 master-0 kubenswrapper[4780]: E1205 12:29:33.553547 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a54e7491d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.547593501 +0000 UTC m=+0.570334257,LastTimestamp:2025-12-05 12:29:18.547593501 +0000 UTC m=+0.570334257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.560866 master-0 kubenswrapper[4780]: E1205 12:29:33.560718 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426cb371\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426cb371 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237569905 +0000 UTC m=+0.260310631,LastTimestamp:2025-12-05 12:29:18.63261194 +0000 UTC m=+0.655352686,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.566993 master-0 kubenswrapper[4780]: E1205 12:29:33.566870 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426cf242\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426cf242 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237585986 +0000 UTC 
m=+0.260326712,LastTimestamp:2025-12-05 12:29:18.632641721 +0000 UTC m=+0.655382467,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.572069 master-0 kubenswrapper[4780]: E1205 12:29:33.571947 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426d166e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426d166e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237595246 +0000 UTC m=+0.260335972,LastTimestamp:2025-12-05 12:29:18.632658832 +0000 UTC m=+0.655399578,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.576713 master-0 kubenswrapper[4780]: E1205 12:29:33.576596 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426cb371\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426cb371 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237569905 +0000 UTC m=+0.260310631,LastTimestamp:2025-12-05 12:29:18.633976095 +0000 UTC m=+0.656716841,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.581514 master-0 kubenswrapper[4780]: E1205 12:29:33.581298 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426cf242\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426cf242 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237585986 +0000 UTC m=+0.260326712,LastTimestamp:2025-12-05 12:29:18.633997557 +0000 UTC m=+0.656738293,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.586172 master-0 kubenswrapper[4780]: E1205 12:29:33.586060 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426d166e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426d166e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237595246 +0000 UTC m=+0.260335972,LastTimestamp:2025-12-05 12:29:18.634013267 +0000 UTC m=+0.656754013,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.590555 master-0 kubenswrapper[4780]: E1205 12:29:33.590456 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426cb371\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426cb371 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237569905 +0000 UTC m=+0.260310631,LastTimestamp:2025-12-05 12:29:18.635064688 +0000 UTC m=+0.657805444,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.595166 master-0 kubenswrapper[4780]: E1205 12:29:33.595068 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426cf242\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426cf242 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237585986 +0000 UTC m=+0.260326712,LastTimestamp:2025-12-05 12:29:18.6351094 +0000 UTC m=+0.657850156,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.601920 master-0 kubenswrapper[4780]: E1205 12:29:33.601771 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426d166e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426d166e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237595246 +0000 UTC m=+0.260335972,LastTimestamp:2025-12-05 12:29:18.635126041 +0000 UTC m=+0.657866797,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.606522 master-0 kubenswrapper[4780]: E1205 12:29:33.606387 4780 event.go:359] "Server rejected 
event (will not retry!)" err="events \"master-0.187e518a426cb371\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426cb371 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237569905 +0000 UTC m=+0.260310631,LastTimestamp:2025-12-05 12:29:18.635760331 +0000 UTC m=+0.658501067,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.610833 master-0 kubenswrapper[4780]: E1205 12:29:33.610760 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426cf242\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426cf242 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237585986 +0000 UTC m=+0.260326712,LastTimestamp:2025-12-05 12:29:18.635794553 +0000 UTC m=+0.658535299,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.615529 master-0 kubenswrapper[4780]: E1205 12:29:33.615389 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426d166e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426d166e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237595246 +0000 UTC m=+0.260335972,LastTimestamp:2025-12-05 12:29:18.635861236 +0000 UTC m=+0.658601982,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.620673 master-0 kubenswrapper[4780]: E1205 12:29:33.620523 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426cb371\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426cb371 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237569905 +0000 UTC m=+0.260310631,LastTimestamp:2025-12-05 
12:29:18.635909358 +0000 UTC m=+0.658650124,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.626260 master-0 kubenswrapper[4780]: E1205 12:29:33.626116 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426cf242\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426cf242 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237585986 +0000 UTC m=+0.260326712,LastTimestamp:2025-12-05 12:29:18.63593949 +0000 UTC m=+0.658680256,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.632290 master-0 kubenswrapper[4780]: E1205 12:29:33.632060 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426d166e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426d166e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237595246 +0000 UTC m=+0.260335972,LastTimestamp:2025-12-05 12:29:18.635966951 +0000 UTC m=+0.658707717,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.639124 master-0 kubenswrapper[4780]: E1205 12:29:33.638964 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426cb371\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426cb371 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237569905 +0000 UTC m=+0.260310631,LastTimestamp:2025-12-05 12:29:18.636999781 +0000 UTC m=+0.659740517,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.645048 master-0 kubenswrapper[4780]: E1205 12:29:33.644795 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426cf242\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426cf242 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237585986 +0000 UTC m=+0.260326712,LastTimestamp:2025-12-05 12:29:18.637014872 +0000 UTC m=+0.659755608,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.651275 master-0 kubenswrapper[4780]: E1205 12:29:33.651062 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426d166e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426d166e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237595246 +0000 UTC m=+0.260335972,LastTimestamp:2025-12-05 12:29:18.637028012 +0000 UTC m=+0.659768748,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.656609 master-0 kubenswrapper[4780]: E1205 12:29:33.656444 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426cb371\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426cb371 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237569905 +0000 UTC m=+0.260310631,LastTimestamp:2025-12-05 12:29:18.637266214 +0000 UTC m=+0.660006970,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.662341 master-0 kubenswrapper[4780]: E1205 12:29:33.662161 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187e518a426cf242\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187e518a426cf242 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:18.237585986 +0000 UTC m=+0.260326712,LastTimestamp:2025-12-05 12:29:18.637301615 +0000 UTC m=+0.660042371,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.668671 master-0 kubenswrapper[4780]: E1205 12:29:33.668542 4780 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e518a8fee5c2a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:19.537912874 +0000 UTC m=+1.560653600,LastTimestamp:2025-12-05 12:29:19.537912874 +0000 UTC m=+1.560653600,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.672906 master-0 kubenswrapper[4780]: E1205 12:29:33.672773 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e518a911f4d4c kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:19.557897548 +0000 UTC m=+1.580638274,LastTimestamp:2025-12-05 12:29:19.557897548 +0000 UTC m=+1.580638274,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.676993 master-0 kubenswrapper[4780]: E1205 12:29:33.676889 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187e518a91bcb8d1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:5e09e2af7200e6f9be469dbfd9bb1127,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:19.568214225 +0000 UTC m=+1.590954971,LastTimestamp:2025-12-05 12:29:19.568214225 +0000 UTC m=+1.590954971,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.681730 master-0 kubenswrapper[4780]: E1205 12:29:33.681588 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e518a94af00a1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:19.617646753 +0000 UTC m=+1.640387489,LastTimestamp:2025-12-05 12:29:19.617646753 +0000 UTC m=+1.640387489,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.685870 master-0 kubenswrapper[4780]: E1205 12:29:33.685717 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187e518a9537e934 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:cc0396a9a2689b3e8c132c12640cbe83,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:188637a52cafee61ec461e92fb0c605e28be325b9ac1f2ac8a37d68e97654718\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:19.626619188 +0000 UTC m=+1.649359914,LastTimestamp:2025-12-05 12:29:19.626619188 +0000 UTC m=+1.649359914,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.691519 master-0 kubenswrapper[4780]: E1205 12:29:33.691358 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e518ae1bc893c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\" in 1.292s (1.292s including waiting). 
Image size: 459552216 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:20.910379324 +0000 UTC m=+2.933120050,LastTimestamp:2025-12-05 12:29:20.910379324 +0000 UTC m=+2.933120050,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.697061 master-0 kubenswrapper[4780]: E1205 12:29:33.696886 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e518af02cfd1a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:21.152630042 +0000 UTC m=+3.175370768,LastTimestamp:2025-12-05 12:29:21.152630042 +0000 UTC m=+3.175370768,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.703952 master-0 kubenswrapper[4780]: E1205 12:29:33.703633 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e518af0e260b9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:21.164517561 +0000 UTC m=+3.187258297,LastTimestamp:2025-12-05 12:29:21.164517561 +0000 UTC m=+3.187258297,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.710125 master-0 kubenswrapper[4780]: E1205 12:29:33.710002 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e518b23535463 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:22.010780771 +0000 UTC m=+4.033521497,LastTimestamp:2025-12-05 
12:29:22.010780771 +0000 UTC m=+4.033521497,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.715670 master-0 kubenswrapper[4780]: E1205 12:29:33.714822 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187e518b2662d3b0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:cc0396a9a2689b3e8c132c12640cbe83,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:188637a52cafee61ec461e92fb0c605e28be325b9ac1f2ac8a37d68e97654718\" in 2.435s (2.435s including waiting). Image size: 532719167 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:22.062128048 +0000 UTC m=+4.084868784,LastTimestamp:2025-12-05 12:29:22.062128048 +0000 UTC m=+4.084868784,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.720735 master-0 kubenswrapper[4780]: E1205 12:29:33.720616 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e518b306cbaef openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:22.230549231 +0000 UTC m=+4.253289957,LastTimestamp:2025-12-05 12:29:22.230549231 +0000 UTC m=+4.253289957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.724626 master-0 kubenswrapper[4780]: E1205 12:29:33.724530 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187e518b30f60a38 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:cc0396a9a2689b3e8c132c12640cbe83,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:22.23954796 +0000 UTC m=+4.262288686,LastTimestamp:2025-12-05 12:29:22.23954796 +0000 UTC m=+4.262288686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.728647 master-0 kubenswrapper[4780]: E1205 12:29:33.728464 
4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e518b31764d00 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:22.247953664 +0000 UTC m=+4.270694390,LastTimestamp:2025-12-05 12:29:22.247953664 +0000 UTC m=+4.270694390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.734328 master-0 kubenswrapper[4780]: E1205 12:29:33.734096 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187e518b31d03065 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:cc0396a9a2689b3e8c132c12640cbe83,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:22.253844581 +0000 UTC m=+4.276585307,LastTimestamp:2025-12-05 12:29:22.253844581 +0000 UTC m=+4.276585307,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.738759 master-0 kubenswrapper[4780]: E1205 12:29:33.738643 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187e518b31f6d064 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:cc0396a9a2689b3e8c132c12640cbe83,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:188637a52cafee61ec461e92fb0c605e28be325b9ac1f2ac8a37d68e97654718\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:22.256375908 +0000 UTC m=+4.279116634,LastTimestamp:2025-12-05 12:29:22.256375908 +0000 UTC m=+4.279116634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.743934 master-0 kubenswrapper[4780]: E1205 12:29:33.743751 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187e518b3d80801e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:cc0396a9a2689b3e8c132c12640cbe83,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:22.449948702 +0000 UTC m=+4.472689428,LastTimestamp:2025-12-05 12:29:22.449948702 +0000 UTC m=+4.472689428,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.749384 master-0 kubenswrapper[4780]: E1205 12:29:33.749199 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187e518b3e192a36 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:cc0396a9a2689b3e8c132c12640cbe83,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:22.459953718 +0000 UTC m=+4.482694444,LastTimestamp:2025-12-05 12:29:22.459953718 +0000 UTC m=+4.482694444,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.753351 master-0 kubenswrapper[4780]: E1205 12:29:33.753244 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187e518b23535463\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e518b23535463 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:22.010780771 +0000 UTC m=+4.033521497,LastTimestamp:2025-12-05 12:29:22.573429209 +0000 UTC m=+4.596169935,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.759009 master-0 kubenswrapper[4780]: E1205 12:29:33.758900 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187e518b306cbaef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e518b306cbaef openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:22.230549231 +0000 UTC m=+4.253289957,LastTimestamp:2025-12-05 12:29:22.787763705 +0000 UTC m=+4.810504431,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.766464 master-0 kubenswrapper[4780]: E1205 12:29:33.766319 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187e518b31764d00\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e518b31764d00 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:22.247953664 +0000 UTC m=+4.270694390,LastTimestamp:2025-12-05 12:29:22.802342742 +0000 UTC m=+4.825083468,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.773946 master-0 kubenswrapper[4780]: E1205 12:29:33.773783 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e518b809fa9b7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(3169f44496ed8a28c6d6a15511ab0eec),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:23.576064439 +0000 UTC m=+5.598805165,LastTimestamp:2025-12-05 12:29:23.576064439 +0000 UTC m=+5.598805165,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.778952 master-0 kubenswrapper[4780]: E1205 12:29:33.778708 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187e518b809fa9b7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e518b809fa9b7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(3169f44496ed8a28c6d6a15511ab0eec),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:23.576064439 +0000 UTC m=+5.598805165,LastTimestamp:2025-12-05 12:29:24.578800811 +0000 UTC m=+6.601541537,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.783963 master-0 kubenswrapper[4780]: E1205 12:29:33.783781 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187e518ca6953955 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:5e09e2af7200e6f9be469dbfd9bb1127,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\" in 8.939s (8.939s including waiting). Image size: 938303566 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:28.507881813 +0000 UTC m=+10.530622539,LastTimestamp:2025-12-05 12:29:28.507881813 +0000 UTC m=+10.530622539,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.788383 master-0 kubenswrapper[4780]: E1205 12:29:33.788218 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e518ca93e3173 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\" in 9.014s (9.014s including waiting). 
Image size: 938303566 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:28.552509811 +0000 UTC m=+10.575250537,LastTimestamp:2025-12-05 12:29:28.552509811 +0000 UTC m=+10.575250537,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.795892 master-0 kubenswrapper[4780]: E1205 12:29:33.795286 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e518ca94abe06 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\" in 8.995s (8.995s including waiting). Image size: 938303566 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:28.55333223 +0000 UTC m=+10.576072966,LastTimestamp:2025-12-05 12:29:28.55333223 +0000 UTC m=+10.576072966,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.799540 master-0 kubenswrapper[4780]: E1205 12:29:33.799482 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187e518cb1aec6bd kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:5e09e2af7200e6f9be469dbfd9bb1127,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:28.694105789 +0000 UTC m=+10.716846525,LastTimestamp:2025-12-05 12:29:28.694105789 +0000 UTC m=+10.716846525,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.804225 master-0 kubenswrapper[4780]: E1205 12:29:33.804142 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187e518cb265bf9d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:5e09e2af7200e6f9be469dbfd9bb1127,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:28.706097053 +0000 UTC m=+10.728837779,LastTimestamp:2025-12-05 12:29:28.706097053 +0000 UTC 
m=+10.728837779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.807888 master-0 kubenswrapper[4780]: E1205 12:29:33.807792 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e518cb32824c6 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:28.718836934 +0000 UTC m=+10.741577660,LastTimestamp:2025-12-05 12:29:28.718836934 +0000 UTC m=+10.741577660,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.811319 master-0 kubenswrapper[4780]: E1205 12:29:33.811111 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e518cb46fb8cf kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:28.740305103 +0000 UTC m=+10.763045829,LastTimestamp:2025-12-05 12:29:28.740305103 +0000 UTC m=+10.763045829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.814601 master-0 kubenswrapper[4780]: E1205 12:29:33.814539 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e518cb47c7c4e kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d64c13fe7663a0b4ae61d103b1b7598adcf317a01826f296bcb66b1a2de83c96\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:28.741141582 +0000 UTC m=+10.763882308,LastTimestamp:2025-12-05 12:29:28.741141582 +0000 UTC m=+10.763882308,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.817990 master-0 kubenswrapper[4780]: E1205 12:29:33.817908 4780 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e518cb48a2128 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:28.742035752 +0000 UTC m=+10.764776478,LastTimestamp:2025-12-05 12:29:28.742035752 +0000 UTC m=+10.764776478,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.821831 master-0 kubenswrapper[4780]: E1205 12:29:33.821762 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e518cb52e415a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:28.752791898 +0000 UTC m=+10.775532624,LastTimestamp:2025-12-05 12:29:28.752791898 +0000 UTC m=+10.775532624,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.825840 master-0 kubenswrapper[4780]: E1205 12:29:33.825757 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e518ce7697e6a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:29.595534954 +0000 UTC m=+11.618275680,LastTimestamp:2025-12-05 12:29:29.595534954 +0000 UTC m=+11.618275680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.829989 master-0 kubenswrapper[4780]: E1205 12:29:33.829863 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e518cf1b871d6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:29.768481238 +0000 UTC m=+11.791221974,LastTimestamp:2025-12-05 12:29:29.768481238 +0000 UTC m=+11.791221974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.834090 master-0 kubenswrapper[4780]: E1205 12:29:33.833979 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e518cf296771c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:29.78303158 +0000 UTC m=+11.805772296,LastTimestamp:2025-12-05 12:29:29.78303158 +0000 UTC m=+11.805772296,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.837798 master-0 kubenswrapper[4780]: E1205 12:29:33.837726 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e518cf2a9e1f7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0f43c31aa3359159d4557dad3cfaf812d8ce44db9cb9ae970e06d3479070b660\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:29.784304119 +0000 UTC m=+11.807044845,LastTimestamp:2025-12-05 12:29:29.784304119 +0000 UTC m=+11.807044845,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.841717 master-0 kubenswrapper[4780]: E1205 12:29:33.841537 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e518d14d032cb kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d64c13fe7663a0b4ae61d103b1b7598adcf317a01826f296bcb66b1a2de83c96\" in 1.616s (1.616s including waiting). Image size: 499705918 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:30.357240523 +0000 UTC m=+12.379981249,LastTimestamp:2025-12-05 12:29:30.357240523 +0000 UTC m=+12.379981249,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.845709 master-0 kubenswrapper[4780]: E1205 12:29:33.845551 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e518d204842b2 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:30.54965829 +0000 UTC m=+12.572399016,LastTimestamp:2025-12-05 12:29:30.54965829 +0000 UTC m=+12.572399016,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.849692 master-0 kubenswrapper[4780]: E1205 12:29:33.849599 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e518d210f7511 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:30.562712849 +0000 UTC m=+12.585453575,LastTimestamp:2025-12-05 12:29:30.562712849 +0000 UTC m=+12.585453575,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.854838 master-0 kubenswrapper[4780]: E1205 12:29:33.854576 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e518d2426df61 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:30.614579041 +0000 UTC m=+12.637319777,LastTimestamp:2025-12-05 12:29:30.614579041 +0000 UTC m=+12.637319777,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.861232 master-0 kubenswrapper[4780]: E1205 12:29:33.861084 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.187e518cb32824c6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e518cb32824c6 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:28.718836934 +0000 UTC m=+10.741577660,LastTimestamp:2025-12-05 12:29:30.843823739 +0000 UTC m=+12.866564465,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.865547 master-0 kubenswrapper[4780]: E1205 12:29:33.865468 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.187e518cb46fb8cf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e518cb46fb8cf kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:28.740305103 +0000 UTC m=+10.763045829,LastTimestamp:2025-12-05 12:29:30.852852314 +0000 UTC m=+12.875593040,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.870039 master-0 kubenswrapper[4780]: E1205 12:29:33.869874 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e518dd9c403b6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0f43c31aa3359159d4557dad3cfaf812d8ce44db9cb9ae970e06d3479070b660\" in 3.877s (3.877s including waiting). Image size: 509437356 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:33.66155359 +0000 UTC m=+15.684294316,LastTimestamp:2025-12-05 12:29:33.66155359 +0000 UTC m=+15.684294316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.875724 master-0 kubenswrapper[4780]: E1205 12:29:33.875593 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e518de37dcb9f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:33.824723871 +0000 UTC m=+15.847464597,LastTimestamp:2025-12-05 12:29:33.824723871 +0000 UTC m=+15.847464597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:33.879565 master-0 kubenswrapper[4780]: E1205 12:29:33.879416 4780 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187e518de4291069 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:d75143d9bc4a2dc15781dc51ccff632a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:33.835948137 +0000 UTC m=+15.858688863,LastTimestamp:2025-12-05 12:29:33.835948137 +0000 UTC m=+15.858688863,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:34.203999 master-0 kubenswrapper[4780]: I1205 12:29:34.203880 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:34.477484 master-0 kubenswrapper[4780]: W1205 12:29:34.477088 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is 
forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:34.477484 master-0 kubenswrapper[4780]: E1205 12:29:34.477241 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Dec 05 12:29:34.628895 master-0 kubenswrapper[4780]: I1205 12:29:34.628779 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"d75143d9bc4a2dc15781dc51ccff632a","Type":"ContainerStarted","Data":"17618b3a98b21ba173e16cc99dae400fa4afea110eb46e1cd0bececa0e704d0d"} Dec 05 12:29:34.629781 master-0 kubenswrapper[4780]: I1205 12:29:34.628957 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:34.630394 master-0 kubenswrapper[4780]: I1205 12:29:34.630364 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:34.630394 master-0 kubenswrapper[4780]: I1205 12:29:34.630402 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:34.630657 master-0 kubenswrapper[4780]: I1205 12:29:34.630415 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:35.030058 master-0 kubenswrapper[4780]: I1205 12:29:35.029968 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:35.204946 master-0 kubenswrapper[4780]: I1205 12:29:35.204869 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:35.344499 master-0 kubenswrapper[4780]: W1205 12:29:35.344438 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Dec 05 12:29:35.344607 master-0 kubenswrapper[4780]: E1205 12:29:35.344539 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Dec 05 12:29:35.632697 master-0 kubenswrapper[4780]: I1205 12:29:35.632449 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:35.634065 master-0 kubenswrapper[4780]: I1205 12:29:35.633995 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:35.634162 master-0 kubenswrapper[4780]: I1205 12:29:35.634076 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:35.634162 master-0 kubenswrapper[4780]: I1205 12:29:35.634100 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 
12:29:35.700628 master-0 kubenswrapper[4780]: W1205 12:29:35.700502 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Dec 05 12:29:35.700628 master-0 kubenswrapper[4780]: E1205 12:29:35.700602 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Dec 05 12:29:36.205130 master-0 kubenswrapper[4780]: I1205 12:29:36.205044 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:36.635453 master-0 kubenswrapper[4780]: I1205 12:29:36.635292 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:36.636778 master-0 kubenswrapper[4780]: I1205 12:29:36.636743 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:36.636858 master-0 kubenswrapper[4780]: I1205 12:29:36.636803 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:36.636858 master-0 kubenswrapper[4780]: I1205 12:29:36.636828 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:37.206889 master-0 kubenswrapper[4780]: I1205 12:29:37.206825 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:37.530987 master-0 kubenswrapper[4780]: I1205 12:29:37.530769 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:37.532553 master-0 kubenswrapper[4780]: I1205 12:29:37.532518 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:37.532659 master-0 kubenswrapper[4780]: I1205 12:29:37.532648 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:37.532717 master-0 kubenswrapper[4780]: I1205 12:29:37.532708 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:37.533261 master-0 kubenswrapper[4780]: I1205 12:29:37.533248 4780 scope.go:117] "RemoveContainer" containerID="cfb62573630f2f15a19f4e137dc5aead8f45b739f70fb38341cbbd40d372a318" Dec 05 12:29:37.544460 master-0 kubenswrapper[4780]: E1205 12:29:37.544276 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187e518b23535463\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e518b23535463 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:22.010780771 +0000 UTC m=+4.033521497,LastTimestamp:2025-12-05 12:29:37.536428026 +0000 UTC m=+19.559168792,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:37.565212 master-0 kubenswrapper[4780]: I1205 12:29:37.565028 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:37.566035 master-0 kubenswrapper[4780]: I1205 12:29:37.565408 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:37.567252 master-0 kubenswrapper[4780]: I1205 12:29:37.567209 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:37.567314 master-0 kubenswrapper[4780]: I1205 12:29:37.567262 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:37.567314 master-0 kubenswrapper[4780]: I1205 12:29:37.567293 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:37.770643 master-0 kubenswrapper[4780]: E1205 12:29:37.770443 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187e518b306cbaef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e518b306cbaef openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:22.230549231 +0000 UTC m=+4.253289957,LastTimestamp:2025-12-05 12:29:37.76027133 +0000 UTC m=+19.783012056,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:37.784306 master-0 kubenswrapper[4780]: E1205 12:29:37.783360 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187e518b31764d00\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e518b31764d00 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:22.247953664 +0000 UTC m=+4.270694390,LastTimestamp:2025-12-05 12:29:37.774658268 +0000 UTC m=+19.797399014,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:37.811943 master-0 kubenswrapper[4780]: W1205 12:29:37.811880 4780 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Dec 05 12:29:37.812134 master-0 kubenswrapper[4780]: E1205 12:29:37.811938 4780 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Dec 05 12:29:37.918838 master-0 kubenswrapper[4780]: I1205 12:29:37.918737 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:37.919275 master-0 kubenswrapper[4780]: I1205 12:29:37.918904 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:37.920628 master-0 kubenswrapper[4780]: I1205 12:29:37.920586 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:37.920628 master-0 kubenswrapper[4780]: I1205 12:29:37.920622 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:37.920628 master-0 kubenswrapper[4780]: I1205 12:29:37.920635 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:37.925427 master-0 kubenswrapper[4780]: I1205 12:29:37.925385 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:38.202613 master-0 kubenswrapper[4780]: I1205 12:29:38.202518 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:38.210838 master-0 kubenswrapper[4780]: E1205 12:29:38.210744 4780 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Dec 05 12:29:38.272795 master-0 kubenswrapper[4780]: I1205 12:29:38.272639 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:38.274702 master-0 kubenswrapper[4780]: I1205 12:29:38.274625 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:38.274702 master-0 kubenswrapper[4780]: 
I1205 12:29:38.274704 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:38.274874 master-0 kubenswrapper[4780]: I1205 12:29:38.274751 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:38.274943 master-0 kubenswrapper[4780]: I1205 12:29:38.274882 4780 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 05 12:29:38.280883 master-0 kubenswrapper[4780]: E1205 12:29:38.280799 4780 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Dec 05 12:29:38.546130 master-0 kubenswrapper[4780]: E1205 12:29:38.545872 4780 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 05 12:29:38.643557 master-0 kubenswrapper[4780]: I1205 12:29:38.643457 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/2.log" Dec 05 12:29:38.644651 master-0 kubenswrapper[4780]: I1205 12:29:38.644594 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/1.log" Dec 05 12:29:38.645532 master-0 kubenswrapper[4780]: I1205 12:29:38.645479 4780 generic.go:334] "Generic (PLEG): container finished" podID="3169f44496ed8a28c6d6a15511ab0eec" containerID="1071666e1f17d7cf23c890a67d65947ba5ea19368b6f26e80669ec0f695e375b" exitCode=1 Dec 05 12:29:38.645672 master-0 kubenswrapper[4780]: I1205 12:29:38.645624 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerDied","Data":"1071666e1f17d7cf23c890a67d65947ba5ea19368b6f26e80669ec0f695e375b"} Dec 05 12:29:38.645738 master-0 kubenswrapper[4780]: I1205 12:29:38.645697 4780 scope.go:117] "RemoveContainer" containerID="cfb62573630f2f15a19f4e137dc5aead8f45b739f70fb38341cbbd40d372a318" Dec 05 12:29:38.645806 master-0 kubenswrapper[4780]: I1205 12:29:38.645744 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:38.646126 master-0 kubenswrapper[4780]: I1205 12:29:38.646024 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:38.647401 master-0 kubenswrapper[4780]: I1205 12:29:38.647319 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:38.647550 master-0 kubenswrapper[4780]: I1205 12:29:38.647404 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:38.647550 master-0 kubenswrapper[4780]: I1205 12:29:38.647434 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:38.647550 master-0 kubenswrapper[4780]: I1205 12:29:38.647499 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:38.647550 master-0 kubenswrapper[4780]: I1205 12:29:38.647538 4780 kubelet_node_status.go:724] "Recording event message for 
node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:38.648010 master-0 kubenswrapper[4780]: I1205 12:29:38.647555 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:38.649081 master-0 kubenswrapper[4780]: I1205 12:29:38.648956 4780 scope.go:117] "RemoveContainer" containerID="1071666e1f17d7cf23c890a67d65947ba5ea19368b6f26e80669ec0f695e375b" Dec 05 12:29:38.649383 master-0 kubenswrapper[4780]: E1205 12:29:38.649304 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(3169f44496ed8a28c6d6a15511ab0eec)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="3169f44496ed8a28c6d6a15511ab0eec" Dec 05 12:29:38.658524 master-0 kubenswrapper[4780]: E1205 12:29:38.658272 4780 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187e518b809fa9b7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187e518b809fa9b7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:3169f44496ed8a28c6d6a15511ab0eec,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(3169f44496ed8a28c6d6a15511ab0eec),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:29:23.576064439 +0000 UTC m=+5.598805165,LastTimestamp:2025-12-05 12:29:38.64923208 +0000 UTC m=+20.671972846,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:29:38.952131 master-0 kubenswrapper[4780]: I1205 12:29:38.952011 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:39.203869 master-0 kubenswrapper[4780]: I1205 12:29:39.203727 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:39.482433 master-0 kubenswrapper[4780]: I1205 12:29:39.482161 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:39.482706 master-0 kubenswrapper[4780]: I1205 12:29:39.482563 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:39.484637 master-0 kubenswrapper[4780]: I1205 12:29:39.484525 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:39.484637 master-0 kubenswrapper[4780]: I1205 12:29:39.484609 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:39.484637 master-0 kubenswrapper[4780]: 
I1205 12:29:39.484627 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:39.490214 master-0 kubenswrapper[4780]: I1205 12:29:39.490124 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:39.650836 master-0 kubenswrapper[4780]: I1205 12:29:39.650768 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/2.log" Dec 05 12:29:39.651681 master-0 kubenswrapper[4780]: I1205 12:29:39.651487 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:39.651681 master-0 kubenswrapper[4780]: I1205 12:29:39.651559 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:39.652722 master-0 kubenswrapper[4780]: I1205 12:29:39.652676 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:39.652927 master-0 kubenswrapper[4780]: I1205 12:29:39.652751 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:39.652927 master-0 kubenswrapper[4780]: I1205 12:29:39.652691 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:39.652927 master-0 kubenswrapper[4780]: I1205 12:29:39.652774 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:39.652927 master-0 kubenswrapper[4780]: I1205 12:29:39.652823 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:39.652927 master-0 kubenswrapper[4780]: I1205 12:29:39.652845 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:39.656996 master-0 kubenswrapper[4780]: I1205 12:29:39.656896 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:29:40.204835 master-0 kubenswrapper[4780]: I1205 12:29:40.204734 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:40.654771 master-0 kubenswrapper[4780]: I1205 12:29:40.654562 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:40.656121 master-0 kubenswrapper[4780]: I1205 12:29:40.656067 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:40.656121 master-0 kubenswrapper[4780]: I1205 12:29:40.656120 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:40.656404 master-0 kubenswrapper[4780]: I1205 12:29:40.656134 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:41.018882 master-0 kubenswrapper[4780]: I1205 12:29:41.018771 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:41.019166 master-0 kubenswrapper[4780]: I1205 12:29:41.019029 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:41.020416 master-0 kubenswrapper[4780]: I1205 12:29:41.020370 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:41.020416 master-0 kubenswrapper[4780]: I1205 12:29:41.020408 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:41.020416 master-0 kubenswrapper[4780]: I1205 12:29:41.020421 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:41.023906 master-0 kubenswrapper[4780]: I1205 12:29:41.023856 4780 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:41.026385 master-0 kubenswrapper[4780]: I1205 12:29:41.026324 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:41.206044 master-0 kubenswrapper[4780]: I1205 12:29:41.205853 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:41.656250 master-0 kubenswrapper[4780]: I1205 12:29:41.656155 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:41.657629 master-0 kubenswrapper[4780]: I1205 12:29:41.657563 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:41.657702 master-0 kubenswrapper[4780]: I1205 12:29:41.657648 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:41.657702 master-0 kubenswrapper[4780]: I1205 12:29:41.657674 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:41.660711 master-0 kubenswrapper[4780]: I1205 12:29:41.660663 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:29:42.204670 master-0 kubenswrapper[4780]: I1205 12:29:42.204376 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:42.568346 master-0 kubenswrapper[4780]: I1205 12:29:42.568147 4780 csr.go:261] certificate signing request csr-pl4jp is approved, waiting to be issued Dec 05 12:29:42.658952 master-0 kubenswrapper[4780]: I1205 12:29:42.658894 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:42.660016 master-0 kubenswrapper[4780]: I1205 12:29:42.659982 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:42.660121 master-0 kubenswrapper[4780]: I1205 12:29:42.660030 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 
12:29:42.660121 master-0 kubenswrapper[4780]: I1205 12:29:42.660044 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:43.203649 master-0 kubenswrapper[4780]: I1205 12:29:43.203558 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:43.661592 master-0 kubenswrapper[4780]: I1205 12:29:43.661355 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:43.662650 master-0 kubenswrapper[4780]: I1205 12:29:43.662575 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:43.662736 master-0 kubenswrapper[4780]: I1205 12:29:43.662658 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:43.662736 master-0 kubenswrapper[4780]: I1205 12:29:43.662682 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:44.204815 master-0 kubenswrapper[4780]: I1205 12:29:44.204707 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:45.205503 master-0 kubenswrapper[4780]: I1205 12:29:45.205403 4780 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:45.218543 master-0 kubenswrapper[4780]: E1205 12:29:45.218489 4780 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Dec 05 12:29:45.281430 master-0 kubenswrapper[4780]: I1205 12:29:45.281341 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:45.282957 master-0 kubenswrapper[4780]: I1205 12:29:45.282873 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:45.283028 master-0 kubenswrapper[4780]: I1205 12:29:45.282964 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:45.283028 master-0 kubenswrapper[4780]: I1205 12:29:45.282987 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:45.283243 master-0 kubenswrapper[4780]: I1205 12:29:45.283059 4780 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 05 12:29:45.290767 master-0 kubenswrapper[4780]: E1205 12:29:45.290699 4780 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Dec 05 12:29:46.204709 master-0 kubenswrapper[4780]: I1205 12:29:46.204651 4780 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 05 12:29:47.128591 master-0 kubenswrapper[4780]: I1205 12:29:47.128525 4780 csr.go:257] certificate signing request csr-pl4jp is issued Dec 05 12:29:47.139661 master-0 kubenswrapper[4780]: I1205 12:29:47.139579 4780 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 05 12:29:47.211579 master-0 kubenswrapper[4780]: I1205 12:29:47.211499 4780 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 05 12:29:47.231100 master-0 kubenswrapper[4780]: I1205 12:29:47.231058 4780 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 05 12:29:47.289530 master-0 kubenswrapper[4780]: I1205 12:29:47.289457 4780 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 05 12:29:47.566664 master-0 kubenswrapper[4780]: I1205 12:29:47.566623 4780 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 05 12:29:47.566934 master-0 kubenswrapper[4780]: E1205 12:29:47.566920 4780 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Dec 05 12:29:47.587901 master-0 kubenswrapper[4780]: I1205 12:29:47.587839 4780 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 05 12:29:47.613206 master-0 kubenswrapper[4780]: I1205 12:29:47.613123 4780 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 05 12:29:47.671696 master-0 kubenswrapper[4780]: I1205 12:29:47.671635 4780 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 05 12:29:47.928198 master-0 kubenswrapper[4780]: I1205 12:29:47.928082 4780 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 05 12:29:47.928480 master-0 kubenswrapper[4780]: E1205 12:29:47.928470 4780 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Dec 05 12:29:48.032098 master-0 kubenswrapper[4780]: I1205 12:29:48.032027 4780 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 05 12:29:48.048418 master-0 kubenswrapper[4780]: I1205 12:29:48.048345 4780 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 05 12:29:48.108209 master-0 kubenswrapper[4780]: I1205 12:29:48.108099 4780 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 05 12:29:48.130966 master-0 kubenswrapper[4780]: I1205 12:29:48.130562 4780 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2025-12-06 12:20:41 +0000 UTC, rotation deadline is 2025-12-06 08:32:44.248283239 +0000 UTC Dec 05 12:29:48.130966 master-0 kubenswrapper[4780]: I1205 12:29:48.130627 4780 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 20h2m56.117660506s for next certificate rotation Dec 05 12:29:48.379255 master-0 kubenswrapper[4780]: I1205 12:29:48.379165 4780 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 05 12:29:48.379255 master-0 kubenswrapper[4780]: E1205 12:29:48.379237 4780 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out 
waiting for the condition; caused by: nodes "master-0" not found Dec 05 12:29:48.546478 master-0 kubenswrapper[4780]: E1205 12:29:48.546387 4780 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 05 12:29:48.940452 master-0 kubenswrapper[4780]: I1205 12:29:48.940389 4780 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 05 12:29:48.957244 master-0 kubenswrapper[4780]: I1205 12:29:48.957098 4780 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 05 12:29:49.015764 master-0 kubenswrapper[4780]: I1205 12:29:49.015674 4780 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 05 12:29:49.299136 master-0 kubenswrapper[4780]: I1205 12:29:49.299002 4780 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 05 12:29:49.299136 master-0 kubenswrapper[4780]: E1205 12:29:49.299049 4780 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Dec 05 12:29:49.712926 master-0 kubenswrapper[4780]: I1205 12:29:49.712850 4780 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 12:29:52.203077 master-0 kubenswrapper[4780]: I1205 12:29:52.202986 4780 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 12:29:52.225832 master-0 kubenswrapper[4780]: E1205 12:29:52.225739 4780 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0" Dec 05 12:29:52.291348 master-0 kubenswrapper[4780]: I1205 12:29:52.291234 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:52.292785 master-0 kubenswrapper[4780]: I1205 12:29:52.292737 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:52.292861 master-0 kubenswrapper[4780]: I1205 12:29:52.292800 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:52.292861 master-0 kubenswrapper[4780]: I1205 12:29:52.292823 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:52.292931 master-0 kubenswrapper[4780]: I1205 12:29:52.292903 4780 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 05 12:29:52.306463 master-0 kubenswrapper[4780]: I1205 12:29:52.306381 4780 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Dec 05 12:29:52.306668 master-0 kubenswrapper[4780]: E1205 12:29:52.306491 4780 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Dec 05 12:29:52.318967 master-0 kubenswrapper[4780]: E1205 12:29:52.318866 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:52.419810 master-0 kubenswrapper[4780]: E1205 12:29:52.419685 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:52.520804 master-0 kubenswrapper[4780]: E1205 12:29:52.520628 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" 
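
The entries above record the kubelet's bootstrap credential handoff. Until its client certificate is issued, the kubelet reaches the API server as system:anonymous, which is why every event create/patch, node registration, lease lookup, and CSINode/CSIDriver/RuntimeClass get or list is rejected by RBAC. Once csr-pl4jp is approved (12:29:42) and issued (12:29:47), the certificate rotation tears down the old client connections, the informer caches begin to populate, and "Successfully registered node" follows at 12:29:52; the remaining "node \"master-0\" not found" messages persist only until the Node informer cache fills shortly afterwards (12:29:54, below). The client-go sketch that follows is one way to watch that handoff from outside the node; it is illustrative only, and the kubeconfig path it uses is an assumption, not taken from this log.

// csrcheck.go: list CertificateSigningRequests and report approval/issuance,
// mirroring the csr-pl4jp "approved, waiting to be issued" -> "issued" sequence above.
package main

import (
	"context"
	"fmt"

	certificatesv1 "k8s.io/api/certificates/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Kubeconfig path is illustrative; any credential allowed to read
	// certificatesigningrequests at cluster scope will do.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	csrs, err := cs.CertificatesV1().CertificateSigningRequests().List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, csr := range csrs.Items {
		state := "Pending"
		for _, cond := range csr.Status.Conditions {
			switch cond.Type {
			case certificatesv1.CertificateApproved:
				state = "Approved"
			case certificatesv1.CertificateDenied:
				state = "Denied"
			}
		}
		// Status.Certificate is only populated once the signer has issued the certificate.
		fmt.Printf("%s signer=%s requestor=%s state=%s issued=%t\n",
			csr.Name, csr.Spec.SignerName, csr.Spec.Username, state, len(csr.Status.Certificate) > 0)
	}
}

On a live cluster the same information is available with kubectl get csr (or oc get csr); the Go form is shown only to make the approval and issuance conditions explicit.
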
Dec 05 12:29:52.621718 master-0 kubenswrapper[4780]: E1205 12:29:52.621661 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:52.722720 master-0 kubenswrapper[4780]: E1205 12:29:52.722644 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:52.823509 master-0 kubenswrapper[4780]: E1205 12:29:52.823358 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:52.924686 master-0 kubenswrapper[4780]: E1205 12:29:52.924577 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:53.026452 master-0 kubenswrapper[4780]: E1205 12:29:53.026340 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:53.126960 master-0 kubenswrapper[4780]: E1205 12:29:53.126752 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:53.223993 master-0 kubenswrapper[4780]: I1205 12:29:53.223923 4780 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Dec 05 12:29:53.227140 master-0 kubenswrapper[4780]: E1205 12:29:53.227060 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:53.237340 master-0 kubenswrapper[4780]: I1205 12:29:53.237290 4780 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 05 12:29:53.327865 master-0 kubenswrapper[4780]: E1205 12:29:53.327781 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:53.428730 master-0 kubenswrapper[4780]: E1205 12:29:53.428579 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:53.529295 master-0 kubenswrapper[4780]: E1205 12:29:53.529236 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:53.530630 master-0 kubenswrapper[4780]: I1205 12:29:53.530573 4780 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:29:53.532040 master-0 kubenswrapper[4780]: I1205 12:29:53.532000 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:29:53.532089 master-0 kubenswrapper[4780]: I1205 12:29:53.532049 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:29:53.532089 master-0 kubenswrapper[4780]: I1205 12:29:53.532062 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:29:53.535032 master-0 kubenswrapper[4780]: I1205 12:29:53.534967 4780 scope.go:117] "RemoveContainer" containerID="1071666e1f17d7cf23c890a67d65947ba5ea19368b6f26e80669ec0f695e375b" Dec 05 12:29:53.538093 master-0 kubenswrapper[4780]: E1205 12:29:53.538035 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio 
pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(3169f44496ed8a28c6d6a15511ab0eec)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="3169f44496ed8a28c6d6a15511ab0eec" Dec 05 12:29:53.629646 master-0 kubenswrapper[4780]: E1205 12:29:53.629595 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:53.730680 master-0 kubenswrapper[4780]: E1205 12:29:53.730594 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:53.831693 master-0 kubenswrapper[4780]: E1205 12:29:53.831612 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:53.932762 master-0 kubenswrapper[4780]: E1205 12:29:53.932653 4780 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 05 12:29:54.023096 master-0 kubenswrapper[4780]: I1205 12:29:54.022929 4780 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 12:29:54.224442 master-0 kubenswrapper[4780]: I1205 12:29:54.224364 4780 apiserver.go:52] "Watching apiserver" Dec 05 12:29:54.228333 master-0 kubenswrapper[4780]: I1205 12:29:54.228280 4780 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 12:29:54.228508 master-0 kubenswrapper[4780]: I1205 12:29:54.228462 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh","openshift-network-operator/network-operator-79767b7ff9-h8qkj"] Dec 05 12:29:54.228959 master-0 kubenswrapper[4780]: I1205 12:29:54.228923 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:54.229089 master-0 kubenswrapper[4780]: I1205 12:29:54.229017 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:29:54.231438 master-0 kubenswrapper[4780]: I1205 12:29:54.231371 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 12:29:54.231976 master-0 kubenswrapper[4780]: I1205 12:29:54.231906 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 12:29:54.232506 master-0 kubenswrapper[4780]: I1205 12:29:54.232458 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 12:29:54.232758 master-0 kubenswrapper[4780]: I1205 12:29:54.232716 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 12:29:54.233172 master-0 kubenswrapper[4780]: I1205 12:29:54.233115 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 12:29:54.233391 master-0 kubenswrapper[4780]: I1205 12:29:54.233342 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 12:29:54.303487 master-0 kubenswrapper[4780]: I1205 12:29:54.303245 4780 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Dec 05 12:29:54.351932 master-0 kubenswrapper[4780]: I1205 12:29:54.351817 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/29812c4b-48ac-488c-863c-1d52e39ea2ae-service-ca\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:54.351932 master-0 kubenswrapper[4780]: I1205 12:29:54.351909 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5efad170-c154-42ec-a7c0-b36a98d2bfcc-metrics-tls\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:29:54.352393 master-0 kubenswrapper[4780]: I1205 12:29:54.351958 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-ssl-certs\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:54.352393 master-0 kubenswrapper[4780]: I1205 12:29:54.352091 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29812c4b-48ac-488c-863c-1d52e39ea2ae-kube-api-access\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:54.352393 master-0 kubenswrapper[4780]: I1205 12:29:54.352257 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/5efad170-c154-42ec-a7c0-b36a98d2bfcc-host-etc-kube\") pod 
\"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:29:54.352393 master-0 kubenswrapper[4780]: I1205 12:29:54.352343 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-cvo-updatepayloads\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:54.352393 master-0 kubenswrapper[4780]: I1205 12:29:54.352394 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-996h9\" (UniqueName: \"kubernetes.io/projected/5efad170-c154-42ec-a7c0-b36a98d2bfcc-kube-api-access-996h9\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:29:54.352393 master-0 kubenswrapper[4780]: I1205 12:29:54.352430 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:54.453406 master-0 kubenswrapper[4780]: I1205 12:29:54.453307 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/5efad170-c154-42ec-a7c0-b36a98d2bfcc-host-etc-kube\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:29:54.453406 master-0 kubenswrapper[4780]: I1205 12:29:54.453380 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5efad170-c154-42ec-a7c0-b36a98d2bfcc-metrics-tls\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:29:54.453406 master-0 kubenswrapper[4780]: I1205 12:29:54.453410 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-ssl-certs\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:54.453861 master-0 kubenswrapper[4780]: I1205 12:29:54.453608 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/5efad170-c154-42ec-a7c0-b36a98d2bfcc-host-etc-kube\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:29:54.453861 master-0 kubenswrapper[4780]: I1205 12:29:54.453634 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29812c4b-48ac-488c-863c-1d52e39ea2ae-kube-api-access\") pod 
\"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:54.453861 master-0 kubenswrapper[4780]: I1205 12:29:54.453743 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-cvo-updatepayloads\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:54.453861 master-0 kubenswrapper[4780]: I1205 12:29:54.453770 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-996h9\" (UniqueName: \"kubernetes.io/projected/5efad170-c154-42ec-a7c0-b36a98d2bfcc-kube-api-access-996h9\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:29:54.453861 master-0 kubenswrapper[4780]: I1205 12:29:54.453764 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-ssl-certs\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:54.453861 master-0 kubenswrapper[4780]: I1205 12:29:54.453842 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-cvo-updatepayloads\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:54.454249 master-0 kubenswrapper[4780]: I1205 12:29:54.454052 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:54.454249 master-0 kubenswrapper[4780]: I1205 12:29:54.454160 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/29812c4b-48ac-488c-863c-1d52e39ea2ae-service-ca\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:54.454370 master-0 kubenswrapper[4780]: E1205 12:29:54.454267 4780 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 05 12:29:54.454370 master-0 kubenswrapper[4780]: E1205 12:29:54.454368 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert podName:29812c4b-48ac-488c-863c-1d52e39ea2ae nodeName:}" failed. No retries permitted until 2025-12-05 12:29:54.954339126 +0000 UTC m=+36.977080032 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2chqh" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae") : secret "cluster-version-operator-serving-cert" not found Dec 05 12:29:54.455528 master-0 kubenswrapper[4780]: I1205 12:29:54.455485 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/29812c4b-48ac-488c-863c-1d52e39ea2ae-service-ca\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:54.455610 master-0 kubenswrapper[4780]: I1205 12:29:54.455551 4780 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 05 12:29:54.461690 master-0 kubenswrapper[4780]: I1205 12:29:54.461614 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5efad170-c154-42ec-a7c0-b36a98d2bfcc-metrics-tls\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:29:54.476042 master-0 kubenswrapper[4780]: I1205 12:29:54.475967 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-996h9\" (UniqueName: \"kubernetes.io/projected/5efad170-c154-42ec-a7c0-b36a98d2bfcc-kube-api-access-996h9\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:29:54.478815 master-0 kubenswrapper[4780]: I1205 12:29:54.478760 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29812c4b-48ac-488c-863c-1d52e39ea2ae-kube-api-access\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:54.552243 master-0 kubenswrapper[4780]: I1205 12:29:54.552123 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:29:54.572808 master-0 kubenswrapper[4780]: W1205 12:29:54.572711 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5efad170_c154_42ec_a7c0_b36a98d2bfcc.slice/crio-38012800baf13255ee676c8bd3688f9cc8eb6dcd0e296ee14ea80782e75670a8 WatchSource:0}: Error finding container 38012800baf13255ee676c8bd3688f9cc8eb6dcd0e296ee14ea80782e75670a8: Status 404 returned error can't find the container with id 38012800baf13255ee676c8bd3688f9cc8eb6dcd0e296ee14ea80782e75670a8 Dec 05 12:29:54.693790 master-0 kubenswrapper[4780]: I1205 12:29:54.693693 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" event={"ID":"5efad170-c154-42ec-a7c0-b36a98d2bfcc","Type":"ContainerStarted","Data":"38012800baf13255ee676c8bd3688f9cc8eb6dcd0e296ee14ea80782e75670a8"} Dec 05 12:29:54.957756 master-0 kubenswrapper[4780]: I1205 12:29:54.957631 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:54.958066 master-0 kubenswrapper[4780]: E1205 12:29:54.957849 4780 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 05 12:29:54.958066 master-0 kubenswrapper[4780]: E1205 12:29:54.958013 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert podName:29812c4b-48ac-488c-863c-1d52e39ea2ae nodeName:}" failed. No retries permitted until 2025-12-05 12:29:55.957908266 +0000 UTC m=+37.980649002 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2chqh" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae") : secret "cluster-version-operator-serving-cert" not found Dec 05 12:29:55.964662 master-0 kubenswrapper[4780]: I1205 12:29:55.964590 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:55.965286 master-0 kubenswrapper[4780]: E1205 12:29:55.964812 4780 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 05 12:29:55.965286 master-0 kubenswrapper[4780]: E1205 12:29:55.964927 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert podName:29812c4b-48ac-488c-863c-1d52e39ea2ae nodeName:}" failed. No retries permitted until 2025-12-05 12:29:57.964904111 +0000 UTC m=+39.987644837 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2chqh" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae") : secret "cluster-version-operator-serving-cert" not found Dec 05 12:29:56.295389 master-0 kubenswrapper[4780]: I1205 12:29:56.295104 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/assisted-installer-controller-m6pn4"] Dec 05 12:29:56.296282 master-0 kubenswrapper[4780]: I1205 12:29:56.296227 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.298690 master-0 kubenswrapper[4780]: I1205 12:29:56.298636 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt" Dec 05 12:29:56.299037 master-0 kubenswrapper[4780]: I1205 12:29:56.298990 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config" Dec 05 12:29:56.299443 master-0 kubenswrapper[4780]: I1205 12:29:56.299396 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt" Dec 05 12:29:56.299856 master-0 kubenswrapper[4780]: I1205 12:29:56.299660 4780 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret" Dec 05 12:29:56.367316 master-0 kubenswrapper[4780]: I1205 12:29:56.367173 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-ca-bundle\") pod \"assisted-installer-controller-m6pn4\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.367316 master-0 kubenswrapper[4780]: I1205 12:29:56.367280 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjlnw\" (UniqueName: \"kubernetes.io/projected/e7807b90-1059-4c0d-9224-a0d57a572bfc-kube-api-access-mjlnw\") pod \"assisted-installer-controller-m6pn4\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.367316 master-0 kubenswrapper[4780]: I1205 12:29:56.367307 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-var-run-resolv-conf\") pod \"assisted-installer-controller-m6pn4\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.367316 master-0 kubenswrapper[4780]: I1205 12:29:56.367329 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-sno-bootstrap-files\") pod \"assisted-installer-controller-m6pn4\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.367746 master-0 kubenswrapper[4780]: I1205 12:29:56.367488 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-resolv-conf\" (UniqueName: 
\"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-resolv-conf\") pod \"assisted-installer-controller-m6pn4\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.468164 master-0 kubenswrapper[4780]: I1205 12:29:56.468083 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-resolv-conf\") pod \"assisted-installer-controller-m6pn4\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.468164 master-0 kubenswrapper[4780]: I1205 12:29:56.468166 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-ca-bundle\") pod \"assisted-installer-controller-m6pn4\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.468164 master-0 kubenswrapper[4780]: I1205 12:29:56.468203 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjlnw\" (UniqueName: \"kubernetes.io/projected/e7807b90-1059-4c0d-9224-a0d57a572bfc-kube-api-access-mjlnw\") pod \"assisted-installer-controller-m6pn4\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.468711 master-0 kubenswrapper[4780]: I1205 12:29:56.468335 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-resolv-conf\") pod \"assisted-installer-controller-m6pn4\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.468711 master-0 kubenswrapper[4780]: I1205 12:29:56.468498 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-var-run-resolv-conf\") pod \"assisted-installer-controller-m6pn4\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.468711 master-0 kubenswrapper[4780]: I1205 12:29:56.468595 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-sno-bootstrap-files\") pod \"assisted-installer-controller-m6pn4\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.468711 master-0 kubenswrapper[4780]: I1205 12:29:56.468606 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-var-run-resolv-conf\") pod \"assisted-installer-controller-m6pn4\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.469087 master-0 kubenswrapper[4780]: I1205 12:29:56.468727 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-sno-bootstrap-files\") pod 
\"assisted-installer-controller-m6pn4\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.469087 master-0 kubenswrapper[4780]: I1205 12:29:56.468737 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-ca-bundle\") pod \"assisted-installer-controller-m6pn4\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.493462 master-0 kubenswrapper[4780]: I1205 12:29:56.493392 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjlnw\" (UniqueName: \"kubernetes.io/projected/e7807b90-1059-4c0d-9224-a0d57a572bfc-kube-api-access-mjlnw\") pod \"assisted-installer-controller-m6pn4\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.637595 master-0 kubenswrapper[4780]: I1205 12:29:56.637419 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:29:56.890619 master-0 kubenswrapper[4780]: I1205 12:29:56.890422 4780 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 12:29:57.357910 master-0 kubenswrapper[4780]: W1205 12:29:57.357525 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7807b90_1059_4c0d_9224_a0d57a572bfc.slice/crio-75e9032b6429595bd1a6f97d2cb17682f851991bfdb1cc650ef529d5407494ac WatchSource:0}: Error finding container 75e9032b6429595bd1a6f97d2cb17682f851991bfdb1cc650ef529d5407494ac: Status 404 returned error can't find the container with id 75e9032b6429595bd1a6f97d2cb17682f851991bfdb1cc650ef529d5407494ac Dec 05 12:29:57.702729 master-0 kubenswrapper[4780]: I1205 12:29:57.702669 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" event={"ID":"5efad170-c154-42ec-a7c0-b36a98d2bfcc","Type":"ContainerStarted","Data":"0caaca757a34c0215195111520c95615b587485cd660ccd63c3b233f466666bb"} Dec 05 12:29:57.703578 master-0 kubenswrapper[4780]: I1205 12:29:57.703547 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-m6pn4" event={"ID":"e7807b90-1059-4c0d-9224-a0d57a572bfc","Type":"ContainerStarted","Data":"75e9032b6429595bd1a6f97d2cb17682f851991bfdb1cc650ef529d5407494ac"} Dec 05 12:29:57.978718 master-0 kubenswrapper[4780]: I1205 12:29:57.978527 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:29:57.978953 master-0 kubenswrapper[4780]: E1205 12:29:57.978792 4780 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 05 12:29:57.978953 master-0 kubenswrapper[4780]: E1205 12:29:57.978902 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert 
podName:29812c4b-48ac-488c-863c-1d52e39ea2ae nodeName:}" failed. No retries permitted until 2025-12-05 12:30:01.978867892 +0000 UTC m=+44.001608658 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2chqh" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae") : secret "cluster-version-operator-serving-cert" not found Dec 05 12:30:02.007583 master-0 kubenswrapper[4780]: I1205 12:30:02.006711 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:30:02.007583 master-0 kubenswrapper[4780]: E1205 12:30:02.006930 4780 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 05 12:30:02.007583 master-0 kubenswrapper[4780]: E1205 12:30:02.007063 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert podName:29812c4b-48ac-488c-863c-1d52e39ea2ae nodeName:}" failed. No retries permitted until 2025-12-05 12:30:10.007036017 +0000 UTC m=+52.029776743 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2chqh" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae") : secret "cluster-version-operator-serving-cert" not found Dec 05 12:30:02.501490 master-0 kubenswrapper[4780]: I1205 12:30:02.501292 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" podStartSLOduration=7.677149856 podStartE2EDuration="10.501250446s" podCreationTimestamp="2025-12-05 12:29:52 +0000 UTC" firstStartedPulling="2025-12-05 12:29:54.575419322 +0000 UTC m=+36.598160058" lastFinishedPulling="2025-12-05 12:29:57.399519922 +0000 UTC m=+39.422260648" observedRunningTime="2025-12-05 12:29:57.72197167 +0000 UTC m=+39.744712416" watchObservedRunningTime="2025-12-05 12:30:02.501250446 +0000 UTC m=+44.523991172" Dec 05 12:30:02.501490 master-0 kubenswrapper[4780]: I1205 12:30:02.501515 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-ljsd2"] Dec 05 12:30:02.502429 master-0 kubenswrapper[4780]: I1205 12:30:02.501864 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-ljsd2" Dec 05 12:30:02.610369 master-0 kubenswrapper[4780]: I1205 12:30:02.610273 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fzzp\" (UniqueName: \"kubernetes.io/projected/b58aa15d-cdea-4a90-ba40-706d6a85735e-kube-api-access-7fzzp\") pod \"mtu-prober-ljsd2\" (UID: \"b58aa15d-cdea-4a90-ba40-706d6a85735e\") " pod="openshift-network-operator/mtu-prober-ljsd2" Dec 05 12:30:02.711163 master-0 kubenswrapper[4780]: I1205 12:30:02.711090 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fzzp\" (UniqueName: \"kubernetes.io/projected/b58aa15d-cdea-4a90-ba40-706d6a85735e-kube-api-access-7fzzp\") pod \"mtu-prober-ljsd2\" (UID: \"b58aa15d-cdea-4a90-ba40-706d6a85735e\") " pod="openshift-network-operator/mtu-prober-ljsd2" Dec 05 12:30:02.716929 master-0 kubenswrapper[4780]: I1205 12:30:02.716868 4780 generic.go:334] "Generic (PLEG): container finished" podID="e7807b90-1059-4c0d-9224-a0d57a572bfc" containerID="ded126662555b11ef5f6022975feef5471a12cb6870d5933adf38dcb51422cc7" exitCode=0 Dec 05 12:30:02.716929 master-0 kubenswrapper[4780]: I1205 12:30:02.716920 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-m6pn4" event={"ID":"e7807b90-1059-4c0d-9224-a0d57a572bfc","Type":"ContainerDied","Data":"ded126662555b11ef5f6022975feef5471a12cb6870d5933adf38dcb51422cc7"} Dec 05 12:30:02.731145 master-0 kubenswrapper[4780]: I1205 12:30:02.731091 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fzzp\" (UniqueName: \"kubernetes.io/projected/b58aa15d-cdea-4a90-ba40-706d6a85735e-kube-api-access-7fzzp\") pod \"mtu-prober-ljsd2\" (UID: \"b58aa15d-cdea-4a90-ba40-706d6a85735e\") " pod="openshift-network-operator/mtu-prober-ljsd2" Dec 05 12:30:02.813834 master-0 kubenswrapper[4780]: I1205 12:30:02.813763 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-ljsd2" Dec 05 12:30:02.833546 master-0 kubenswrapper[4780]: W1205 12:30:02.833430 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb58aa15d_cdea_4a90_ba40_706d6a85735e.slice/crio-3564c4f7be418f0fb673bf15e0683bd1a8124446576d69fea4a76db52877c172 WatchSource:0}: Error finding container 3564c4f7be418f0fb673bf15e0683bd1a8124446576d69fea4a76db52877c172: Status 404 returned error can't find the container with id 3564c4f7be418f0fb673bf15e0683bd1a8124446576d69fea4a76db52877c172 Dec 05 12:30:03.405910 master-0 kubenswrapper[4780]: I1205 12:30:03.405515 4780 csr.go:261] certificate signing request csr-p7hf4 is approved, waiting to be issued Dec 05 12:30:03.413289 master-0 kubenswrapper[4780]: I1205 12:30:03.413132 4780 csr.go:257] certificate signing request csr-p7hf4 is issued Dec 05 12:30:03.721889 master-0 kubenswrapper[4780]: I1205 12:30:03.721781 4780 generic.go:334] "Generic (PLEG): container finished" podID="b58aa15d-cdea-4a90-ba40-706d6a85735e" containerID="87e2f0751f7349d9f2700480abbb17089facf86a7329bd4aecf04d7f2bed205a" exitCode=0 Dec 05 12:30:03.722317 master-0 kubenswrapper[4780]: I1205 12:30:03.721946 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-ljsd2" event={"ID":"b58aa15d-cdea-4a90-ba40-706d6a85735e","Type":"ContainerDied","Data":"87e2f0751f7349d9f2700480abbb17089facf86a7329bd4aecf04d7f2bed205a"} Dec 05 12:30:03.722317 master-0 kubenswrapper[4780]: I1205 12:30:03.722070 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-ljsd2" event={"ID":"b58aa15d-cdea-4a90-ba40-706d6a85735e","Type":"ContainerStarted","Data":"3564c4f7be418f0fb673bf15e0683bd1a8124446576d69fea4a76db52877c172"} Dec 05 12:30:03.742079 master-0 kubenswrapper[4780]: I1205 12:30:03.742002 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:30:03.820574 master-0 kubenswrapper[4780]: I1205 12:30:03.820435 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-resolv-conf\") pod \"e7807b90-1059-4c0d-9224-a0d57a572bfc\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " Dec 05 12:30:03.820574 master-0 kubenswrapper[4780]: I1205 12:30:03.820531 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-sno-bootstrap-files\") pod \"e7807b90-1059-4c0d-9224-a0d57a572bfc\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " Dec 05 12:30:03.820574 master-0 kubenswrapper[4780]: I1205 12:30:03.820578 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-var-run-resolv-conf\") pod \"e7807b90-1059-4c0d-9224-a0d57a572bfc\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " Dec 05 12:30:03.821030 master-0 kubenswrapper[4780]: I1205 12:30:03.820634 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-ca-bundle\") pod \"e7807b90-1059-4c0d-9224-a0d57a572bfc\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " Dec 05 12:30:03.821030 master-0 kubenswrapper[4780]: I1205 12:30:03.820637 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "e7807b90-1059-4c0d-9224-a0d57a572bfc" (UID: "e7807b90-1059-4c0d-9224-a0d57a572bfc"). InnerVolumeSpecName "host-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:30:03.821030 master-0 kubenswrapper[4780]: I1205 12:30:03.820690 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjlnw\" (UniqueName: \"kubernetes.io/projected/e7807b90-1059-4c0d-9224-a0d57a572bfc-kube-api-access-mjlnw\") pod \"e7807b90-1059-4c0d-9224-a0d57a572bfc\" (UID: \"e7807b90-1059-4c0d-9224-a0d57a572bfc\") " Dec 05 12:30:03.821030 master-0 kubenswrapper[4780]: I1205 12:30:03.820829 4780 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-resolv-conf\") on node \"master-0\" DevicePath \"\"" Dec 05 12:30:03.821030 master-0 kubenswrapper[4780]: I1205 12:30:03.820683 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-sno-bootstrap-files" (OuterVolumeSpecName: "sno-bootstrap-files") pod "e7807b90-1059-4c0d-9224-a0d57a572bfc" (UID: "e7807b90-1059-4c0d-9224-a0d57a572bfc"). InnerVolumeSpecName "sno-bootstrap-files". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:30:03.821030 master-0 kubenswrapper[4780]: I1205 12:30:03.820734 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "e7807b90-1059-4c0d-9224-a0d57a572bfc" (UID: "e7807b90-1059-4c0d-9224-a0d57a572bfc"). 
InnerVolumeSpecName "host-ca-bundle". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:30:03.821030 master-0 kubenswrapper[4780]: I1205 12:30:03.820774 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "e7807b90-1059-4c0d-9224-a0d57a572bfc" (UID: "e7807b90-1059-4c0d-9224-a0d57a572bfc"). InnerVolumeSpecName "host-var-run-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:30:03.825887 master-0 kubenswrapper[4780]: I1205 12:30:03.825789 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7807b90-1059-4c0d-9224-a0d57a572bfc-kube-api-access-mjlnw" (OuterVolumeSpecName: "kube-api-access-mjlnw") pod "e7807b90-1059-4c0d-9224-a0d57a572bfc" (UID: "e7807b90-1059-4c0d-9224-a0d57a572bfc"). InnerVolumeSpecName "kube-api-access-mjlnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:30:03.922109 master-0 kubenswrapper[4780]: I1205 12:30:03.922050 4780 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 12:30:03.922109 master-0 kubenswrapper[4780]: I1205 12:30:03.922091 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjlnw\" (UniqueName: \"kubernetes.io/projected/e7807b90-1059-4c0d-9224-a0d57a572bfc-kube-api-access-mjlnw\") on node \"master-0\" DevicePath \"\"" Dec 05 12:30:03.922109 master-0 kubenswrapper[4780]: I1205 12:30:03.922101 4780 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-sno-bootstrap-files\") on node \"master-0\" DevicePath \"\"" Dec 05 12:30:03.922109 master-0 kubenswrapper[4780]: I1205 12:30:03.922110 4780 reconciler_common.go:293] "Volume detached for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/e7807b90-1059-4c0d-9224-a0d57a572bfc-host-var-run-resolv-conf\") on node \"master-0\" DevicePath \"\"" Dec 05 12:30:04.415694 master-0 kubenswrapper[4780]: I1205 12:30:04.415614 4780 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-12-06 12:20:41 +0000 UTC, rotation deadline is 2025-12-06 05:40:51.584119575 +0000 UTC Dec 05 12:30:04.415694 master-0 kubenswrapper[4780]: I1205 12:30:04.415675 4780 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h10m47.16844969s for next certificate rotation Dec 05 12:30:04.724967 master-0 kubenswrapper[4780]: I1205 12:30:04.724859 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:30:04.724967 master-0 kubenswrapper[4780]: I1205 12:30:04.724955 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-m6pn4" event={"ID":"e7807b90-1059-4c0d-9224-a0d57a572bfc","Type":"ContainerDied","Data":"75e9032b6429595bd1a6f97d2cb17682f851991bfdb1cc650ef529d5407494ac"} Dec 05 12:30:04.725391 master-0 kubenswrapper[4780]: I1205 12:30:04.725003 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75e9032b6429595bd1a6f97d2cb17682f851991bfdb1cc650ef529d5407494ac" Dec 05 12:30:04.741313 master-0 kubenswrapper[4780]: I1205 12:30:04.741264 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-ljsd2" Dec 05 12:30:04.828407 master-0 kubenswrapper[4780]: I1205 12:30:04.828268 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fzzp\" (UniqueName: \"kubernetes.io/projected/b58aa15d-cdea-4a90-ba40-706d6a85735e-kube-api-access-7fzzp\") pod \"b58aa15d-cdea-4a90-ba40-706d6a85735e\" (UID: \"b58aa15d-cdea-4a90-ba40-706d6a85735e\") " Dec 05 12:30:04.833208 master-0 kubenswrapper[4780]: I1205 12:30:04.833132 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b58aa15d-cdea-4a90-ba40-706d6a85735e-kube-api-access-7fzzp" (OuterVolumeSpecName: "kube-api-access-7fzzp") pod "b58aa15d-cdea-4a90-ba40-706d6a85735e" (UID: "b58aa15d-cdea-4a90-ba40-706d6a85735e"). InnerVolumeSpecName "kube-api-access-7fzzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:30:04.928683 master-0 kubenswrapper[4780]: I1205 12:30:04.928625 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fzzp\" (UniqueName: \"kubernetes.io/projected/b58aa15d-cdea-4a90-ba40-706d6a85735e-kube-api-access-7fzzp\") on node \"master-0\" DevicePath \"\"" Dec 05 12:30:05.416532 master-0 kubenswrapper[4780]: I1205 12:30:05.416458 4780 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-12-06 12:20:41 +0000 UTC, rotation deadline is 2025-12-06 08:04:22.48424845 +0000 UTC Dec 05 12:30:05.416532 master-0 kubenswrapper[4780]: I1205 12:30:05.416519 4780 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 19h34m17.067733598s for next certificate rotation Dec 05 12:30:05.730128 master-0 kubenswrapper[4780]: I1205 12:30:05.730048 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-ljsd2" event={"ID":"b58aa15d-cdea-4a90-ba40-706d6a85735e","Type":"ContainerDied","Data":"3564c4f7be418f0fb673bf15e0683bd1a8124446576d69fea4a76db52877c172"} Dec 05 12:30:05.730128 master-0 kubenswrapper[4780]: I1205 12:30:05.730097 4780 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3564c4f7be418f0fb673bf15e0683bd1a8124446576d69fea4a76db52877c172" Dec 05 12:30:05.730674 master-0 kubenswrapper[4780]: I1205 12:30:05.730168 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-ljsd2" Dec 05 12:30:07.523249 master-0 kubenswrapper[4780]: I1205 12:30:07.523171 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-ljsd2"] Dec 05 12:30:07.525710 master-0 kubenswrapper[4780]: I1205 12:30:07.525684 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-ljsd2"] Dec 05 12:30:08.533961 master-0 kubenswrapper[4780]: I1205 12:30:08.533916 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b58aa15d-cdea-4a90-ba40-706d6a85735e" path="/var/lib/kubelet/pods/b58aa15d-cdea-4a90-ba40-706d6a85735e/volumes" Dec 05 12:30:08.546861 master-0 kubenswrapper[4780]: I1205 12:30:08.546827 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"] Dec 05 12:30:08.547209 master-0 kubenswrapper[4780]: I1205 12:30:08.547162 4780 scope.go:117] "RemoveContainer" containerID="1071666e1f17d7cf23c890a67d65947ba5ea19368b6f26e80669ec0f695e375b" Dec 05 12:30:09.745678 master-0 kubenswrapper[4780]: I1205 12:30:09.745125 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/2.log" Dec 05 12:30:09.746791 master-0 kubenswrapper[4780]: I1205 12:30:09.746325 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerStarted","Data":"f9824f2538239be2916d2115cdd6e15355f5d12571e5c02316bdba7857f30ff8"} Dec 05 12:30:10.097403 master-0 kubenswrapper[4780]: I1205 12:30:10.097174 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:30:10.097649 master-0 kubenswrapper[4780]: E1205 12:30:10.097426 4780 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 05 12:30:10.097649 master-0 kubenswrapper[4780]: E1205 12:30:10.097531 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert podName:29812c4b-48ac-488c-863c-1d52e39ea2ae nodeName:}" failed. No retries permitted until 2025-12-05 12:30:26.097502939 +0000 UTC m=+68.120243695 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2chqh" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae") : secret "cluster-version-operator-serving-cert" not found Dec 05 12:30:12.401673 master-0 kubenswrapper[4780]: I1205 12:30:12.401592 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=4.401559626 podStartE2EDuration="4.401559626s" podCreationTimestamp="2025-12-05 12:30:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:30:09.765905646 +0000 UTC m=+51.788646442" watchObservedRunningTime="2025-12-05 12:30:12.401559626 +0000 UTC m=+54.424300352" Dec 05 12:30:12.402569 master-0 kubenswrapper[4780]: I1205 12:30:12.402547 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-5nqhk"] Dec 05 12:30:12.402718 master-0 kubenswrapper[4780]: E1205 12:30:12.402703 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7807b90-1059-4c0d-9224-a0d57a572bfc" containerName="assisted-installer-controller" Dec 05 12:30:12.402791 master-0 kubenswrapper[4780]: I1205 12:30:12.402781 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7807b90-1059-4c0d-9224-a0d57a572bfc" containerName="assisted-installer-controller" Dec 05 12:30:12.402848 master-0 kubenswrapper[4780]: E1205 12:30:12.402838 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b58aa15d-cdea-4a90-ba40-706d6a85735e" containerName="prober" Dec 05 12:30:12.402900 master-0 kubenswrapper[4780]: I1205 12:30:12.402891 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58aa15d-cdea-4a90-ba40-706d6a85735e" containerName="prober" Dec 05 12:30:12.402996 master-0 kubenswrapper[4780]: I1205 12:30:12.402985 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7807b90-1059-4c0d-9224-a0d57a572bfc" containerName="assisted-installer-controller" Dec 05 12:30:12.403053 master-0 kubenswrapper[4780]: I1205 12:30:12.403044 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="b58aa15d-cdea-4a90-ba40-706d6a85735e" containerName="prober" Dec 05 12:30:12.403366 master-0 kubenswrapper[4780]: I1205 12:30:12.403351 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.407065 master-0 kubenswrapper[4780]: I1205 12:30:12.407008 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 12:30:12.407241 master-0 kubenswrapper[4780]: I1205 12:30:12.407096 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 12:30:12.407589 master-0 kubenswrapper[4780]: I1205 12:30:12.407563 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 12:30:12.408019 master-0 kubenswrapper[4780]: I1205 12:30:12.407630 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 12:30:12.516050 master-0 kubenswrapper[4780]: I1205 12:30:12.515955 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-hostroot\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.516545 master-0 kubenswrapper[4780]: I1205 12:30:12.516516 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-conf-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.516714 master-0 kubenswrapper[4780]: I1205 12:30:12.516696 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-system-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.516817 master-0 kubenswrapper[4780]: I1205 12:30:12.516802 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cni-binary-copy\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.516906 master-0 kubenswrapper[4780]: I1205 12:30:12.516892 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cnibin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.517007 master-0 kubenswrapper[4780]: I1205 12:30:12.516990 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-socket-dir-parent\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.517095 master-0 kubenswrapper[4780]: I1205 12:30:12.517081 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-bin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " 
pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.517222 master-0 kubenswrapper[4780]: I1205 12:30:12.517208 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.517325 master-0 kubenswrapper[4780]: I1205 12:30:12.517311 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-etc-kubernetes\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.517409 master-0 kubenswrapper[4780]: I1205 12:30:12.517396 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-daemon-config\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.517510 master-0 kubenswrapper[4780]: I1205 12:30:12.517494 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-k8s-cni-cncf-io\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.517626 master-0 kubenswrapper[4780]: I1205 12:30:12.517604 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-kubelet\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.517734 master-0 kubenswrapper[4780]: I1205 12:30:12.517715 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-netns\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.517955 master-0 kubenswrapper[4780]: I1205 12:30:12.517876 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-multus-certs\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.518024 master-0 kubenswrapper[4780]: I1205 12:30:12.517962 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x59kd\" (UniqueName: \"kubernetes.io/projected/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-kube-api-access-x59kd\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.518024 master-0 kubenswrapper[4780]: I1205 12:30:12.518014 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-multus\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.518106 master-0 kubenswrapper[4780]: I1205 12:30:12.518059 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-os-release\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.584424 master-0 kubenswrapper[4780]: I1205 12:30:12.584379 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-prt97"] Dec 05 12:30:12.585226 master-0 kubenswrapper[4780]: I1205 12:30:12.585208 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.587216 master-0 kubenswrapper[4780]: I1205 12:30:12.587171 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Dec 05 12:30:12.587634 master-0 kubenswrapper[4780]: I1205 12:30:12.587593 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 12:30:12.618786 master-0 kubenswrapper[4780]: I1205 12:30:12.618727 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-netns\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.618786 master-0 kubenswrapper[4780]: I1205 12:30:12.618775 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-multus-certs\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.618786 master-0 kubenswrapper[4780]: I1205 12:30:12.618797 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x59kd\" (UniqueName: \"kubernetes.io/projected/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-kube-api-access-x59kd\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619033 master-0 kubenswrapper[4780]: I1205 12:30:12.618812 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-multus\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619033 master-0 kubenswrapper[4780]: I1205 12:30:12.618829 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-os-release\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619033 master-0 kubenswrapper[4780]: I1205 12:30:12.618848 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-hostroot\") pod 
\"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619033 master-0 kubenswrapper[4780]: I1205 12:30:12.618868 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-system-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619033 master-0 kubenswrapper[4780]: I1205 12:30:12.618891 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cni-binary-copy\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619033 master-0 kubenswrapper[4780]: I1205 12:30:12.618919 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-conf-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619033 master-0 kubenswrapper[4780]: I1205 12:30:12.618953 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619033 master-0 kubenswrapper[4780]: I1205 12:30:12.618976 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cnibin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619033 master-0 kubenswrapper[4780]: I1205 12:30:12.618997 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-socket-dir-parent\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619033 master-0 kubenswrapper[4780]: I1205 12:30:12.619017 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-bin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619033 master-0 kubenswrapper[4780]: I1205 12:30:12.619038 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-etc-kubernetes\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619357 master-0 kubenswrapper[4780]: I1205 12:30:12.619066 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-k8s-cni-cncf-io\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619357 
master-0 kubenswrapper[4780]: I1205 12:30:12.619087 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-kubelet\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619357 master-0 kubenswrapper[4780]: I1205 12:30:12.619112 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-daemon-config\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619449 master-0 kubenswrapper[4780]: I1205 12:30:12.619417 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-etc-kubernetes\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619505 master-0 kubenswrapper[4780]: I1205 12:30:12.619480 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-conf-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619543 master-0 kubenswrapper[4780]: I1205 12:30:12.619480 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619543 master-0 kubenswrapper[4780]: I1205 12:30:12.619524 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cnibin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619607 master-0 kubenswrapper[4780]: I1205 12:30:12.619541 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-bin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619607 master-0 kubenswrapper[4780]: I1205 12:30:12.619421 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-socket-dir-parent\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619607 master-0 kubenswrapper[4780]: I1205 12:30:12.619586 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-hostroot\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619690 master-0 kubenswrapper[4780]: I1205 12:30:12.619627 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-kubelet\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619690 master-0 kubenswrapper[4780]: I1205 12:30:12.619628 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-k8s-cni-cncf-io\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619690 master-0 kubenswrapper[4780]: I1205 12:30:12.619653 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-os-release\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619690 master-0 kubenswrapper[4780]: I1205 12:30:12.619663 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-system-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619791 master-0 kubenswrapper[4780]: I1205 12:30:12.619691 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-multus-certs\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.619791 master-0 kubenswrapper[4780]: I1205 12:30:12.619715 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-netns\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.620017 master-0 kubenswrapper[4780]: I1205 12:30:12.619916 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-multus\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.620094 master-0 kubenswrapper[4780]: I1205 12:30:12.619992 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cni-binary-copy\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.620165 master-0 kubenswrapper[4780]: I1205 12:30:12.620137 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-daemon-config\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.635255 master-0 kubenswrapper[4780]: I1205 12:30:12.635211 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x59kd\" (UniqueName: \"kubernetes.io/projected/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-kube-api-access-x59kd\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " 
pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.720267 master-0 kubenswrapper[4780]: I1205 12:30:12.720216 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-cnibin\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.720520 master-0 kubenswrapper[4780]: I1205 12:30:12.720290 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.720520 master-0 kubenswrapper[4780]: I1205 12:30:12.720321 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-os-release\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.720520 master-0 kubenswrapper[4780]: I1205 12:30:12.720386 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.720520 master-0 kubenswrapper[4780]: I1205 12:30:12.720409 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-whereabouts-configmap\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.720520 master-0 kubenswrapper[4780]: I1205 12:30:12.720432 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vx2z\" (UniqueName: \"kubernetes.io/projected/708bf629-9949-4b79-a88a-c73ba033475b-kube-api-access-6vx2z\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.720520 master-0 kubenswrapper[4780]: I1205 12:30:12.720454 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-system-cni-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.720520 master-0 kubenswrapper[4780]: I1205 12:30:12.720474 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-binary-copy\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " 
pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.727460 master-0 kubenswrapper[4780]: I1205 12:30:12.727418 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5nqhk" Dec 05 12:30:12.737363 master-0 kubenswrapper[4780]: W1205 12:30:12.737299 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf725fa37_ef11_479a_8cf9_f4b90fe5e7a1.slice/crio-7fe4976a702070d88ebc0b91a8c147521b2f0d81e1e2131e752211b96529d448 WatchSource:0}: Error finding container 7fe4976a702070d88ebc0b91a8c147521b2f0d81e1e2131e752211b96529d448: Status 404 returned error can't find the container with id 7fe4976a702070d88ebc0b91a8c147521b2f0d81e1e2131e752211b96529d448 Dec 05 12:30:12.757429 master-0 kubenswrapper[4780]: I1205 12:30:12.757358 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5nqhk" event={"ID":"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1","Type":"ContainerStarted","Data":"7fe4976a702070d88ebc0b91a8c147521b2f0d81e1e2131e752211b96529d448"} Dec 05 12:30:12.821997 master-0 kubenswrapper[4780]: I1205 12:30:12.821884 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-cnibin\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.822358 master-0 kubenswrapper[4780]: I1205 12:30:12.822047 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-cnibin\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.822358 master-0 kubenswrapper[4780]: I1205 12:30:12.822126 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.822358 master-0 kubenswrapper[4780]: I1205 12:30:12.822165 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-os-release\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.822358 master-0 kubenswrapper[4780]: I1205 12:30:12.822237 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.822698 master-0 kubenswrapper[4780]: I1205 12:30:12.822657 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-whereabouts-configmap\") pod \"multus-additional-cni-plugins-prt97\" (UID: 
\"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.822752 master-0 kubenswrapper[4780]: I1205 12:30:12.822700 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vx2z\" (UniqueName: \"kubernetes.io/projected/708bf629-9949-4b79-a88a-c73ba033475b-kube-api-access-6vx2z\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.822795 master-0 kubenswrapper[4780]: I1205 12:30:12.822761 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-system-cni-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.822795 master-0 kubenswrapper[4780]: I1205 12:30:12.822788 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-binary-copy\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.823778 master-0 kubenswrapper[4780]: I1205 12:30:12.823734 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-binary-copy\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.823858 master-0 kubenswrapper[4780]: I1205 12:30:12.823750 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-system-cni-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.823949 master-0 kubenswrapper[4780]: I1205 12:30:12.823906 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-os-release\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.824111 master-0 kubenswrapper[4780]: I1205 12:30:12.824076 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.824258 master-0 kubenswrapper[4780]: I1205 12:30:12.824217 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-whereabouts-configmap\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.824415 master-0 kubenswrapper[4780]: I1205 12:30:12.824238 4780 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.855174 master-0 kubenswrapper[4780]: I1205 12:30:12.855114 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vx2z\" (UniqueName: \"kubernetes.io/projected/708bf629-9949-4b79-a88a-c73ba033475b-kube-api-access-6vx2z\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.902824 master-0 kubenswrapper[4780]: I1205 12:30:12.902750 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:30:12.917098 master-0 kubenswrapper[4780]: W1205 12:30:12.917040 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod708bf629_9949_4b79_a88a_c73ba033475b.slice/crio-b852dfb0ed7374453aa61f11c0df40cc142ce70b6943ce06b264cc249753a13b WatchSource:0}: Error finding container b852dfb0ed7374453aa61f11c0df40cc142ce70b6943ce06b264cc249753a13b: Status 404 returned error can't find the container with id b852dfb0ed7374453aa61f11c0df40cc142ce70b6943ce06b264cc249753a13b Dec 05 12:30:13.379936 master-0 kubenswrapper[4780]: I1205 12:30:13.379882 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-99djw"] Dec 05 12:30:13.380471 master-0 kubenswrapper[4780]: I1205 12:30:13.380435 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:13.380604 master-0 kubenswrapper[4780]: E1205 12:30:13.380581 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:13.529775 master-0 kubenswrapper[4780]: I1205 12:30:13.529700 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69n5s\" (UniqueName: \"kubernetes.io/projected/fb7003a6-4341-49eb-bec3-76ba8610fa12-kube-api-access-69n5s\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:13.529775 master-0 kubenswrapper[4780]: I1205 12:30:13.529772 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:13.630660 master-0 kubenswrapper[4780]: I1205 12:30:13.630519 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:13.630879 master-0 kubenswrapper[4780]: E1205 12:30:13.630749 4780 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 12:30:13.630879 master-0 kubenswrapper[4780]: E1205 12:30:13.630873 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs podName:fb7003a6-4341-49eb-bec3-76ba8610fa12 nodeName:}" failed. No retries permitted until 2025-12-05 12:30:14.130848902 +0000 UTC m=+56.153589628 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs") pod "network-metrics-daemon-99djw" (UID: "fb7003a6-4341-49eb-bec3-76ba8610fa12") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 12:30:13.631238 master-0 kubenswrapper[4780]: I1205 12:30:13.631105 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69n5s\" (UniqueName: \"kubernetes.io/projected/fb7003a6-4341-49eb-bec3-76ba8610fa12-kube-api-access-69n5s\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:13.649827 master-0 kubenswrapper[4780]: I1205 12:30:13.649760 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69n5s\" (UniqueName: \"kubernetes.io/projected/fb7003a6-4341-49eb-bec3-76ba8610fa12-kube-api-access-69n5s\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:13.765772 master-0 kubenswrapper[4780]: I1205 12:30:13.765651 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prt97" event={"ID":"708bf629-9949-4b79-a88a-c73ba033475b","Type":"ContainerStarted","Data":"b852dfb0ed7374453aa61f11c0df40cc142ce70b6943ce06b264cc249753a13b"} Dec 05 12:30:14.134758 master-0 kubenswrapper[4780]: I1205 12:30:14.134662 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:14.135092 master-0 kubenswrapper[4780]: E1205 12:30:14.134834 4780 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 12:30:14.135092 master-0 kubenswrapper[4780]: E1205 12:30:14.134901 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs podName:fb7003a6-4341-49eb-bec3-76ba8610fa12 nodeName:}" failed. No retries permitted until 2025-12-05 12:30:15.134884053 +0000 UTC m=+57.157624789 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs") pod "network-metrics-daemon-99djw" (UID: "fb7003a6-4341-49eb-bec3-76ba8610fa12") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 12:30:15.143654 master-0 kubenswrapper[4780]: I1205 12:30:15.143610 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:15.143966 master-0 kubenswrapper[4780]: E1205 12:30:15.143902 4780 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 12:30:15.144107 master-0 kubenswrapper[4780]: E1205 12:30:15.144078 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs podName:fb7003a6-4341-49eb-bec3-76ba8610fa12 nodeName:}" failed. No retries permitted until 2025-12-05 12:30:17.144040654 +0000 UTC m=+59.166781390 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs") pod "network-metrics-daemon-99djw" (UID: "fb7003a6-4341-49eb-bec3-76ba8610fa12") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 12:30:15.530278 master-0 kubenswrapper[4780]: I1205 12:30:15.530216 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:15.530538 master-0 kubenswrapper[4780]: E1205 12:30:15.530379 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:15.774705 master-0 kubenswrapper[4780]: I1205 12:30:15.774394 4780 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="503a0b99be77d72f51d7afcf8403bc7d040b77fef62f126cd910c2ff4b520892" exitCode=0 Dec 05 12:30:15.774705 master-0 kubenswrapper[4780]: I1205 12:30:15.774487 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prt97" event={"ID":"708bf629-9949-4b79-a88a-c73ba033475b","Type":"ContainerDied","Data":"503a0b99be77d72f51d7afcf8403bc7d040b77fef62f126cd910c2ff4b520892"} Dec 05 12:30:17.162246 master-0 kubenswrapper[4780]: I1205 12:30:17.161290 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:17.162246 master-0 kubenswrapper[4780]: E1205 12:30:17.161469 4780 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 12:30:17.162246 master-0 kubenswrapper[4780]: E1205 12:30:17.161535 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs podName:fb7003a6-4341-49eb-bec3-76ba8610fa12 nodeName:}" failed. No retries permitted until 2025-12-05 12:30:21.161516024 +0000 UTC m=+63.184256750 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs") pod "network-metrics-daemon-99djw" (UID: "fb7003a6-4341-49eb-bec3-76ba8610fa12") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 12:30:17.529792 master-0 kubenswrapper[4780]: I1205 12:30:17.529708 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:17.530143 master-0 kubenswrapper[4780]: E1205 12:30:17.529991 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:19.530133 master-0 kubenswrapper[4780]: I1205 12:30:19.530071 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:19.530823 master-0 kubenswrapper[4780]: E1205 12:30:19.530243 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:21.198226 master-0 kubenswrapper[4780]: I1205 12:30:21.198142 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:21.198828 master-0 kubenswrapper[4780]: E1205 12:30:21.198423 4780 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 12:30:21.198828 master-0 kubenswrapper[4780]: E1205 12:30:21.198563 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs podName:fb7003a6-4341-49eb-bec3-76ba8610fa12 nodeName:}" failed. No retries permitted until 2025-12-05 12:30:29.198528279 +0000 UTC m=+71.221269005 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs") pod "network-metrics-daemon-99djw" (UID: "fb7003a6-4341-49eb-bec3-76ba8610fa12") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 12:30:21.530152 master-0 kubenswrapper[4780]: I1205 12:30:21.529994 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:21.530434 master-0 kubenswrapper[4780]: E1205 12:30:21.530142 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:23.530519 master-0 kubenswrapper[4780]: I1205 12:30:23.530424 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:23.531315 master-0 kubenswrapper[4780]: E1205 12:30:23.530673 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:24.789422 master-0 kubenswrapper[4780]: I1205 12:30:24.789335 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb"] Dec 05 12:30:24.790450 master-0 kubenswrapper[4780]: I1205 12:30:24.789675 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:30:24.793450 master-0 kubenswrapper[4780]: I1205 12:30:24.793017 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 12:30:24.793450 master-0 kubenswrapper[4780]: I1205 12:30:24.793105 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 12:30:24.793450 master-0 kubenswrapper[4780]: I1205 12:30:24.793066 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 12:30:24.793450 master-0 kubenswrapper[4780]: I1205 12:30:24.793111 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 12:30:24.793450 master-0 kubenswrapper[4780]: I1205 12:30:24.793171 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 12:30:24.929616 master-0 kubenswrapper[4780]: I1205 12:30:24.929541 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovnkube-config\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:30:24.929616 master-0 kubenswrapper[4780]: I1205 12:30:24.929610 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-env-overrides\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:30:24.929935 master-0 kubenswrapper[4780]: I1205 12:30:24.929697 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:30:24.929935 master-0 kubenswrapper[4780]: I1205 12:30:24.929726 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z8h9\" (UniqueName: \"kubernetes.io/projected/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-kube-api-access-9z8h9\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:30:25.003114 master-0 kubenswrapper[4780]: I1205 12:30:25.003038 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nrs4v"] Dec 05 12:30:25.003969 master-0 kubenswrapper[4780]: I1205 12:30:25.003939 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.005988 master-0 kubenswrapper[4780]: I1205 12:30:25.005926 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 12:30:25.007334 master-0 kubenswrapper[4780]: I1205 12:30:25.007100 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 12:30:25.030552 master-0 kubenswrapper[4780]: I1205 12:30:25.030168 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovnkube-config\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:30:25.030850 master-0 kubenswrapper[4780]: I1205 12:30:25.030471 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-env-overrides\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:30:25.030850 master-0 kubenswrapper[4780]: I1205 12:30:25.030833 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:30:25.031356 master-0 kubenswrapper[4780]: I1205 12:30:25.031269 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z8h9\" (UniqueName: \"kubernetes.io/projected/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-kube-api-access-9z8h9\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:30:25.031543 master-0 kubenswrapper[4780]: I1205 12:30:25.031492 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovnkube-config\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:30:25.031543 master-0 kubenswrapper[4780]: I1205 12:30:25.031523 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-env-overrides\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:30:25.037006 master-0 kubenswrapper[4780]: I1205 12:30:25.036530 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:30:25.046425 master-0 kubenswrapper[4780]: I1205 12:30:25.046363 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z8h9\" (UniqueName: \"kubernetes.io/projected/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-kube-api-access-9z8h9\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:30:25.112344 master-0 kubenswrapper[4780]: I1205 12:30:25.112269 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:30:25.133797 master-0 kubenswrapper[4780]: I1205 12:30:25.132603 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-openvswitch\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.133797 master-0 kubenswrapper[4780]: I1205 12:30:25.132665 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-ovn\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.133797 master-0 kubenswrapper[4780]: I1205 12:30:25.132698 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-run-ovn-kubernetes\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.133797 master-0 kubenswrapper[4780]: I1205 12:30:25.132725 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-env-overrides\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.133797 master-0 kubenswrapper[4780]: I1205 12:30:25.132748 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-cni-netd\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.133797 master-0 kubenswrapper[4780]: I1205 12:30:25.132767 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.133797 master-0 kubenswrapper[4780]: I1205 12:30:25.132796 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-systemd-units\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.133797 master-0 kubenswrapper[4780]: I1205 12:30:25.132813 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-systemd\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.133797 master-0 kubenswrapper[4780]: I1205 12:30:25.132837 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovn-node-metrics-cert\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.133797 master-0 kubenswrapper[4780]: I1205 12:30:25.132876 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovnkube-script-lib\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.133797 master-0 kubenswrapper[4780]: I1205 12:30:25.132896 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-kubelet\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.133797 master-0 kubenswrapper[4780]: I1205 12:30:25.132915 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-slash\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.133797 master-0 kubenswrapper[4780]: I1205 12:30:25.132933 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-node-log\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.133797 master-0 kubenswrapper[4780]: I1205 12:30:25.132954 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-run-netns\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.133797 master-0 kubenswrapper[4780]: I1205 12:30:25.132971 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-etc-openvswitch\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.133797 master-0 
kubenswrapper[4780]: I1205 12:30:25.132991 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-log-socket\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.134879 master-0 kubenswrapper[4780]: I1205 12:30:25.133007 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovnkube-config\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.134879 master-0 kubenswrapper[4780]: I1205 12:30:25.133027 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk42r\" (UniqueName: \"kubernetes.io/projected/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-kube-api-access-pk42r\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.134879 master-0 kubenswrapper[4780]: I1205 12:30:25.133055 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-var-lib-openvswitch\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.134879 master-0 kubenswrapper[4780]: I1205 12:30:25.133071 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-cni-bin\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.234746 master-0 kubenswrapper[4780]: I1205 12:30:25.234679 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovnkube-script-lib\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.234746 master-0 kubenswrapper[4780]: I1205 12:30:25.234746 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-kubelet\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235095 master-0 kubenswrapper[4780]: I1205 12:30:25.234967 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-slash\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235095 master-0 kubenswrapper[4780]: I1205 12:30:25.235030 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-kubelet\") pod \"ovnkube-node-nrs4v\" (UID: 
\"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235095 master-0 kubenswrapper[4780]: I1205 12:30:25.235048 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-node-log\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235213 master-0 kubenswrapper[4780]: I1205 12:30:25.235096 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-slash\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235213 master-0 kubenswrapper[4780]: I1205 12:30:25.235146 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-etc-openvswitch\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235213 master-0 kubenswrapper[4780]: I1205 12:30:25.235158 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-node-log\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235213 master-0 kubenswrapper[4780]: I1205 12:30:25.235172 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-log-socket\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235337 master-0 kubenswrapper[4780]: I1205 12:30:25.235220 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovnkube-config\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235337 master-0 kubenswrapper[4780]: I1205 12:30:25.235222 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-log-socket\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235414 master-0 kubenswrapper[4780]: I1205 12:30:25.235360 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-etc-openvswitch\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235531 master-0 kubenswrapper[4780]: I1205 12:30:25.235398 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk42r\" (UniqueName: \"kubernetes.io/projected/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-kube-api-access-pk42r\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235575 master-0 kubenswrapper[4780]: I1205 12:30:25.235560 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-run-netns\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235614 master-0 kubenswrapper[4780]: I1205 12:30:25.235588 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-var-lib-openvswitch\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235644 master-0 kubenswrapper[4780]: I1205 12:30:25.235623 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-cni-bin\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235690 master-0 kubenswrapper[4780]: I1205 12:30:25.235675 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-var-lib-openvswitch\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235722 master-0 kubenswrapper[4780]: I1205 12:30:25.235680 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-run-netns\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235722 master-0 kubenswrapper[4780]: I1205 12:30:25.235696 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovnkube-script-lib\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235722 master-0 kubenswrapper[4780]: I1205 12:30:25.235706 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-openvswitch\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235845 master-0 kubenswrapper[4780]: I1205 12:30:25.235743 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-ovn\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.235845 master-0 kubenswrapper[4780]: I1205 12:30:25.235738 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-cni-bin\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.236390 master-0 kubenswrapper[4780]: I1205 12:30:25.236355 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovnkube-config\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.236535 master-0 kubenswrapper[4780]: I1205 12:30:25.236508 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-openvswitch\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.236601 master-0 kubenswrapper[4780]: I1205 12:30:25.236566 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-run-ovn-kubernetes\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.236639 master-0 kubenswrapper[4780]: I1205 12:30:25.236604 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-cni-netd\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.236639 master-0 kubenswrapper[4780]: I1205 12:30:25.236631 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.236724 master-0 kubenswrapper[4780]: I1205 12:30:25.236654 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-env-overrides\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.236724 master-0 kubenswrapper[4780]: I1205 12:30:25.236678 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-systemd-units\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.236724 master-0 kubenswrapper[4780]: I1205 12:30:25.236698 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-systemd\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.236860 master-0 kubenswrapper[4780]: I1205 12:30:25.236735 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovn-node-metrics-cert\") pod 
\"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.237401 master-0 kubenswrapper[4780]: I1205 12:30:25.237345 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-systemd-units\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.237498 master-0 kubenswrapper[4780]: I1205 12:30:25.237398 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.237498 master-0 kubenswrapper[4780]: I1205 12:30:25.237364 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-cni-netd\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.237498 master-0 kubenswrapper[4780]: I1205 12:30:25.237365 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-systemd\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.237498 master-0 kubenswrapper[4780]: I1205 12:30:25.237431 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-run-ovn-kubernetes\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.237498 master-0 kubenswrapper[4780]: I1205 12:30:25.237485 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-ovn\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.237953 master-0 kubenswrapper[4780]: I1205 12:30:25.237858 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-env-overrides\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.241531 master-0 kubenswrapper[4780]: I1205 12:30:25.241476 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovn-node-metrics-cert\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.254320 master-0 kubenswrapper[4780]: I1205 12:30:25.254257 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk42r\" (UniqueName: 
\"kubernetes.io/projected/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-kube-api-access-pk42r\") pod \"ovnkube-node-nrs4v\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.322167 master-0 kubenswrapper[4780]: I1205 12:30:25.322047 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:25.348365 master-0 kubenswrapper[4780]: W1205 12:30:25.348314 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda757f807_e1bf_4f1e_9787_6b4acc8d09cf.slice/crio-a34af96221abd2b9bf387305f2624222004ffa4b53496a2a4e5584e580bd9733 WatchSource:0}: Error finding container a34af96221abd2b9bf387305f2624222004ffa4b53496a2a4e5584e580bd9733: Status 404 returned error can't find the container with id a34af96221abd2b9bf387305f2624222004ffa4b53496a2a4e5584e580bd9733 Dec 05 12:30:25.353289 master-0 kubenswrapper[4780]: W1205 12:30:25.353197 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7342973_8d2b_4e8e_ad46_99d8dd7a6688.slice/crio-a021d8fc4cc2b621fc5a80784b2ce374483ed0ce0f8315b255679472aa810f64 WatchSource:0}: Error finding container a021d8fc4cc2b621fc5a80784b2ce374483ed0ce0f8315b255679472aa810f64: Status 404 returned error can't find the container with id a021d8fc4cc2b621fc5a80784b2ce374483ed0ce0f8315b255679472aa810f64 Dec 05 12:30:25.529688 master-0 kubenswrapper[4780]: I1205 12:30:25.529638 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:25.529901 master-0 kubenswrapper[4780]: E1205 12:30:25.529865 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:25.813968 master-0 kubenswrapper[4780]: I1205 12:30:25.813879 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5nqhk" event={"ID":"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1","Type":"ContainerStarted","Data":"60d32869d5d76c04555375fdfd9ab0f008a07a41f85b96737cde09fadc0deeb4"} Dec 05 12:30:25.818013 master-0 kubenswrapper[4780]: I1205 12:30:25.817931 4780 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="d81a6813a03e38c556e737371d737471f12aa2c77281926715e2cfe7ffc056aa" exitCode=0 Dec 05 12:30:25.818367 master-0 kubenswrapper[4780]: I1205 12:30:25.818316 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prt97" event={"ID":"708bf629-9949-4b79-a88a-c73ba033475b","Type":"ContainerDied","Data":"d81a6813a03e38c556e737371d737471f12aa2c77281926715e2cfe7ffc056aa"} Dec 05 12:30:25.821329 master-0 kubenswrapper[4780]: I1205 12:30:25.821260 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerStarted","Data":"a021d8fc4cc2b621fc5a80784b2ce374483ed0ce0f8315b255679472aa810f64"} Dec 05 12:30:25.823805 master-0 kubenswrapper[4780]: I1205 12:30:25.823755 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" event={"ID":"a757f807-e1bf-4f1e-9787-6b4acc8d09cf","Type":"ContainerStarted","Data":"b1ed868ac971480d433bc214f55b6262c1c9875a557884ba05c4f9ee33a0c3dc"} Dec 05 12:30:25.823805 master-0 kubenswrapper[4780]: I1205 12:30:25.823797 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" event={"ID":"a757f807-e1bf-4f1e-9787-6b4acc8d09cf","Type":"ContainerStarted","Data":"a34af96221abd2b9bf387305f2624222004ffa4b53496a2a4e5584e580bd9733"} Dec 05 12:30:25.852937 master-0 kubenswrapper[4780]: I1205 12:30:25.852815 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5nqhk" podStartSLOduration=1.1648858500000001 podStartE2EDuration="13.852783888s" podCreationTimestamp="2025-12-05 12:30:12 +0000 UTC" firstStartedPulling="2025-12-05 12:30:12.738736922 +0000 UTC m=+54.761477658" lastFinishedPulling="2025-12-05 12:30:25.42663497 +0000 UTC m=+67.449375696" observedRunningTime="2025-12-05 12:30:25.834329072 +0000 UTC m=+67.857069858" watchObservedRunningTime="2025-12-05 12:30:25.852783888 +0000 UTC m=+67.875524654" Dec 05 12:30:26.144615 master-0 kubenswrapper[4780]: I1205 12:30:26.144526 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:30:26.144971 master-0 kubenswrapper[4780]: E1205 12:30:26.144689 4780 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 05 12:30:26.144971 master-0 kubenswrapper[4780]: E1205 12:30:26.144789 4780 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert podName:29812c4b-48ac-488c-863c-1d52e39ea2ae nodeName:}" failed. No retries permitted until 2025-12-05 12:30:58.144766522 +0000 UTC m=+100.167507248 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2chqh" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae") : secret "cluster-version-operator-serving-cert" not found Dec 05 12:30:27.530502 master-0 kubenswrapper[4780]: I1205 12:30:27.530451 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:27.531125 master-0 kubenswrapper[4780]: E1205 12:30:27.530602 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:29.275131 master-0 kubenswrapper[4780]: I1205 12:30:29.274970 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:29.275865 master-0 kubenswrapper[4780]: E1205 12:30:29.275225 4780 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 12:30:29.275865 master-0 kubenswrapper[4780]: E1205 12:30:29.275359 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs podName:fb7003a6-4341-49eb-bec3-76ba8610fa12 nodeName:}" failed. No retries permitted until 2025-12-05 12:30:45.275330925 +0000 UTC m=+87.298071831 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs") pod "network-metrics-daemon-99djw" (UID: "fb7003a6-4341-49eb-bec3-76ba8610fa12") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 12:30:29.530662 master-0 kubenswrapper[4780]: I1205 12:30:29.530449 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:29.530940 master-0 kubenswrapper[4780]: E1205 12:30:29.530662 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:30.726945 master-0 kubenswrapper[4780]: I1205 12:30:30.726855 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-qsggt"] Dec 05 12:30:30.727523 master-0 kubenswrapper[4780]: I1205 12:30:30.727435 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:30.727584 master-0 kubenswrapper[4780]: E1205 12:30:30.727518 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:30:30.889490 master-0 kubenswrapper[4780]: I1205 12:30:30.889379 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69z2l\" (UniqueName: \"kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l\") pod \"network-check-target-qsggt\" (UID: \"f4a70855-80b5-4d6a-bed1-b42364940de0\") " pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:30.990727 master-0 kubenswrapper[4780]: I1205 12:30:30.989894 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69z2l\" (UniqueName: \"kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l\") pod \"network-check-target-qsggt\" (UID: \"f4a70855-80b5-4d6a-bed1-b42364940de0\") " pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:31.530334 master-0 kubenswrapper[4780]: I1205 12:30:31.530242 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:31.530631 master-0 kubenswrapper[4780]: E1205 12:30:31.530527 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:31.612773 master-0 kubenswrapper[4780]: E1205 12:30:31.612351 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 12:30:31.612909 master-0 kubenswrapper[4780]: E1205 12:30:31.612782 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 12:30:31.612909 master-0 kubenswrapper[4780]: E1205 12:30:31.612799 4780 projected.go:194] Error preparing data for projected volume kube-api-access-69z2l for pod openshift-network-diagnostics/network-check-target-qsggt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 12:30:31.612909 master-0 kubenswrapper[4780]: E1205 12:30:31.612879 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l podName:f4a70855-80b5-4d6a-bed1-b42364940de0 nodeName:}" failed. No retries permitted until 2025-12-05 12:30:32.112854845 +0000 UTC m=+74.135595571 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-69z2l" (UniqueName: "kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l") pod "network-check-target-qsggt" (UID: "f4a70855-80b5-4d6a-bed1-b42364940de0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 12:30:31.753982 master-0 kubenswrapper[4780]: I1205 12:30:31.753936 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-xwx26"] Dec 05 12:30:31.755468 master-0 kubenswrapper[4780]: I1205 12:30:31.754415 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:30:31.760024 master-0 kubenswrapper[4780]: I1205 12:30:31.759951 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 12:30:31.760129 master-0 kubenswrapper[4780]: I1205 12:30:31.760030 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 12:30:31.760202 master-0 kubenswrapper[4780]: I1205 12:30:31.760030 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 12:30:31.760202 master-0 kubenswrapper[4780]: I1205 12:30:31.760077 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 12:30:31.760294 master-0 kubenswrapper[4780]: I1205 12:30:31.760073 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 12:30:31.844377 master-0 kubenswrapper[4780]: I1205 12:30:31.844323 4780 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="46d777da61d52678086a53c15e814977a05f1e509e1945fa53a5e65cac047f51" exitCode=0 Dec 05 12:30:31.844377 master-0 kubenswrapper[4780]: I1205 12:30:31.844376 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prt97" event={"ID":"708bf629-9949-4b79-a88a-c73ba033475b","Type":"ContainerDied","Data":"46d777da61d52678086a53c15e814977a05f1e509e1945fa53a5e65cac047f51"} Dec 05 12:30:31.897464 master-0 kubenswrapper[4780]: I1205 12:30:31.897322 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b8233dad-bd19-4842-a4d5-cfa84f1feb83-webhook-cert\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:30:31.897464 master-0 kubenswrapper[4780]: I1205 12:30:31.897376 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-ovnkube-identity-cm\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:30:31.897464 master-0 kubenswrapper[4780]: I1205 12:30:31.897443 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-env-overrides\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:30:31.897464 master-0 kubenswrapper[4780]: I1205 12:30:31.897462 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvbfq\" (UniqueName: \"kubernetes.io/projected/b8233dad-bd19-4842-a4d5-cfa84f1feb83-kube-api-access-mvbfq\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:30:31.998729 master-0 kubenswrapper[4780]: I1205 12:30:31.998668 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvbfq\" (UniqueName: \"kubernetes.io/projected/b8233dad-bd19-4842-a4d5-cfa84f1feb83-kube-api-access-mvbfq\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:30:31.999336 master-0 kubenswrapper[4780]: I1205 12:30:31.998967 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b8233dad-bd19-4842-a4d5-cfa84f1feb83-webhook-cert\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:30:31.999336 master-0 kubenswrapper[4780]: E1205 12:30:31.999203 4780 secret.go:189] Couldn't get secret openshift-network-node-identity/network-node-identity-cert: secret "network-node-identity-cert" not found Dec 05 12:30:31.999336 master-0 kubenswrapper[4780]: I1205 12:30:31.999235 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-ovnkube-identity-cm\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:30:31.999336 master-0 kubenswrapper[4780]: E1205 12:30:31.999297 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8233dad-bd19-4842-a4d5-cfa84f1feb83-webhook-cert podName:b8233dad-bd19-4842-a4d5-cfa84f1feb83 nodeName:}" failed. No retries permitted until 2025-12-05 12:30:32.499262597 +0000 UTC m=+74.522003353 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/b8233dad-bd19-4842-a4d5-cfa84f1feb83-webhook-cert") pod "network-node-identity-xwx26" (UID: "b8233dad-bd19-4842-a4d5-cfa84f1feb83") : secret "network-node-identity-cert" not found Dec 05 12:30:31.999562 master-0 kubenswrapper[4780]: I1205 12:30:31.999487 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-env-overrides\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:30:32.000499 master-0 kubenswrapper[4780]: I1205 12:30:32.000371 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-ovnkube-identity-cm\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:30:32.000752 master-0 kubenswrapper[4780]: I1205 12:30:32.000702 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-env-overrides\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:30:32.020142 master-0 kubenswrapper[4780]: I1205 12:30:32.020102 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvbfq\" (UniqueName: \"kubernetes.io/projected/b8233dad-bd19-4842-a4d5-cfa84f1feb83-kube-api-access-mvbfq\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:30:32.201494 master-0 kubenswrapper[4780]: I1205 12:30:32.201437 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69z2l\" (UniqueName: \"kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l\") pod \"network-check-target-qsggt\" (UID: \"f4a70855-80b5-4d6a-bed1-b42364940de0\") " pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:32.201764 master-0 kubenswrapper[4780]: E1205 12:30:32.201707 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 12:30:32.201809 master-0 kubenswrapper[4780]: E1205 12:30:32.201763 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 12:30:32.201809 master-0 kubenswrapper[4780]: E1205 12:30:32.201784 4780 projected.go:194] Error preparing data for projected volume kube-api-access-69z2l for pod openshift-network-diagnostics/network-check-target-qsggt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 12:30:32.201889 master-0 kubenswrapper[4780]: E1205 12:30:32.201868 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l podName:f4a70855-80b5-4d6a-bed1-b42364940de0 nodeName:}" 
failed. No retries permitted until 2025-12-05 12:30:33.201841448 +0000 UTC m=+75.224582344 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-69z2l" (UniqueName: "kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l") pod "network-check-target-qsggt" (UID: "f4a70855-80b5-4d6a-bed1-b42364940de0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 12:30:32.504522 master-0 kubenswrapper[4780]: I1205 12:30:32.504397 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b8233dad-bd19-4842-a4d5-cfa84f1feb83-webhook-cert\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:30:32.511262 master-0 kubenswrapper[4780]: I1205 12:30:32.511200 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b8233dad-bd19-4842-a4d5-cfa84f1feb83-webhook-cert\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:30:32.529593 master-0 kubenswrapper[4780]: I1205 12:30:32.529563 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:32.529742 master-0 kubenswrapper[4780]: E1205 12:30:32.529709 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:30:32.673662 master-0 kubenswrapper[4780]: I1205 12:30:32.673596 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:30:32.704045 master-0 kubenswrapper[4780]: W1205 12:30:32.703983 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8233dad_bd19_4842_a4d5_cfa84f1feb83.slice/crio-ee402b16b01951f980b833d7daf2d0304b91018363304b2cfe0e79874029cf9d WatchSource:0}: Error finding container ee402b16b01951f980b833d7daf2d0304b91018363304b2cfe0e79874029cf9d: Status 404 returned error can't find the container with id ee402b16b01951f980b833d7daf2d0304b91018363304b2cfe0e79874029cf9d Dec 05 12:30:32.849266 master-0 kubenswrapper[4780]: I1205 12:30:32.849218 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-xwx26" event={"ID":"b8233dad-bd19-4842-a4d5-cfa84f1feb83","Type":"ContainerStarted","Data":"ee402b16b01951f980b833d7daf2d0304b91018363304b2cfe0e79874029cf9d"} Dec 05 12:30:33.212269 master-0 kubenswrapper[4780]: I1205 12:30:33.212165 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69z2l\" (UniqueName: \"kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l\") pod \"network-check-target-qsggt\" (UID: \"f4a70855-80b5-4d6a-bed1-b42364940de0\") " pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:33.212579 master-0 kubenswrapper[4780]: E1205 12:30:33.212424 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 12:30:33.212579 master-0 kubenswrapper[4780]: E1205 12:30:33.212471 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 12:30:33.212579 master-0 kubenswrapper[4780]: E1205 12:30:33.212488 4780 projected.go:194] Error preparing data for projected volume kube-api-access-69z2l for pod openshift-network-diagnostics/network-check-target-qsggt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 12:30:33.212579 master-0 kubenswrapper[4780]: E1205 12:30:33.212562 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l podName:f4a70855-80b5-4d6a-bed1-b42364940de0 nodeName:}" failed. No retries permitted until 2025-12-05 12:30:35.21253787 +0000 UTC m=+77.235278806 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-69z2l" (UniqueName: "kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l") pod "network-check-target-qsggt" (UID: "f4a70855-80b5-4d6a-bed1-b42364940de0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 12:30:33.530229 master-0 kubenswrapper[4780]: I1205 12:30:33.530032 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:33.530472 master-0 kubenswrapper[4780]: E1205 12:30:33.530251 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:33.855838 master-0 kubenswrapper[4780]: I1205 12:30:33.855681 4780 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="678a3e3b29045fc802f2f4ea9939ca067adfe6ff12b24bb2dd5f895390e55a41" exitCode=0 Dec 05 12:30:33.855838 master-0 kubenswrapper[4780]: I1205 12:30:33.855736 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prt97" event={"ID":"708bf629-9949-4b79-a88a-c73ba033475b","Type":"ContainerDied","Data":"678a3e3b29045fc802f2f4ea9939ca067adfe6ff12b24bb2dd5f895390e55a41"} Dec 05 12:30:34.531213 master-0 kubenswrapper[4780]: I1205 12:30:34.530825 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:34.531464 master-0 kubenswrapper[4780]: E1205 12:30:34.531330 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:30:35.229909 master-0 kubenswrapper[4780]: I1205 12:30:35.229682 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69z2l\" (UniqueName: \"kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l\") pod \"network-check-target-qsggt\" (UID: \"f4a70855-80b5-4d6a-bed1-b42364940de0\") " pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:35.229909 master-0 kubenswrapper[4780]: E1205 12:30:35.229872 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 12:30:35.229909 master-0 kubenswrapper[4780]: E1205 12:30:35.229899 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 12:30:35.229909 master-0 kubenswrapper[4780]: E1205 12:30:35.229915 4780 projected.go:194] Error preparing data for projected volume kube-api-access-69z2l for pod openshift-network-diagnostics/network-check-target-qsggt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 12:30:35.230672 master-0 kubenswrapper[4780]: E1205 12:30:35.229977 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l podName:f4a70855-80b5-4d6a-bed1-b42364940de0 nodeName:}" failed. 
No retries permitted until 2025-12-05 12:30:39.22995728 +0000 UTC m=+81.252698006 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-69z2l" (UniqueName: "kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l") pod "network-check-target-qsggt" (UID: "f4a70855-80b5-4d6a-bed1-b42364940de0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 12:30:35.529693 master-0 kubenswrapper[4780]: I1205 12:30:35.529510 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:35.529961 master-0 kubenswrapper[4780]: E1205 12:30:35.529809 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:36.533855 master-0 kubenswrapper[4780]: I1205 12:30:36.533775 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:36.534754 master-0 kubenswrapper[4780]: E1205 12:30:36.534000 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:30:37.530217 master-0 kubenswrapper[4780]: I1205 12:30:37.529623 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:37.530217 master-0 kubenswrapper[4780]: E1205 12:30:37.529792 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:38.530865 master-0 kubenswrapper[4780]: I1205 12:30:38.529788 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:38.530865 master-0 kubenswrapper[4780]: E1205 12:30:38.530431 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:30:39.264696 master-0 kubenswrapper[4780]: I1205 12:30:39.264622 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69z2l\" (UniqueName: \"kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l\") pod \"network-check-target-qsggt\" (UID: \"f4a70855-80b5-4d6a-bed1-b42364940de0\") " pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:39.265048 master-0 kubenswrapper[4780]: E1205 12:30:39.264971 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 12:30:39.265048 master-0 kubenswrapper[4780]: E1205 12:30:39.265041 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 12:30:39.265135 master-0 kubenswrapper[4780]: E1205 12:30:39.265061 4780 projected.go:194] Error preparing data for projected volume kube-api-access-69z2l for pod openshift-network-diagnostics/network-check-target-qsggt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 12:30:39.265203 master-0 kubenswrapper[4780]: E1205 12:30:39.265150 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l podName:f4a70855-80b5-4d6a-bed1-b42364940de0 nodeName:}" failed. No retries permitted until 2025-12-05 12:30:47.265123068 +0000 UTC m=+89.287863984 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-69z2l" (UniqueName: "kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l") pod "network-check-target-qsggt" (UID: "f4a70855-80b5-4d6a-bed1-b42364940de0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 12:30:39.533338 master-0 kubenswrapper[4780]: I1205 12:30:39.533209 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:39.533852 master-0 kubenswrapper[4780]: E1205 12:30:39.533363 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:40.530259 master-0 kubenswrapper[4780]: I1205 12:30:40.530135 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:40.530538 master-0 kubenswrapper[4780]: E1205 12:30:40.530403 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:30:41.530508 master-0 kubenswrapper[4780]: I1205 12:30:41.530432 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:41.531311 master-0 kubenswrapper[4780]: E1205 12:30:41.530601 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:42.530281 master-0 kubenswrapper[4780]: I1205 12:30:42.530224 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:42.530494 master-0 kubenswrapper[4780]: E1205 12:30:42.530422 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:30:42.889258 master-0 kubenswrapper[4780]: I1205 12:30:42.889137 4780 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="d98d05970b7b2ac04c6af16edb9c07e4ea790e687fa82b42828f83752f9655a5" exitCode=0 Dec 05 12:30:42.889258 master-0 kubenswrapper[4780]: I1205 12:30:42.889239 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prt97" event={"ID":"708bf629-9949-4b79-a88a-c73ba033475b","Type":"ContainerDied","Data":"d98d05970b7b2ac04c6af16edb9c07e4ea790e687fa82b42828f83752f9655a5"} Dec 05 12:30:42.892348 master-0 kubenswrapper[4780]: I1205 12:30:42.892292 4780 generic.go:334] "Generic (PLEG): container finished" podID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerID="393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3" exitCode=0 Dec 05 12:30:42.892437 master-0 kubenswrapper[4780]: I1205 12:30:42.892379 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerDied","Data":"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3"} Dec 05 12:30:42.897296 master-0 kubenswrapper[4780]: I1205 12:30:42.897240 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" event={"ID":"a757f807-e1bf-4f1e-9787-6b4acc8d09cf","Type":"ContainerStarted","Data":"c63a8034e23c88dd09173f57e05eee7c9bc26e35890cfdd9f1fdc8ef0e16d843"} Dec 05 12:30:42.900000 master-0 kubenswrapper[4780]: I1205 12:30:42.899937 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-xwx26" event={"ID":"b8233dad-bd19-4842-a4d5-cfa84f1feb83","Type":"ContainerStarted","Data":"41718b57d6d2e36d2cb94e43774b239e600e6619dc10d3c14a0345e610d821c2"} Dec 05 12:30:42.900064 master-0 kubenswrapper[4780]: I1205 12:30:42.900003 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-xwx26" event={"ID":"b8233dad-bd19-4842-a4d5-cfa84f1feb83","Type":"ContainerStarted","Data":"30c5b5c630bd02b5b3e82dbf4596b8a0300a9a9b3ba466ae6fca11dbd31d9aeb"} Dec 05 12:30:42.935656 master-0 kubenswrapper[4780]: I1205 12:30:42.935147 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" podStartSLOduration=2.258462002 podStartE2EDuration="18.935115073s" podCreationTimestamp="2025-12-05 12:30:24 +0000 UTC" firstStartedPulling="2025-12-05 12:30:25.521753798 +0000 UTC m=+67.544494524" lastFinishedPulling="2025-12-05 12:30:42.198406869 +0000 UTC m=+84.221147595" observedRunningTime="2025-12-05 12:30:42.933714726 +0000 UTC m=+84.956455452" watchObservedRunningTime="2025-12-05 12:30:42.935115073 +0000 UTC m=+84.957855839" Dec 05 12:30:43.530953 master-0 kubenswrapper[4780]: I1205 12:30:43.530143 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:43.531452 master-0 kubenswrapper[4780]: E1205 12:30:43.531283 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:43.910680 master-0 kubenswrapper[4780]: I1205 12:30:43.910425 4780 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="48cc412fc0495a9b989b3163afe32a67e585bd82e370a59d4690f30fe1abc9dc" exitCode=0 Dec 05 12:30:43.910680 master-0 kubenswrapper[4780]: I1205 12:30:43.910570 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prt97" event={"ID":"708bf629-9949-4b79-a88a-c73ba033475b","Type":"ContainerDied","Data":"48cc412fc0495a9b989b3163afe32a67e585bd82e370a59d4690f30fe1abc9dc"} Dec 05 12:30:43.919957 master-0 kubenswrapper[4780]: I1205 12:30:43.919872 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerStarted","Data":"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea"} Dec 05 12:30:43.919957 master-0 kubenswrapper[4780]: I1205 12:30:43.919935 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerStarted","Data":"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0"} Dec 05 12:30:43.919957 master-0 kubenswrapper[4780]: I1205 12:30:43.919950 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerStarted","Data":"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112"} Dec 05 12:30:43.919957 master-0 kubenswrapper[4780]: I1205 12:30:43.919962 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerStarted","Data":"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef"} Dec 05 12:30:43.919957 master-0 kubenswrapper[4780]: I1205 
12:30:43.919973 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerStarted","Data":"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1"}
Dec 05 12:30:43.920435 master-0 kubenswrapper[4780]: I1205 12:30:43.919985 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerStarted","Data":"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22"}
Dec 05 12:30:44.531024 master-0 kubenswrapper[4780]: I1205 12:30:44.530470 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt"
Dec 05 12:30:44.531436 master-0 kubenswrapper[4780]: E1205 12:30:44.531147 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0"
Dec 05 12:30:45.188514 master-0 kubenswrapper[4780]: I1205 12:30:45.188216 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-xwx26" podStartSLOduration=4.6707880809999995 podStartE2EDuration="14.188164703s" podCreationTimestamp="2025-12-05 12:30:31 +0000 UTC" firstStartedPulling="2025-12-05 12:30:32.706156407 +0000 UTC m=+74.728897123" lastFinishedPulling="2025-12-05 12:30:42.223533019 +0000 UTC m=+84.246273745" observedRunningTime="2025-12-05 12:30:43.001219979 +0000 UTC m=+85.023960755" watchObservedRunningTime="2025-12-05 12:30:45.188164703 +0000 UTC m=+87.210905429"
Dec 05 12:30:45.317970 master-0 kubenswrapper[4780]: I1205 12:30:45.317896 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw"
Dec 05 12:30:45.318341 master-0 kubenswrapper[4780]: E1205 12:30:45.318088 4780 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 05 12:30:45.318341 master-0 kubenswrapper[4780]: E1205 12:30:45.318209 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs podName:fb7003a6-4341-49eb-bec3-76ba8610fa12 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:17.318152498 +0000 UTC m=+119.340893234 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs") pod "network-metrics-daemon-99djw" (UID: "fb7003a6-4341-49eb-bec3-76ba8610fa12") : object "openshift-multus"/"metrics-daemon-secret" not registered
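
The two errors just above show the usual pattern for a volume that cannot be set up yet: the secret metrics-daemon-secret is reported as "not registered", which is typically seen while the kubelet's secret/configmap cache has not yet picked up that object for the pod (or the object does not exist yet), and the mount operation is parked with a durationBeforeRetry before the next attempt. The delays visible for different volumes in this log (16s, 32s, 1m4s) are characteristic of a per-operation doubling backoff. The sketch below only illustrates that doubling shape; it is not the kubelet's nestedpendingoperations implementation, and the starting delay and cap are assumed values.

```go
package main

import (
	"fmt"
	"time"
)

// nextDelay doubles the retry delay up to a ceiling, the general shape of the
// per-operation backoff behind "No retries permitted until ... (durationBeforeRetry 32s)".
// The 500ms start and the 2m ceiling are illustrative assumptions, not kubelet values.
func nextDelay(prev, ceiling time.Duration) time.Duration {
	if prev == 0 {
		return 500 * time.Millisecond
	}
	d := 2 * prev
	if d > ceiling {
		return ceiling
	}
	return d
}

func main() {
	var d time.Duration
	for attempt := 1; attempt <= 10; attempt++ {
		d = nextDelay(d, 2*time.Minute)
		fmt.Printf("attempt %2d: wait %v\n", attempt, d)
	}
}
```

Because the counter is kept per volume operation, different volumes can show different delays at the same moment, which is why 16s, 32s and 1m4s all appear in this window of the log.
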
Dec 05 12:30:45.514538 master-0 kubenswrapper[4780]: I1205 12:30:45.514468 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Dec 05 12:30:45.514823 master-0 kubenswrapper[4780]: W1205 12:30:45.514662 4780 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
Dec 05 12:30:45.530111 master-0 kubenswrapper[4780]: I1205 12:30:45.530069 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw"
Dec 05 12:30:45.530485 master-0 kubenswrapper[4780]: E1205 12:30:45.530455 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12"
Dec 05 12:30:45.933569 master-0 kubenswrapper[4780]: I1205 12:30:45.933344 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prt97" event={"ID":"708bf629-9949-4b79-a88a-c73ba033475b","Type":"ContainerStarted","Data":"1c3530626c917433ac22bbadd19205f000313560085b5540423d4847a8993705"}
Dec 05 12:30:46.530486 master-0 kubenswrapper[4780]: I1205 12:30:46.530343 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt"
Dec 05 12:30:46.531499 master-0 kubenswrapper[4780]: E1205 12:30:46.530682 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0"
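
The recurring "NetworkReady=false ... no CNI configuration file in /etc/kubernetes/cni/net.d/" errors mean the container runtime has not yet found a CNI network configuration, so pods that need a pod-network sandbox (network-check-target, network-metrics-daemon) are skipped on every sync loop; host-network pods, such as the etcd static pod added above, are not blocked by this. The condition clears once the network plugin on this cluster (Multus/OVN-Kubernetes) writes its configuration into that directory. A rough, simplified stand-in for that readiness condition, not the actual CRI-O/ocicni logic, might look like:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cniConfigPresent approximates the readiness condition reported above: the
// network stays "not ready" while no CNI network configuration (*.conf,
// *.conflist or *.json) exists in the configured directory.
func cniConfigPresent(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ready, err := cniConfigPresent("/etc/kubernetes/cni/net.d")
	fmt.Println("cni config present:", ready, "err:", err)
}
```

On this node the directory is the one named in the error itself, /etc/kubernetes/cni/net.d/, so the check returning true roughly coincides with the point later in the log where these "Error syncing pod" messages stop.
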
pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:30:47.337771 master-0 kubenswrapper[4780]: I1205 12:30:47.337536 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69z2l\" (UniqueName: \"kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l\") pod \"network-check-target-qsggt\" (UID: \"f4a70855-80b5-4d6a-bed1-b42364940de0\") " pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:47.338160 master-0 kubenswrapper[4780]: E1205 12:30:47.337786 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 12:30:47.338160 master-0 kubenswrapper[4780]: E1205 12:30:47.337834 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 12:30:47.338160 master-0 kubenswrapper[4780]: E1205 12:30:47.337857 4780 projected.go:194] Error preparing data for projected volume kube-api-access-69z2l for pod openshift-network-diagnostics/network-check-target-qsggt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 12:30:47.338160 master-0 kubenswrapper[4780]: E1205 12:30:47.337974 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l podName:f4a70855-80b5-4d6a-bed1-b42364940de0 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:03.33793319 +0000 UTC m=+105.360673956 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-69z2l" (UniqueName: "kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l") pod "network-check-target-qsggt" (UID: "f4a70855-80b5-4d6a-bed1-b42364940de0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 12:30:47.530517 master-0 kubenswrapper[4780]: I1205 12:30:47.530399 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:47.531497 master-0 kubenswrapper[4780]: E1205 12:30:47.530573 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:47.950088 master-0 kubenswrapper[4780]: I1205 12:30:47.949953 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerStarted","Data":"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58"} Dec 05 12:30:48.530117 master-0 kubenswrapper[4780]: I1205 12:30:48.530054 4780 util.go:30] "No sandbox for pod can be found. 
Dec 05 12:30:48.530629 master-0 kubenswrapper[4780]: E1205 12:30:48.530603 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0"
Dec 05 12:30:49.530539 master-0 kubenswrapper[4780]: I1205 12:30:49.530210 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw"
Dec 05 12:30:49.530953 master-0 kubenswrapper[4780]: E1205 12:30:49.530586 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12"
Dec 05 12:30:50.308396 master-0 kubenswrapper[4780]: I1205 12:30:50.308283 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" podStartSLOduration=5.308257455 podStartE2EDuration="5.308257455s" podCreationTimestamp="2025-12-05 12:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:30:47.753810156 +0000 UTC m=+89.776550902" watchObservedRunningTime="2025-12-05 12:30:50.308257455 +0000 UTC m=+92.330998191"
Dec 05 12:30:50.308801 master-0 kubenswrapper[4780]: I1205 12:30:50.308451 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-prt97" podStartSLOduration=9.077967769 podStartE2EDuration="38.30844518s" podCreationTimestamp="2025-12-05 12:30:12 +0000 UTC" firstStartedPulling="2025-12-05 12:30:12.919538486 +0000 UTC m=+54.942279212" lastFinishedPulling="2025-12-05 12:30:42.150015907 +0000 UTC m=+84.172756623" observedRunningTime="2025-12-05 12:30:50.308132552 +0000 UTC m=+92.330873348" watchObservedRunningTime="2025-12-05 12:30:50.30844518 +0000 UTC m=+92.331185916"
Dec 05 12:30:50.530984 master-0 kubenswrapper[4780]: I1205 12:30:50.530444 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt"
Dec 05 12:30:50.530984 master-0 kubenswrapper[4780]: E1205 12:30:50.530683 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0"
Dec 05 12:30:51.530148 master-0 kubenswrapper[4780]: I1205 12:30:51.530044 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:51.530148 master-0 kubenswrapper[4780]: E1205 12:30:51.530163 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:52.530157 master-0 kubenswrapper[4780]: I1205 12:30:52.529687 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:52.531320 master-0 kubenswrapper[4780]: E1205 12:30:52.530376 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:30:52.974332 master-0 kubenswrapper[4780]: I1205 12:30:52.974239 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerStarted","Data":"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45"} Dec 05 12:30:53.530604 master-0 kubenswrapper[4780]: I1205 12:30:53.530490 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:53.531563 master-0 kubenswrapper[4780]: E1205 12:30:53.530663 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:53.899279 master-0 kubenswrapper[4780]: I1205 12:30:53.899106 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Dec 05 12:30:53.900115 master-0 kubenswrapper[4780]: I1205 12:30:53.900051 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Dec 05 12:30:53.977881 master-0 kubenswrapper[4780]: I1205 12:30:53.977633 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:53.978184 master-0 kubenswrapper[4780]: I1205 12:30:53.978105 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:54.003350 master-0 kubenswrapper[4780]: I1205 12:30:54.003303 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:54.099750 master-0 kubenswrapper[4780]: I1205 12:30:54.099356 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" podStartSLOduration=13.236459727 podStartE2EDuration="30.099332821s" podCreationTimestamp="2025-12-05 12:30:24 +0000 UTC" firstStartedPulling="2025-12-05 12:30:25.356444584 +0000 UTC m=+67.379185330" lastFinishedPulling="2025-12-05 12:30:42.219317698 +0000 UTC m=+84.242058424" observedRunningTime="2025-12-05 12:30:54.097936724 +0000 UTC m=+96.120677460" watchObservedRunningTime="2025-12-05 12:30:54.099332821 +0000 UTC m=+96.122073547" Dec 05 12:30:54.119894 master-0 kubenswrapper[4780]: I1205 12:30:54.119795 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=1.119772167 podStartE2EDuration="1.119772167s" podCreationTimestamp="2025-12-05 12:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:30:54.119228063 +0000 UTC m=+96.141968799" watchObservedRunningTime="2025-12-05 12:30:54.119772167 +0000 UTC m=+96.142512913" Dec 05 12:30:54.129714 master-0 kubenswrapper[4780]: I1205 12:30:54.129499 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=1.129473962 podStartE2EDuration="1.129473962s" podCreationTimestamp="2025-12-05 12:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:30:54.129079572 +0000 UTC m=+96.151820288" watchObservedRunningTime="2025-12-05 12:30:54.129473962 +0000 UTC m=+96.152214688" Dec 05 12:30:54.530683 master-0 kubenswrapper[4780]: I1205 12:30:54.530604 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:54.531426 master-0 kubenswrapper[4780]: E1205 12:30:54.530812 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:30:54.980701 master-0 kubenswrapper[4780]: I1205 12:30:54.980619 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:55.008398 master-0 kubenswrapper[4780]: I1205 12:30:55.008336 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:30:55.445792 master-0 kubenswrapper[4780]: I1205 12:30:55.445730 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qsggt"] Dec 05 12:30:55.446041 master-0 kubenswrapper[4780]: I1205 12:30:55.445853 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:55.446041 master-0 kubenswrapper[4780]: E1205 12:30:55.445959 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:30:55.449495 master-0 kubenswrapper[4780]: I1205 12:30:55.449449 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-99djw"] Dec 05 12:30:55.449619 master-0 kubenswrapper[4780]: I1205 12:30:55.449549 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:55.449653 master-0 kubenswrapper[4780]: E1205 12:30:55.449631 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:57.530556 master-0 kubenswrapper[4780]: I1205 12:30:57.530467 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:57.531129 master-0 kubenswrapper[4780]: I1205 12:30:57.530479 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:57.531129 master-0 kubenswrapper[4780]: E1205 12:30:57.530650 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:30:57.531129 master-0 kubenswrapper[4780]: E1205 12:30:57.530688 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:30:58.147607 master-0 kubenswrapper[4780]: I1205 12:30:58.147556 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:30:58.147901 master-0 kubenswrapper[4780]: E1205 12:30:58.147781 4780 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 05 12:30:58.148020 master-0 kubenswrapper[4780]: E1205 12:30:58.148009 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert podName:29812c4b-48ac-488c-863c-1d52e39ea2ae nodeName:}" failed. No retries permitted until 2025-12-05 12:32:02.147989663 +0000 UTC m=+164.170730389 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2chqh" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae") : secret "cluster-version-operator-serving-cert" not found Dec 05 12:30:59.530747 master-0 kubenswrapper[4780]: I1205 12:30:59.530634 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:30:59.530747 master-0 kubenswrapper[4780]: I1205 12:30:59.530700 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:30:59.531857 master-0 kubenswrapper[4780]: E1205 12:30:59.530822 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:30:59.531857 master-0 kubenswrapper[4780]: E1205 12:30:59.530964 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:31:01.529712 master-0 kubenswrapper[4780]: I1205 12:31:01.529619 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:01.530360 master-0 kubenswrapper[4780]: I1205 12:31:01.529630 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw"
Dec 05 12:31:01.530360 master-0 kubenswrapper[4780]: E1205 12:31:01.529802 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0"
Dec 05 12:31:01.530360 master-0 kubenswrapper[4780]: E1205 12:31:01.529924 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12"
Dec 05 12:31:01.947844 master-0 kubenswrapper[4780]: I1205 12:31:01.947784 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nrs4v"]
Dec 05 12:31:01.948298 master-0 kubenswrapper[4780]: I1205 12:31:01.948262 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="ovn-controller" containerID="cri-o://c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22" gracePeriod=30
Dec 05 12:31:01.948841 master-0 kubenswrapper[4780]: I1205 12:31:01.948738 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="kube-rbac-proxy-node" containerID="cri-o://c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef" gracePeriod=30
Dec 05 12:31:01.948841 master-0 kubenswrapper[4780]: I1205 12:31:01.948739 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112" gracePeriod=30
Dec 05 12:31:01.948841 master-0 kubenswrapper[4780]: I1205 12:31:01.948770 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="nbdb" containerID="cri-o://bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea" gracePeriod=30
Dec 05 12:31:01.948958 master-0 kubenswrapper[4780]: I1205 12:31:01.948841 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="northd" containerID="cri-o://ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0" gracePeriod=30
Dec 05 12:31:01.948958 master-0 kubenswrapper[4780]: I1205 12:31:01.948785 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="ovn-acl-logging" containerID="cri-o://e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1" gracePeriod=30
Dec 05 12:31:01.949008 master-0 kubenswrapper[4780]: I1205 12:31:01.948801 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="sbdb" containerID="cri-o://4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58" gracePeriod=30
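
The "SyncLoop DELETE" above starts a graceful shutdown of the ovnkube-node-nrs4v pod: each container is asked to stop with gracePeriod=30, the pod's termination grace period in seconds, and a container is only force-killed if it is still running when that period expires (the last container, ovnkube-controller, follows just below, after which the pod's volumes are unmounted and the replacement DaemonSet pod ovnkube-node-9vqtb is added). The same TERM-then-KILL pattern for an ordinary child process looks roughly like this; it only illustrates the semantics and is not CRI-O's or the kubelet's implementation:

```go
package main

import (
	"os/exec"
	"syscall"
	"time"
)

// stopWithGrace mirrors the idea behind "Killing container with a grace period":
// ask the process to stop (SIGTERM), wait up to the grace period, then force-kill.
func stopWithGrace(cmd *exec.Cmd, grace time.Duration) error {
	_ = cmd.Process.Signal(syscall.SIGTERM)
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	select {
	case err := <-done:
		return err // exited on its own within the grace period
	case <-time.After(grace):
		_ = cmd.Process.Kill() // escalate, as the runtime would after 30s here
		return <-done
	}
}

func main() {
	cmd := exec.Command("sleep", "300")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	_ = stopWithGrace(cmd, 5*time.Second) // 5s grace for the demo; the pod above uses 30s
}
```
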
Dec 05 12:31:01.968974 master-0 kubenswrapper[4780]: I1205 12:31:01.968825 4780 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="ovnkube-controller" containerID="cri-o://7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45" gracePeriod=30
Dec 05 12:31:01.970966 master-0 kubenswrapper[4780]: I1205 12:31:01.970737 4780 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="ovnkube-controller" probeResult="failure" output=""
Dec 05 12:31:02.719109 master-0 kubenswrapper[4780]: I1205 12:31:02.719051 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nrs4v_f7342973-8d2b-4e8e-ad46-99d8dd7a6688/ovnkube-controller/0.log"
Dec 05 12:31:02.721149 master-0 kubenswrapper[4780]: I1205 12:31:02.721129 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nrs4v_f7342973-8d2b-4e8e-ad46-99d8dd7a6688/kube-rbac-proxy-ovn-metrics/0.log"
Dec 05 12:31:02.721766 master-0 kubenswrapper[4780]: I1205 12:31:02.721750 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nrs4v_f7342973-8d2b-4e8e-ad46-99d8dd7a6688/kube-rbac-proxy-node/0.log"
Dec 05 12:31:02.722413 master-0 kubenswrapper[4780]: I1205 12:31:02.722372 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nrs4v_f7342973-8d2b-4e8e-ad46-99d8dd7a6688/ovn-acl-logging/0.log"
Dec 05 12:31:02.723000 master-0 kubenswrapper[4780]: I1205 12:31:02.722959 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nrs4v_f7342973-8d2b-4e8e-ad46-99d8dd7a6688/ovn-controller/0.log"
Dec 05 12:31:02.723435 master-0 kubenswrapper[4780]: I1205 12:31:02.723419 4780 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:31:02.793326 master-0 kubenswrapper[4780]: I1205 12:31:02.793280 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-systemd\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.793326 master-0 kubenswrapper[4780]: I1205 12:31:02.793328 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.793594 master-0 kubenswrapper[4780]: I1205 12:31:02.793359 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-log-socket\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.793594 master-0 kubenswrapper[4780]: I1205 12:31:02.793387 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovn-node-metrics-cert\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.793594 master-0 kubenswrapper[4780]: I1205 12:31:02.793408 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovnkube-config\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.793594 master-0 kubenswrapper[4780]: I1205 12:31:02.793428 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-cni-bin\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.793594 master-0 kubenswrapper[4780]: I1205 12:31:02.793449 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-var-lib-openvswitch\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.793594 master-0 kubenswrapper[4780]: I1205 12:31:02.793468 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-node-log\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.793594 master-0 kubenswrapper[4780]: I1205 12:31:02.793488 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-etc-openvswitch\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.793594 master-0 kubenswrapper[4780]: I1205 12:31:02.793508 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-kubelet\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.793594 master-0 kubenswrapper[4780]: I1205 12:31:02.793524 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-openvswitch\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.793594 master-0 kubenswrapper[4780]: I1205 12:31:02.793542 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-env-overrides\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.793594 master-0 kubenswrapper[4780]: I1205 12:31:02.793526 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:31:02.793594 master-0 kubenswrapper[4780]: I1205 12:31:02.793581 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-log-socket" (OuterVolumeSpecName: "log-socket") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:31:02.794160 master-0 kubenswrapper[4780]: I1205 12:31:02.793615 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:31:02.794160 master-0 kubenswrapper[4780]: I1205 12:31:02.793560 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-run-ovn-kubernetes\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.794160 master-0 kubenswrapper[4780]: I1205 12:31:02.793665 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:31:02.794160 master-0 kubenswrapper[4780]: I1205 12:31:02.793689 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:31:02.794160 master-0 kubenswrapper[4780]: I1205 12:31:02.793705 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:31:02.794160 master-0 kubenswrapper[4780]: I1205 12:31:02.793700 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-slash\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.794160 master-0 kubenswrapper[4780]: I1205 12:31:02.793735 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-slash" (OuterVolumeSpecName: "host-slash") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:31:02.794160 master-0 kubenswrapper[4780]: I1205 12:31:02.793747 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:31:02.794160 master-0 kubenswrapper[4780]: I1205 12:31:02.793960 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-ovn\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.794160 master-0 kubenswrapper[4780]: I1205 12:31:02.793972 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:31:02.794160 master-0 kubenswrapper[4780]: I1205 12:31:02.793983 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-node-log" (OuterVolumeSpecName: "node-log") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:31:02.794160 master-0 kubenswrapper[4780]: I1205 12:31:02.793992 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-systemd-units\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.794160 master-0 kubenswrapper[4780]: I1205 12:31:02.794014 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:31:02.794160 master-0 kubenswrapper[4780]: I1205 12:31:02.794023 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:31:02.794160 master-0 kubenswrapper[4780]: I1205 12:31:02.794036 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-cni-netd\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.794160 master-0 kubenswrapper[4780]: I1205 12:31:02.794058 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794064 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-run-netns\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794089 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794107 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovnkube-script-lib\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794145 4780 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk42r\" (UniqueName: \"kubernetes.io/projected/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-kube-api-access-pk42r\") pod \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\" (UID: \"f7342973-8d2b-4e8e-ad46-99d8dd7a6688\") " Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794287 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794369 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794439 4780 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-node-log\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794462 4780 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-etc-openvswitch\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794479 4780 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-kubelet\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794491 4780 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-openvswitch\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794503 4780 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-env-overrides\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794516 4780 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794530 4780 reconciler_common.go:293] "Volume detached 
for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-slash\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794541 4780 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-ovn\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794552 4780 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-systemd-units\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794565 4780 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-cni-netd\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794576 4780 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-run-netns\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794588 4780 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794603 4780 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-log-socket\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794616 4780 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovnkube-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794628 4780 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-host-cni-bin\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.794631 master-0 kubenswrapper[4780]: I1205 12:31:02.794641 4780 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.795147 master-0 kubenswrapper[4780]: I1205 12:31:02.794703 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:31:02.800670 master-0 kubenswrapper[4780]: I1205 12:31:02.800616 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:31:02.801266 master-0 kubenswrapper[4780]: I1205 12:31:02.801151 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-kube-api-access-pk42r" (OuterVolumeSpecName: "kube-api-access-pk42r") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "kube-api-access-pk42r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:31:02.802732 master-0 kubenswrapper[4780]: I1205 12:31:02.802656 4780 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f7342973-8d2b-4e8e-ad46-99d8dd7a6688" (UID: "f7342973-8d2b-4e8e-ad46-99d8dd7a6688"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:31:02.868229 master-0 kubenswrapper[4780]: I1205 12:31:02.868115 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9vqtb"] Dec 05 12:31:02.868499 master-0 kubenswrapper[4780]: E1205 12:31:02.868382 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="nbdb" Dec 05 12:31:02.868499 master-0 kubenswrapper[4780]: I1205 12:31:02.868403 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="nbdb" Dec 05 12:31:02.868499 master-0 kubenswrapper[4780]: E1205 12:31:02.868414 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="kubecfg-setup" Dec 05 12:31:02.868499 master-0 kubenswrapper[4780]: I1205 12:31:02.868423 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="kubecfg-setup" Dec 05 12:31:02.868499 master-0 kubenswrapper[4780]: E1205 12:31:02.868437 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="ovn-acl-logging" Dec 05 12:31:02.868499 master-0 kubenswrapper[4780]: I1205 12:31:02.868452 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="ovn-acl-logging" Dec 05 12:31:02.868499 master-0 kubenswrapper[4780]: E1205 12:31:02.868462 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 12:31:02.868499 master-0 kubenswrapper[4780]: I1205 12:31:02.868470 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 12:31:02.868499 master-0 kubenswrapper[4780]: E1205 12:31:02.868479 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="ovnkube-controller" Dec 05 12:31:02.868499 
master-0 kubenswrapper[4780]: I1205 12:31:02.868486 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="ovnkube-controller" Dec 05 12:31:02.868499 master-0 kubenswrapper[4780]: E1205 12:31:02.868499 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="sbdb" Dec 05 12:31:02.868499 master-0 kubenswrapper[4780]: I1205 12:31:02.868507 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="sbdb" Dec 05 12:31:02.868796 master-0 kubenswrapper[4780]: E1205 12:31:02.868516 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="northd" Dec 05 12:31:02.868796 master-0 kubenswrapper[4780]: I1205 12:31:02.868523 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="northd" Dec 05 12:31:02.868796 master-0 kubenswrapper[4780]: E1205 12:31:02.868530 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="ovn-controller" Dec 05 12:31:02.868796 master-0 kubenswrapper[4780]: I1205 12:31:02.868538 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="ovn-controller" Dec 05 12:31:02.868796 master-0 kubenswrapper[4780]: E1205 12:31:02.868545 4780 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="kube-rbac-proxy-node" Dec 05 12:31:02.868796 master-0 kubenswrapper[4780]: I1205 12:31:02.868552 4780 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="kube-rbac-proxy-node" Dec 05 12:31:02.868796 master-0 kubenswrapper[4780]: I1205 12:31:02.868642 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="sbdb" Dec 05 12:31:02.868796 master-0 kubenswrapper[4780]: I1205 12:31:02.868658 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="kube-rbac-proxy-node" Dec 05 12:31:02.868796 master-0 kubenswrapper[4780]: I1205 12:31:02.868666 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="nbdb" Dec 05 12:31:02.868796 master-0 kubenswrapper[4780]: I1205 12:31:02.868674 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="ovn-acl-logging" Dec 05 12:31:02.868796 master-0 kubenswrapper[4780]: I1205 12:31:02.868689 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="ovnkube-controller" Dec 05 12:31:02.868796 master-0 kubenswrapper[4780]: I1205 12:31:02.868697 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="ovn-controller" Dec 05 12:31:02.868796 master-0 kubenswrapper[4780]: I1205 12:31:02.868705 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 12:31:02.868796 master-0 kubenswrapper[4780]: I1205 12:31:02.868712 4780 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerName="northd" Dec 05 
12:31:02.870530 master-0 kubenswrapper[4780]: I1205 12:31:02.870475 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.895461 master-0 kubenswrapper[4780]: I1205 12:31:02.895383 4780 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.895461 master-0 kubenswrapper[4780]: I1205 12:31:02.895436 4780 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk42r\" (UniqueName: \"kubernetes.io/projected/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-kube-api-access-pk42r\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.895461 master-0 kubenswrapper[4780]: I1205 12:31:02.895453 4780 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-run-systemd\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.895461 master-0 kubenswrapper[4780]: I1205 12:31:02.895466 4780 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f7342973-8d2b-4e8e-ad46-99d8dd7a6688-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:02.995921 master-0 kubenswrapper[4780]: I1205 12:31:02.995708 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmq98\" (UniqueName: \"kubernetes.io/projected/4492c55f-701b-4ec8-ada1-0a5dc126d405-kube-api-access-dmq98\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.995921 master-0 kubenswrapper[4780]: I1205 12:31:02.995778 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-systemd-units\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.995921 master-0 kubenswrapper[4780]: I1205 12:31:02.995862 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-log-socket\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.995921 master-0 kubenswrapper[4780]: I1205 12:31:02.995884 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.995921 master-0 kubenswrapper[4780]: I1205 12:31:02.995902 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-script-lib\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.996613 master-0 kubenswrapper[4780]: I1205 12:31:02.995975 4780 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.996613 master-0 kubenswrapper[4780]: I1205 12:31:02.996037 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-node-log\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.996613 master-0 kubenswrapper[4780]: I1205 12:31:02.996067 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-config\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.996613 master-0 kubenswrapper[4780]: I1205 12:31:02.996147 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-kubelet\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.996613 master-0 kubenswrapper[4780]: I1205 12:31:02.996261 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-slash\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.996613 master-0 kubenswrapper[4780]: I1205 12:31:02.996314 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-systemd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.996613 master-0 kubenswrapper[4780]: I1205 12:31:02.996394 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-netd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.996613 master-0 kubenswrapper[4780]: I1205 12:31:02.996464 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-env-overrides\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.996613 master-0 kubenswrapper[4780]: I1205 12:31:02.996487 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovn-node-metrics-cert\") pod \"ovnkube-node-9vqtb\" (UID: 
\"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.996613 master-0 kubenswrapper[4780]: I1205 12:31:02.996540 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-ovn\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.997261 master-0 kubenswrapper[4780]: I1205 12:31:02.996684 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-netns\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.997261 master-0 kubenswrapper[4780]: I1205 12:31:02.996776 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.997261 master-0 kubenswrapper[4780]: I1205 12:31:02.996811 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-etc-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.997261 master-0 kubenswrapper[4780]: I1205 12:31:02.996832 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-bin\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:02.997261 master-0 kubenswrapper[4780]: I1205 12:31:02.996876 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-var-lib-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.009152 master-0 kubenswrapper[4780]: I1205 12:31:03.009069 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nrs4v_f7342973-8d2b-4e8e-ad46-99d8dd7a6688/ovnkube-controller/0.log" Dec 05 12:31:03.011456 master-0 kubenswrapper[4780]: I1205 12:31:03.011410 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nrs4v_f7342973-8d2b-4e8e-ad46-99d8dd7a6688/kube-rbac-proxy-ovn-metrics/0.log" Dec 05 12:31:03.012084 master-0 kubenswrapper[4780]: I1205 12:31:03.012040 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nrs4v_f7342973-8d2b-4e8e-ad46-99d8dd7a6688/kube-rbac-proxy-node/0.log" Dec 05 12:31:03.012994 master-0 kubenswrapper[4780]: I1205 12:31:03.012943 4780 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nrs4v_f7342973-8d2b-4e8e-ad46-99d8dd7a6688/ovn-acl-logging/0.log" Dec 05 12:31:03.013760 master-0 kubenswrapper[4780]: I1205 12:31:03.013715 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nrs4v_f7342973-8d2b-4e8e-ad46-99d8dd7a6688/ovn-controller/0.log" Dec 05 12:31:03.014246 master-0 kubenswrapper[4780]: I1205 12:31:03.014167 4780 generic.go:334] "Generic (PLEG): container finished" podID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerID="7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45" exitCode=1 Dec 05 12:31:03.014246 master-0 kubenswrapper[4780]: I1205 12:31:03.014230 4780 generic.go:334] "Generic (PLEG): container finished" podID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerID="4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58" exitCode=0 Dec 05 12:31:03.014246 master-0 kubenswrapper[4780]: I1205 12:31:03.014242 4780 generic.go:334] "Generic (PLEG): container finished" podID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerID="bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea" exitCode=0 Dec 05 12:31:03.014246 master-0 kubenswrapper[4780]: I1205 12:31:03.014252 4780 generic.go:334] "Generic (PLEG): container finished" podID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerID="ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0" exitCode=0 Dec 05 12:31:03.014498 master-0 kubenswrapper[4780]: I1205 12:31:03.014236 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerDied","Data":"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45"} Dec 05 12:31:03.014498 master-0 kubenswrapper[4780]: I1205 12:31:03.014266 4780 generic.go:334] "Generic (PLEG): container finished" podID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerID="281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112" exitCode=143 Dec 05 12:31:03.014498 master-0 kubenswrapper[4780]: I1205 12:31:03.014348 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerDied","Data":"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58"} Dec 05 12:31:03.014498 master-0 kubenswrapper[4780]: I1205 12:31:03.014384 4780 generic.go:334] "Generic (PLEG): container finished" podID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerID="c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef" exitCode=143 Dec 05 12:31:03.014498 master-0 kubenswrapper[4780]: I1205 12:31:03.014414 4780 scope.go:117] "RemoveContainer" containerID="7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45" Dec 05 12:31:03.014498 master-0 kubenswrapper[4780]: I1205 12:31:03.014417 4780 generic.go:334] "Generic (PLEG): container finished" podID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerID="e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1" exitCode=143 Dec 05 12:31:03.014498 master-0 kubenswrapper[4780]: I1205 12:31:03.014437 4780 generic.go:334] "Generic (PLEG): container finished" podID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" containerID="c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22" exitCode=143 Dec 05 12:31:03.014498 master-0 kubenswrapper[4780]: I1205 12:31:03.014397 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" 
event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerDied","Data":"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea"} Dec 05 12:31:03.014498 master-0 kubenswrapper[4780]: I1205 12:31:03.014483 4780 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014546 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerDied","Data":"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014597 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerDied","Data":"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014641 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerDied","Data":"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014669 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014734 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014750 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014771 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerDied","Data":"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014795 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014814 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014829 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014844 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: 
I1205 12:31:03.014858 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014873 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014887 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014902 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014916 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014938 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerDied","Data":"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014961 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014979 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.014995 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.015014 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0"} Dec 05 12:31:03.014990 master-0 kubenswrapper[4780]: I1205 12:31:03.015031 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112"} Dec 05 12:31:03.016263 master-0 kubenswrapper[4780]: I1205 12:31:03.015048 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef"} Dec 05 12:31:03.016263 master-0 kubenswrapper[4780]: I1205 12:31:03.015064 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1"} Dec 05 12:31:03.016263 master-0 kubenswrapper[4780]: I1205 12:31:03.015080 4780 pod_container_deletor.go:114] "Failed to issue the request to 
remove container" containerID={"Type":"cri-o","ID":"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22"} Dec 05 12:31:03.016263 master-0 kubenswrapper[4780]: I1205 12:31:03.015094 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3"} Dec 05 12:31:03.016263 master-0 kubenswrapper[4780]: I1205 12:31:03.015114 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nrs4v" event={"ID":"f7342973-8d2b-4e8e-ad46-99d8dd7a6688","Type":"ContainerDied","Data":"a021d8fc4cc2b621fc5a80784b2ce374483ed0ce0f8315b255679472aa810f64"} Dec 05 12:31:03.016263 master-0 kubenswrapper[4780]: I1205 12:31:03.015135 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45"} Dec 05 12:31:03.016263 master-0 kubenswrapper[4780]: I1205 12:31:03.015151 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58"} Dec 05 12:31:03.016263 master-0 kubenswrapper[4780]: I1205 12:31:03.015165 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea"} Dec 05 12:31:03.016263 master-0 kubenswrapper[4780]: I1205 12:31:03.015212 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0"} Dec 05 12:31:03.016263 master-0 kubenswrapper[4780]: I1205 12:31:03.015230 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112"} Dec 05 12:31:03.016263 master-0 kubenswrapper[4780]: I1205 12:31:03.015245 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef"} Dec 05 12:31:03.016263 master-0 kubenswrapper[4780]: I1205 12:31:03.015258 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1"} Dec 05 12:31:03.016263 master-0 kubenswrapper[4780]: I1205 12:31:03.015274 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22"} Dec 05 12:31:03.016263 master-0 kubenswrapper[4780]: I1205 12:31:03.015288 4780 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3"} Dec 05 12:31:03.033560 master-0 kubenswrapper[4780]: I1205 12:31:03.033502 4780 scope.go:117] "RemoveContainer" containerID="4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58" Dec 05 12:31:03.050391 master-0 kubenswrapper[4780]: I1205 12:31:03.050334 4780 scope.go:117] "RemoveContainer" containerID="bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea" Dec 05 12:31:03.067723 master-0 kubenswrapper[4780]: I1205 12:31:03.067678 4780 scope.go:117] "RemoveContainer" 
containerID="ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0" Dec 05 12:31:03.081423 master-0 kubenswrapper[4780]: I1205 12:31:03.081336 4780 scope.go:117] "RemoveContainer" containerID="281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112" Dec 05 12:31:03.094136 master-0 kubenswrapper[4780]: I1205 12:31:03.094064 4780 scope.go:117] "RemoveContainer" containerID="c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef" Dec 05 12:31:03.097375 master-0 kubenswrapper[4780]: I1205 12:31:03.097313 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmq98\" (UniqueName: \"kubernetes.io/projected/4492c55f-701b-4ec8-ada1-0a5dc126d405-kube-api-access-dmq98\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.097474 master-0 kubenswrapper[4780]: I1205 12:31:03.097395 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-script-lib\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.097646 master-0 kubenswrapper[4780]: I1205 12:31:03.097614 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-systemd-units\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.097708 master-0 kubenswrapper[4780]: I1205 12:31:03.097665 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-log-socket\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.097708 master-0 kubenswrapper[4780]: I1205 12:31:03.097690 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.097785 master-0 kubenswrapper[4780]: I1205 12:31:03.097715 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.097847 master-0 kubenswrapper[4780]: I1205 12:31:03.097803 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-systemd-units\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098058 master-0 kubenswrapper[4780]: I1205 12:31:03.097987 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-node-log\") pod \"ovnkube-node-9vqtb\" (UID: 
\"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098058 master-0 kubenswrapper[4780]: I1205 12:31:03.098043 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098217 master-0 kubenswrapper[4780]: I1205 12:31:03.098103 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-log-socket\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098217 master-0 kubenswrapper[4780]: I1205 12:31:03.098115 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-config\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098217 master-0 kubenswrapper[4780]: I1205 12:31:03.098154 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-kubelet\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098217 master-0 kubenswrapper[4780]: I1205 12:31:03.098190 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-node-log\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098397 master-0 kubenswrapper[4780]: I1205 12:31:03.098238 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-kubelet\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098397 master-0 kubenswrapper[4780]: I1205 12:31:03.098159 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098397 master-0 kubenswrapper[4780]: I1205 12:31:03.098261 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-slash\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098397 master-0 kubenswrapper[4780]: I1205 12:31:03.098296 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-systemd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098397 master-0 kubenswrapper[4780]: I1205 12:31:03.098305 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-script-lib\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098397 master-0 kubenswrapper[4780]: I1205 12:31:03.098315 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-slash\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098397 master-0 kubenswrapper[4780]: I1205 12:31:03.098347 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-netd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098397 master-0 kubenswrapper[4780]: I1205 12:31:03.098379 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-systemd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098397 master-0 kubenswrapper[4780]: I1205 12:31:03.098409 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-env-overrides\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098906 master-0 kubenswrapper[4780]: I1205 12:31:03.098430 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-netd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098906 master-0 kubenswrapper[4780]: I1205 12:31:03.098443 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovn-node-metrics-cert\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098906 master-0 kubenswrapper[4780]: I1205 12:31:03.098501 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-ovn\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098906 master-0 kubenswrapper[4780]: I1205 12:31:03.098540 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-netns\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098906 master-0 kubenswrapper[4780]: I1205 12:31:03.098585 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098906 master-0 kubenswrapper[4780]: I1205 12:31:03.098615 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-ovn\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098906 master-0 kubenswrapper[4780]: I1205 12:31:03.098626 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-etc-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098906 master-0 kubenswrapper[4780]: I1205 12:31:03.098664 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-netns\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098906 master-0 kubenswrapper[4780]: I1205 12:31:03.098725 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-etc-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098906 master-0 kubenswrapper[4780]: I1205 12:31:03.098783 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-bin\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098906 master-0 kubenswrapper[4780]: I1205 12:31:03.098806 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098906 master-0 kubenswrapper[4780]: I1205 12:31:03.098819 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-var-lib-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098906 master-0 kubenswrapper[4780]: I1205 12:31:03.098825 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-config\") pod 
\"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098906 master-0 kubenswrapper[4780]: I1205 12:31:03.098866 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-bin\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098906 master-0 kubenswrapper[4780]: I1205 12:31:03.098893 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-var-lib-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.098906 master-0 kubenswrapper[4780]: I1205 12:31:03.098917 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-env-overrides\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.101554 master-0 kubenswrapper[4780]: I1205 12:31:03.101495 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovn-node-metrics-cert\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.107012 master-0 kubenswrapper[4780]: I1205 12:31:03.106952 4780 scope.go:117] "RemoveContainer" containerID="e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1" Dec 05 12:31:03.120981 master-0 kubenswrapper[4780]: I1205 12:31:03.120811 4780 scope.go:117] "RemoveContainer" containerID="c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22" Dec 05 12:31:03.132675 master-0 kubenswrapper[4780]: I1205 12:31:03.132530 4780 scope.go:117] "RemoveContainer" containerID="393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3" Dec 05 12:31:03.143505 master-0 kubenswrapper[4780]: I1205 12:31:03.143468 4780 scope.go:117] "RemoveContainer" containerID="7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45" Dec 05 12:31:03.144036 master-0 kubenswrapper[4780]: E1205 12:31:03.144002 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45\": container with ID starting with 7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45 not found: ID does not exist" containerID="7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45" Dec 05 12:31:03.144114 master-0 kubenswrapper[4780]: I1205 12:31:03.144045 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45"} err="failed to get container status \"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45\": rpc error: code = NotFound desc = could not find container \"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45\": container with ID starting with 7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45 not found: ID does not exist" Dec 05 
12:31:03.144114 master-0 kubenswrapper[4780]: I1205 12:31:03.144072 4780 scope.go:117] "RemoveContainer" containerID="4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58" Dec 05 12:31:03.144471 master-0 kubenswrapper[4780]: E1205 12:31:03.144432 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58\": container with ID starting with 4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58 not found: ID does not exist" containerID="4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58" Dec 05 12:31:03.144544 master-0 kubenswrapper[4780]: I1205 12:31:03.144486 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58"} err="failed to get container status \"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58\": rpc error: code = NotFound desc = could not find container \"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58\": container with ID starting with 4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58 not found: ID does not exist" Dec 05 12:31:03.144544 master-0 kubenswrapper[4780]: I1205 12:31:03.144520 4780 scope.go:117] "RemoveContainer" containerID="bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea" Dec 05 12:31:03.144843 master-0 kubenswrapper[4780]: E1205 12:31:03.144817 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea\": container with ID starting with bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea not found: ID does not exist" containerID="bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea" Dec 05 12:31:03.145034 master-0 kubenswrapper[4780]: I1205 12:31:03.144846 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea"} err="failed to get container status \"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea\": rpc error: code = NotFound desc = could not find container \"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea\": container with ID starting with bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea not found: ID does not exist" Dec 05 12:31:03.145034 master-0 kubenswrapper[4780]: I1205 12:31:03.144863 4780 scope.go:117] "RemoveContainer" containerID="ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0" Dec 05 12:31:03.145141 master-0 kubenswrapper[4780]: E1205 12:31:03.145116 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0\": container with ID starting with ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0 not found: ID does not exist" containerID="ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0" Dec 05 12:31:03.145217 master-0 kubenswrapper[4780]: I1205 12:31:03.145148 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0"} err="failed to get container status 
\"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0\": rpc error: code = NotFound desc = could not find container \"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0\": container with ID starting with ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0 not found: ID does not exist" Dec 05 12:31:03.145217 master-0 kubenswrapper[4780]: I1205 12:31:03.145165 4780 scope.go:117] "RemoveContainer" containerID="281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112" Dec 05 12:31:03.145484 master-0 kubenswrapper[4780]: E1205 12:31:03.145460 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112\": container with ID starting with 281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112 not found: ID does not exist" containerID="281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112" Dec 05 12:31:03.145662 master-0 kubenswrapper[4780]: I1205 12:31:03.145482 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112"} err="failed to get container status \"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112\": rpc error: code = NotFound desc = could not find container \"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112\": container with ID starting with 281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112 not found: ID does not exist" Dec 05 12:31:03.145662 master-0 kubenswrapper[4780]: I1205 12:31:03.145497 4780 scope.go:117] "RemoveContainer" containerID="c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef" Dec 05 12:31:03.145829 master-0 kubenswrapper[4780]: E1205 12:31:03.145810 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef\": container with ID starting with c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef not found: ID does not exist" containerID="c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef" Dec 05 12:31:03.145883 master-0 kubenswrapper[4780]: I1205 12:31:03.145829 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef"} err="failed to get container status \"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef\": rpc error: code = NotFound desc = could not find container \"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef\": container with ID starting with c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef not found: ID does not exist" Dec 05 12:31:03.145883 master-0 kubenswrapper[4780]: I1205 12:31:03.145842 4780 scope.go:117] "RemoveContainer" containerID="e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1" Dec 05 12:31:03.146085 master-0 kubenswrapper[4780]: E1205 12:31:03.146064 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1\": container with ID starting with e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1 not found: ID does not exist" containerID="e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1" Dec 05 
12:31:03.146149 master-0 kubenswrapper[4780]: I1205 12:31:03.146083 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1"} err="failed to get container status \"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1\": rpc error: code = NotFound desc = could not find container \"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1\": container with ID starting with e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1 not found: ID does not exist" Dec 05 12:31:03.146149 master-0 kubenswrapper[4780]: I1205 12:31:03.146097 4780 scope.go:117] "RemoveContainer" containerID="c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22" Dec 05 12:31:03.146448 master-0 kubenswrapper[4780]: E1205 12:31:03.146409 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22\": container with ID starting with c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22 not found: ID does not exist" containerID="c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22" Dec 05 12:31:03.146503 master-0 kubenswrapper[4780]: I1205 12:31:03.146465 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22"} err="failed to get container status \"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22\": rpc error: code = NotFound desc = could not find container \"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22\": container with ID starting with c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22 not found: ID does not exist" Dec 05 12:31:03.146545 master-0 kubenswrapper[4780]: I1205 12:31:03.146513 4780 scope.go:117] "RemoveContainer" containerID="393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3" Dec 05 12:31:03.146881 master-0 kubenswrapper[4780]: E1205 12:31:03.146856 4780 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3\": container with ID starting with 393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3 not found: ID does not exist" containerID="393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3" Dec 05 12:31:03.146926 master-0 kubenswrapper[4780]: I1205 12:31:03.146886 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3"} err="failed to get container status \"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3\": rpc error: code = NotFound desc = could not find container \"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3\": container with ID starting with 393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3 not found: ID does not exist" Dec 05 12:31:03.146926 master-0 kubenswrapper[4780]: I1205 12:31:03.146906 4780 scope.go:117] "RemoveContainer" containerID="7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45" Dec 05 12:31:03.147155 master-0 kubenswrapper[4780]: I1205 12:31:03.147133 4780 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45"} err="failed to get container status \"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45\": rpc error: code = NotFound desc = could not find container \"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45\": container with ID starting with 7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45 not found: ID does not exist" Dec 05 12:31:03.147155 master-0 kubenswrapper[4780]: I1205 12:31:03.147154 4780 scope.go:117] "RemoveContainer" containerID="4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58" Dec 05 12:31:03.147390 master-0 kubenswrapper[4780]: I1205 12:31:03.147365 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58"} err="failed to get container status \"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58\": rpc error: code = NotFound desc = could not find container \"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58\": container with ID starting with 4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58 not found: ID does not exist" Dec 05 12:31:03.147429 master-0 kubenswrapper[4780]: I1205 12:31:03.147390 4780 scope.go:117] "RemoveContainer" containerID="bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea" Dec 05 12:31:03.147627 master-0 kubenswrapper[4780]: I1205 12:31:03.147583 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea"} err="failed to get container status \"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea\": rpc error: code = NotFound desc = could not find container \"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea\": container with ID starting with bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea not found: ID does not exist" Dec 05 12:31:03.147674 master-0 kubenswrapper[4780]: I1205 12:31:03.147632 4780 scope.go:117] "RemoveContainer" containerID="ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0" Dec 05 12:31:03.147880 master-0 kubenswrapper[4780]: I1205 12:31:03.147858 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0"} err="failed to get container status \"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0\": rpc error: code = NotFound desc = could not find container \"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0\": container with ID starting with ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0 not found: ID does not exist" Dec 05 12:31:03.147880 master-0 kubenswrapper[4780]: I1205 12:31:03.147877 4780 scope.go:117] "RemoveContainer" containerID="281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112" Dec 05 12:31:03.148099 master-0 kubenswrapper[4780]: I1205 12:31:03.148077 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112"} err="failed to get container status \"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112\": rpc error: code = NotFound desc = could not find container \"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112\": container with ID starting with 
281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112 not found: ID does not exist" Dec 05 12:31:03.148099 master-0 kubenswrapper[4780]: I1205 12:31:03.148095 4780 scope.go:117] "RemoveContainer" containerID="c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef" Dec 05 12:31:03.148361 master-0 kubenswrapper[4780]: I1205 12:31:03.148340 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef"} err="failed to get container status \"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef\": rpc error: code = NotFound desc = could not find container \"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef\": container with ID starting with c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef not found: ID does not exist" Dec 05 12:31:03.148361 master-0 kubenswrapper[4780]: I1205 12:31:03.148358 4780 scope.go:117] "RemoveContainer" containerID="e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1" Dec 05 12:31:03.148604 master-0 kubenswrapper[4780]: I1205 12:31:03.148562 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1"} err="failed to get container status \"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1\": rpc error: code = NotFound desc = could not find container \"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1\": container with ID starting with e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1 not found: ID does not exist" Dec 05 12:31:03.148604 master-0 kubenswrapper[4780]: I1205 12:31:03.148586 4780 scope.go:117] "RemoveContainer" containerID="c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22" Dec 05 12:31:03.148842 master-0 kubenswrapper[4780]: I1205 12:31:03.148821 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22"} err="failed to get container status \"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22\": rpc error: code = NotFound desc = could not find container \"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22\": container with ID starting with c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22 not found: ID does not exist" Dec 05 12:31:03.148842 master-0 kubenswrapper[4780]: I1205 12:31:03.148839 4780 scope.go:117] "RemoveContainer" containerID="393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3" Dec 05 12:31:03.149066 master-0 kubenswrapper[4780]: I1205 12:31:03.149044 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3"} err="failed to get container status \"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3\": rpc error: code = NotFound desc = could not find container \"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3\": container with ID starting with 393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3 not found: ID does not exist" Dec 05 12:31:03.149066 master-0 kubenswrapper[4780]: I1205 12:31:03.149061 4780 scope.go:117] "RemoveContainer" containerID="7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45" Dec 05 12:31:03.149311 master-0 kubenswrapper[4780]: I1205 12:31:03.149287 4780 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45"} err="failed to get container status \"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45\": rpc error: code = NotFound desc = could not find container \"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45\": container with ID starting with 7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45 not found: ID does not exist" Dec 05 12:31:03.149311 master-0 kubenswrapper[4780]: I1205 12:31:03.149308 4780 scope.go:117] "RemoveContainer" containerID="4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58" Dec 05 12:31:03.149517 master-0 kubenswrapper[4780]: I1205 12:31:03.149494 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58"} err="failed to get container status \"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58\": rpc error: code = NotFound desc = could not find container \"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58\": container with ID starting with 4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58 not found: ID does not exist" Dec 05 12:31:03.149517 master-0 kubenswrapper[4780]: I1205 12:31:03.149514 4780 scope.go:117] "RemoveContainer" containerID="bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea" Dec 05 12:31:03.149762 master-0 kubenswrapper[4780]: I1205 12:31:03.149740 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea"} err="failed to get container status \"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea\": rpc error: code = NotFound desc = could not find container \"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea\": container with ID starting with bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea not found: ID does not exist" Dec 05 12:31:03.149762 master-0 kubenswrapper[4780]: I1205 12:31:03.149758 4780 scope.go:117] "RemoveContainer" containerID="ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0" Dec 05 12:31:03.149989 master-0 kubenswrapper[4780]: I1205 12:31:03.149968 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0"} err="failed to get container status \"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0\": rpc error: code = NotFound desc = could not find container \"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0\": container with ID starting with ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0 not found: ID does not exist" Dec 05 12:31:03.149989 master-0 kubenswrapper[4780]: I1205 12:31:03.149985 4780 scope.go:117] "RemoveContainer" containerID="281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112" Dec 05 12:31:03.150252 master-0 kubenswrapper[4780]: I1205 12:31:03.150229 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112"} err="failed to get container status \"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112\": rpc error: code = NotFound desc = could not find container 
\"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112\": container with ID starting with 281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112 not found: ID does not exist" Dec 05 12:31:03.150252 master-0 kubenswrapper[4780]: I1205 12:31:03.150249 4780 scope.go:117] "RemoveContainer" containerID="c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef" Dec 05 12:31:03.150455 master-0 kubenswrapper[4780]: I1205 12:31:03.150433 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef"} err="failed to get container status \"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef\": rpc error: code = NotFound desc = could not find container \"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef\": container with ID starting with c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef not found: ID does not exist" Dec 05 12:31:03.150455 master-0 kubenswrapper[4780]: I1205 12:31:03.150452 4780 scope.go:117] "RemoveContainer" containerID="e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1" Dec 05 12:31:03.150668 master-0 kubenswrapper[4780]: I1205 12:31:03.150634 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1"} err="failed to get container status \"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1\": rpc error: code = NotFound desc = could not find container \"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1\": container with ID starting with e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1 not found: ID does not exist" Dec 05 12:31:03.150668 master-0 kubenswrapper[4780]: I1205 12:31:03.150651 4780 scope.go:117] "RemoveContainer" containerID="c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22" Dec 05 12:31:03.150881 master-0 kubenswrapper[4780]: I1205 12:31:03.150859 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22"} err="failed to get container status \"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22\": rpc error: code = NotFound desc = could not find container \"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22\": container with ID starting with c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22 not found: ID does not exist" Dec 05 12:31:03.150881 master-0 kubenswrapper[4780]: I1205 12:31:03.150878 4780 scope.go:117] "RemoveContainer" containerID="393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3" Dec 05 12:31:03.151106 master-0 kubenswrapper[4780]: I1205 12:31:03.151084 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3"} err="failed to get container status \"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3\": rpc error: code = NotFound desc = could not find container \"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3\": container with ID starting with 393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3 not found: ID does not exist" Dec 05 12:31:03.151106 master-0 kubenswrapper[4780]: I1205 12:31:03.151103 4780 scope.go:117] "RemoveContainer" 
containerID="7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45" Dec 05 12:31:03.151358 master-0 kubenswrapper[4780]: I1205 12:31:03.151320 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45"} err="failed to get container status \"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45\": rpc error: code = NotFound desc = could not find container \"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45\": container with ID starting with 7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45 not found: ID does not exist" Dec 05 12:31:03.151358 master-0 kubenswrapper[4780]: I1205 12:31:03.151339 4780 scope.go:117] "RemoveContainer" containerID="4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58" Dec 05 12:31:03.152195 master-0 kubenswrapper[4780]: I1205 12:31:03.152121 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58"} err="failed to get container status \"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58\": rpc error: code = NotFound desc = could not find container \"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58\": container with ID starting with 4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58 not found: ID does not exist" Dec 05 12:31:03.152195 master-0 kubenswrapper[4780]: I1205 12:31:03.152145 4780 scope.go:117] "RemoveContainer" containerID="bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea" Dec 05 12:31:03.152453 master-0 kubenswrapper[4780]: I1205 12:31:03.152412 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea"} err="failed to get container status \"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea\": rpc error: code = NotFound desc = could not find container \"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea\": container with ID starting with bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea not found: ID does not exist" Dec 05 12:31:03.152528 master-0 kubenswrapper[4780]: I1205 12:31:03.152455 4780 scope.go:117] "RemoveContainer" containerID="ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0" Dec 05 12:31:03.152800 master-0 kubenswrapper[4780]: I1205 12:31:03.152770 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0"} err="failed to get container status \"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0\": rpc error: code = NotFound desc = could not find container \"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0\": container with ID starting with ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0 not found: ID does not exist" Dec 05 12:31:03.152800 master-0 kubenswrapper[4780]: I1205 12:31:03.152794 4780 scope.go:117] "RemoveContainer" containerID="281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112" Dec 05 12:31:03.153073 master-0 kubenswrapper[4780]: I1205 12:31:03.153035 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112"} err="failed to get container status 
\"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112\": rpc error: code = NotFound desc = could not find container \"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112\": container with ID starting with 281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112 not found: ID does not exist" Dec 05 12:31:03.153148 master-0 kubenswrapper[4780]: I1205 12:31:03.153071 4780 scope.go:117] "RemoveContainer" containerID="c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef" Dec 05 12:31:03.153552 master-0 kubenswrapper[4780]: I1205 12:31:03.153503 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef"} err="failed to get container status \"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef\": rpc error: code = NotFound desc = could not find container \"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef\": container with ID starting with c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef not found: ID does not exist" Dec 05 12:31:03.153552 master-0 kubenswrapper[4780]: I1205 12:31:03.153531 4780 scope.go:117] "RemoveContainer" containerID="e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1" Dec 05 12:31:03.153819 master-0 kubenswrapper[4780]: I1205 12:31:03.153790 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1"} err="failed to get container status \"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1\": rpc error: code = NotFound desc = could not find container \"e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1\": container with ID starting with e568b08fb702b432ab846c3ddaec81d63fd64f0495c4136504cb17bad69d4cd1 not found: ID does not exist" Dec 05 12:31:03.153819 master-0 kubenswrapper[4780]: I1205 12:31:03.153815 4780 scope.go:117] "RemoveContainer" containerID="c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22" Dec 05 12:31:03.154053 master-0 kubenswrapper[4780]: I1205 12:31:03.154014 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22"} err="failed to get container status \"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22\": rpc error: code = NotFound desc = could not find container \"c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22\": container with ID starting with c7199ccfad9f084e67ea4844d18e7de85114c4dd48cd2f5d8fb951d53b675e22 not found: ID does not exist" Dec 05 12:31:03.154053 master-0 kubenswrapper[4780]: I1205 12:31:03.154046 4780 scope.go:117] "RemoveContainer" containerID="393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3" Dec 05 12:31:03.154333 master-0 kubenswrapper[4780]: I1205 12:31:03.154306 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3"} err="failed to get container status \"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3\": rpc error: code = NotFound desc = could not find container \"393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3\": container with ID starting with 393ba08e3d52a7538579c97d697ed69698b0a4f3f29e8e23b2f6dfbf8f337ec3 not found: ID does not exist" Dec 05 12:31:03.154333 master-0 
kubenswrapper[4780]: I1205 12:31:03.154328 4780 scope.go:117] "RemoveContainer" containerID="7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45" Dec 05 12:31:03.154582 master-0 kubenswrapper[4780]: I1205 12:31:03.154553 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45"} err="failed to get container status \"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45\": rpc error: code = NotFound desc = could not find container \"7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45\": container with ID starting with 7be3a6b33a555d78ba4dc10fc0f2195ed569f7d7c56f4306b788f06bc02d3c45 not found: ID does not exist" Dec 05 12:31:03.154582 master-0 kubenswrapper[4780]: I1205 12:31:03.154580 4780 scope.go:117] "RemoveContainer" containerID="4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58" Dec 05 12:31:03.154816 master-0 kubenswrapper[4780]: I1205 12:31:03.154789 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58"} err="failed to get container status \"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58\": rpc error: code = NotFound desc = could not find container \"4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58\": container with ID starting with 4b6d7e9f7ca28330e1c03d1ae01f171df5fcd004ec6dcc4233c5092bd2b8ae58 not found: ID does not exist" Dec 05 12:31:03.154816 master-0 kubenswrapper[4780]: I1205 12:31:03.154811 4780 scope.go:117] "RemoveContainer" containerID="bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea" Dec 05 12:31:03.155035 master-0 kubenswrapper[4780]: I1205 12:31:03.155008 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea"} err="failed to get container status \"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea\": rpc error: code = NotFound desc = could not find container \"bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea\": container with ID starting with bbe36a0b701ab5b1c2cace09752413273c01f8805ca9427f32162d59546c92ea not found: ID does not exist" Dec 05 12:31:03.155035 master-0 kubenswrapper[4780]: I1205 12:31:03.155027 4780 scope.go:117] "RemoveContainer" containerID="ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0" Dec 05 12:31:03.155358 master-0 kubenswrapper[4780]: I1205 12:31:03.155333 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0"} err="failed to get container status \"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0\": rpc error: code = NotFound desc = could not find container \"ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0\": container with ID starting with ee931ed7542ac8b79783cfcec3720f258b3b86b2f969e986fbe44de8e406afa0 not found: ID does not exist" Dec 05 12:31:03.155358 master-0 kubenswrapper[4780]: I1205 12:31:03.155353 4780 scope.go:117] "RemoveContainer" containerID="281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112" Dec 05 12:31:03.155648 master-0 kubenswrapper[4780]: I1205 12:31:03.155602 4780 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112"} err="failed to get container status \"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112\": rpc error: code = NotFound desc = could not find container \"281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112\": container with ID starting with 281a10f54bb695724905670829e7656b05bfe8f9a60c9ba3aa2bdf0c868af112 not found: ID does not exist" Dec 05 12:31:03.155648 master-0 kubenswrapper[4780]: I1205 12:31:03.155635 4780 scope.go:117] "RemoveContainer" containerID="c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef" Dec 05 12:31:03.155920 master-0 kubenswrapper[4780]: I1205 12:31:03.155890 4780 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef"} err="failed to get container status \"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef\": rpc error: code = NotFound desc = could not find container \"c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef\": container with ID starting with c92ae3bc07b930d05845653cc4a1d495c18eb92a42951aa2d9e2cdad9ae45bef not found: ID does not exist" Dec 05 12:31:03.401557 master-0 kubenswrapper[4780]: I1205 12:31:03.401373 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69z2l\" (UniqueName: \"kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l\") pod \"network-check-target-qsggt\" (UID: \"f4a70855-80b5-4d6a-bed1-b42364940de0\") " pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:03.401816 master-0 kubenswrapper[4780]: E1205 12:31:03.401632 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 12:31:03.401816 master-0 kubenswrapper[4780]: E1205 12:31:03.401663 4780 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 12:31:03.401816 master-0 kubenswrapper[4780]: E1205 12:31:03.401686 4780 projected.go:194] Error preparing data for projected volume kube-api-access-69z2l for pod openshift-network-diagnostics/network-check-target-qsggt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 12:31:03.401816 master-0 kubenswrapper[4780]: E1205 12:31:03.401766 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l podName:f4a70855-80b5-4d6a-bed1-b42364940de0 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:35.401738085 +0000 UTC m=+137.424478841 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-69z2l" (UniqueName: "kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l") pod "network-check-target-qsggt" (UID: "f4a70855-80b5-4d6a-bed1-b42364940de0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 12:31:03.425305 master-0 kubenswrapper[4780]: I1205 12:31:03.425235 4780 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nrs4v"] Dec 05 12:31:03.437789 master-0 kubenswrapper[4780]: I1205 12:31:03.437693 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmq98\" (UniqueName: \"kubernetes.io/projected/4492c55f-701b-4ec8-ada1-0a5dc126d405-kube-api-access-dmq98\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.463500 master-0 kubenswrapper[4780]: I1205 12:31:03.463411 4780 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nrs4v"] Dec 05 12:31:03.484430 master-0 kubenswrapper[4780]: I1205 12:31:03.484223 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:03.497997 master-0 kubenswrapper[4780]: W1205 12:31:03.497907 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4492c55f_701b_4ec8_ada1_0a5dc126d405.slice/crio-ed1704f4a6522faa5c439c3ffd85686d7bb1d3595d2d60cd653dbed071367134 WatchSource:0}: Error finding container ed1704f4a6522faa5c439c3ffd85686d7bb1d3595d2d60cd653dbed071367134: Status 404 returned error can't find the container with id ed1704f4a6522faa5c439c3ffd85686d7bb1d3595d2d60cd653dbed071367134 Dec 05 12:31:03.529718 master-0 kubenswrapper[4780]: I1205 12:31:03.529642 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:03.529718 master-0 kubenswrapper[4780]: I1205 12:31:03.529695 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:03.529929 master-0 kubenswrapper[4780]: E1205 12:31:03.529871 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:31:03.529989 master-0 kubenswrapper[4780]: E1205 12:31:03.529932 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:31:04.019536 master-0 kubenswrapper[4780]: I1205 12:31:04.019327 4780 generic.go:334] "Generic (PLEG): container finished" podID="4492c55f-701b-4ec8-ada1-0a5dc126d405" containerID="1e1ba9d3a2cd6fc3c76c6b40cc81f5a9fa8707214a43505b547185529870eae9" exitCode=0 Dec 05 12:31:04.019536 master-0 kubenswrapper[4780]: I1205 12:31:04.019435 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerDied","Data":"1e1ba9d3a2cd6fc3c76c6b40cc81f5a9fa8707214a43505b547185529870eae9"} Dec 05 12:31:04.020771 master-0 kubenswrapper[4780]: I1205 12:31:04.019613 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"ed1704f4a6522faa5c439c3ffd85686d7bb1d3595d2d60cd653dbed071367134"} Dec 05 12:31:04.535668 master-0 kubenswrapper[4780]: I1205 12:31:04.535261 4780 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7342973-8d2b-4e8e-ad46-99d8dd7a6688" path="/var/lib/kubelet/pods/f7342973-8d2b-4e8e-ad46-99d8dd7a6688/volumes" Dec 05 12:31:05.027905 master-0 kubenswrapper[4780]: I1205 12:31:05.027830 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"5c460027d91dbde84ccd52b0f0ca6fe1dc4b67dcc928471783b6fff7ae2ff897"} Dec 05 12:31:05.529995 master-0 kubenswrapper[4780]: I1205 12:31:05.529823 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:05.529995 master-0 kubenswrapper[4780]: I1205 12:31:05.529840 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:05.530261 master-0 kubenswrapper[4780]: E1205 12:31:05.530068 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:31:05.530293 master-0 kubenswrapper[4780]: E1205 12:31:05.530257 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:31:06.036267 master-0 kubenswrapper[4780]: I1205 12:31:06.035779 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"870198accbd17689e843fe910fb099fc9b006845184f99c6f27de26a4f229484"} Dec 05 12:31:06.036267 master-0 kubenswrapper[4780]: I1205 12:31:06.035844 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"add76824970c5035363490e23193d9c3951f4e6416f932bfbf753686a3f1c73d"} Dec 05 12:31:06.036267 master-0 kubenswrapper[4780]: I1205 12:31:06.035861 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"76860118877eff4c9fad8d4e5b56521e5266564bc07f30738a05fcf57a7826c0"} Dec 05 12:31:06.036267 master-0 kubenswrapper[4780]: I1205 12:31:06.035874 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"884ee3f24f9e08747275b422db9832356d773976d0d14992046903c0c0db05d6"} Dec 05 12:31:06.036267 master-0 kubenswrapper[4780]: I1205 12:31:06.035888 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"0ad0842396c73a410e419040cdd43cae0eebe3ffda0c30e321c4b5837d83dd29"} Dec 05 12:31:07.529745 master-0 kubenswrapper[4780]: I1205 12:31:07.529681 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:07.530588 master-0 kubenswrapper[4780]: E1205 12:31:07.530459 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:31:07.530723 master-0 kubenswrapper[4780]: I1205 12:31:07.529754 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:07.530998 master-0 kubenswrapper[4780]: E1205 12:31:07.530932 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:31:09.272039 master-0 kubenswrapper[4780]: I1205 12:31:09.271940 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Dec 05 12:31:09.529536 master-0 kubenswrapper[4780]: I1205 12:31:09.529450 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:09.529536 master-0 kubenswrapper[4780]: I1205 12:31:09.529488 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:09.529846 master-0 kubenswrapper[4780]: E1205 12:31:09.529568 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:31:09.529846 master-0 kubenswrapper[4780]: E1205 12:31:09.529691 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:31:10.054784 master-0 kubenswrapper[4780]: I1205 12:31:10.054676 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"bc3cbb74a19a81cb8c8983a0be4b89fe52a747534fa8fdd8c9d37eaea6fe9abf"} Dec 05 12:31:11.058948 master-0 kubenswrapper[4780]: I1205 12:31:11.058578 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5nqhk_f725fa37-ef11-479a-8cf9-f4b90fe5e7a1/kube-multus/0.log" Dec 05 12:31:11.058948 master-0 kubenswrapper[4780]: I1205 12:31:11.058954 4780 generic.go:334] "Generic (PLEG): container finished" podID="f725fa37-ef11-479a-8cf9-f4b90fe5e7a1" containerID="60d32869d5d76c04555375fdfd9ab0f008a07a41f85b96737cde09fadc0deeb4" exitCode=1 Dec 05 12:31:11.059666 master-0 kubenswrapper[4780]: I1205 12:31:11.058991 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5nqhk" event={"ID":"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1","Type":"ContainerDied","Data":"60d32869d5d76c04555375fdfd9ab0f008a07a41f85b96737cde09fadc0deeb4"} Dec 05 12:31:11.059666 master-0 kubenswrapper[4780]: I1205 12:31:11.059394 4780 scope.go:117] "RemoveContainer" containerID="60d32869d5d76c04555375fdfd9ab0f008a07a41f85b96737cde09fadc0deeb4" Dec 05 12:31:11.312526 master-0 kubenswrapper[4780]: I1205 12:31:11.312410 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=2.312388537 podStartE2EDuration="2.312388537s" podCreationTimestamp="2025-12-05 12:31:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:31:11.311886194 +0000 UTC m=+113.334626940" watchObservedRunningTime="2025-12-05 12:31:11.312388537 +0000 UTC m=+113.335129263" Dec 05 12:31:11.530393 master-0 kubenswrapper[4780]: I1205 12:31:11.530358 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:11.530545 master-0 kubenswrapper[4780]: I1205 12:31:11.530372 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:11.530577 master-0 kubenswrapper[4780]: E1205 12:31:11.530526 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:31:11.530705 master-0 kubenswrapper[4780]: E1205 12:31:11.530664 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:31:12.066059 master-0 kubenswrapper[4780]: I1205 12:31:12.065995 4780 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5nqhk_f725fa37-ef11-479a-8cf9-f4b90fe5e7a1/kube-multus/0.log" Dec 05 12:31:12.066893 master-0 kubenswrapper[4780]: I1205 12:31:12.066102 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5nqhk" event={"ID":"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1","Type":"ContainerStarted","Data":"714e28f97e2ec6d00e1683c4d2537a164bb01931e5ad5b6860350da680801a09"} Dec 05 12:31:12.071217 master-0 kubenswrapper[4780]: I1205 12:31:12.071154 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"9ca4e2fef100193f66453d974a3aeb277071826fa1a2d4b62430670342a3f96c"} Dec 05 12:31:12.072236 master-0 kubenswrapper[4780]: I1205 12:31:12.072199 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:12.072294 master-0 kubenswrapper[4780]: I1205 12:31:12.072243 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:12.072294 master-0 kubenswrapper[4780]: I1205 12:31:12.072262 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:12.098058 master-0 kubenswrapper[4780]: I1205 12:31:12.097993 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:12.100020 master-0 kubenswrapper[4780]: I1205 12:31:12.099990 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:13.544470 master-0 kubenswrapper[4780]: I1205 12:31:13.544388 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:13.544470 master-0 kubenswrapper[4780]: I1205 12:31:13.544441 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:13.545228 master-0 kubenswrapper[4780]: E1205 12:31:13.544547 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:31:13.545228 master-0 kubenswrapper[4780]: E1205 12:31:13.544882 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:31:15.530218 master-0 kubenswrapper[4780]: I1205 12:31:15.530099 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:15.530218 master-0 kubenswrapper[4780]: I1205 12:31:15.530159 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:15.531397 master-0 kubenswrapper[4780]: E1205 12:31:15.530298 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:31:15.531397 master-0 kubenswrapper[4780]: E1205 12:31:15.530529 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:31:17.340315 master-0 kubenswrapper[4780]: I1205 12:31:17.340150 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:17.341100 master-0 kubenswrapper[4780]: E1205 12:31:17.340491 4780 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 12:31:17.341100 master-0 kubenswrapper[4780]: E1205 12:31:17.340684 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs podName:fb7003a6-4341-49eb-bec3-76ba8610fa12 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:21.340626418 +0000 UTC m=+183.363367234 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs") pod "network-metrics-daemon-99djw" (UID: "fb7003a6-4341-49eb-bec3-76ba8610fa12") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 12:31:17.530160 master-0 kubenswrapper[4780]: I1205 12:31:17.530067 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:17.530397 master-0 kubenswrapper[4780]: I1205 12:31:17.530087 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:17.530397 master-0 kubenswrapper[4780]: E1205 12:31:17.530252 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:31:17.530499 master-0 kubenswrapper[4780]: E1205 12:31:17.530385 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:31:18.229865 master-0 kubenswrapper[4780]: E1205 12:31:18.229798 4780 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 05 12:31:18.570646 master-0 kubenswrapper[4780]: E1205 12:31:18.570383 4780 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 12:31:19.529754 master-0 kubenswrapper[4780]: I1205 12:31:19.529624 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:19.529754 master-0 kubenswrapper[4780]: I1205 12:31:19.529724 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:19.530058 master-0 kubenswrapper[4780]: E1205 12:31:19.529915 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:31:19.530101 master-0 kubenswrapper[4780]: E1205 12:31:19.530056 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:31:19.652205 master-0 kubenswrapper[4780]: I1205 12:31:19.652091 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" podStartSLOduration=17.652069072 podStartE2EDuration="17.652069072s" podCreationTimestamp="2025-12-05 12:31:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:31:19.651351943 +0000 UTC m=+121.674092689" watchObservedRunningTime="2025-12-05 12:31:19.652069072 +0000 UTC m=+121.674809798" Dec 05 12:31:21.530062 master-0 kubenswrapper[4780]: I1205 12:31:21.529831 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:21.530062 master-0 kubenswrapper[4780]: I1205 12:31:21.529944 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:21.530062 master-0 kubenswrapper[4780]: E1205 12:31:21.529995 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:31:21.530913 master-0 kubenswrapper[4780]: E1205 12:31:21.530168 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:31:23.529890 master-0 kubenswrapper[4780]: I1205 12:31:23.529769 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:23.531341 master-0 kubenswrapper[4780]: I1205 12:31:23.529795 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:23.531341 master-0 kubenswrapper[4780]: E1205 12:31:23.530006 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:31:23.531341 master-0 kubenswrapper[4780]: E1205 12:31:23.530246 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:31:23.572464 master-0 kubenswrapper[4780]: E1205 12:31:23.572356 4780 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 12:31:25.530604 master-0 kubenswrapper[4780]: I1205 12:31:25.530059 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:25.531228 master-0 kubenswrapper[4780]: I1205 12:31:25.530251 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:25.531228 master-0 kubenswrapper[4780]: E1205 12:31:25.530777 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:31:25.531228 master-0 kubenswrapper[4780]: E1205 12:31:25.530978 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:31:27.529807 master-0 kubenswrapper[4780]: I1205 12:31:27.529701 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:27.530587 master-0 kubenswrapper[4780]: I1205 12:31:27.529791 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:27.530587 master-0 kubenswrapper[4780]: E1205 12:31:27.529877 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qsggt" podUID="f4a70855-80b5-4d6a-bed1-b42364940de0" Dec 05 12:31:27.530587 master-0 kubenswrapper[4780]: E1205 12:31:27.529995 4780 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-99djw" podUID="fb7003a6-4341-49eb-bec3-76ba8610fa12" Dec 05 12:31:29.530493 master-0 kubenswrapper[4780]: I1205 12:31:29.530369 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:29.530493 master-0 kubenswrapper[4780]: I1205 12:31:29.530426 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:29.532948 master-0 kubenswrapper[4780]: I1205 12:31:29.532895 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 12:31:29.533525 master-0 kubenswrapper[4780]: I1205 12:31:29.533475 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 12:31:29.534282 master-0 kubenswrapper[4780]: I1205 12:31:29.533726 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 12:31:33.503982 master-0 kubenswrapper[4780]: I1205 12:31:33.503912 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:34.625966 master-0 kubenswrapper[4780]: I1205 12:31:34.625890 4780 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady" Dec 05 12:31:34.908885 master-0 kubenswrapper[4780]: I1205 12:31:34.908716 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt"] Dec 05 12:31:34.909278 master-0 kubenswrapper[4780]: I1205 12:31:34.909237 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:31:34.912360 master-0 kubenswrapper[4780]: I1205 12:31:34.912299 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 12:31:34.913122 master-0 kubenswrapper[4780]: I1205 12:31:34.913055 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 12:31:34.913122 master-0 kubenswrapper[4780]: I1205 12:31:34.913101 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 12:31:34.913411 master-0 kubenswrapper[4780]: I1205 12:31:34.913273 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 12:31:34.918230 master-0 kubenswrapper[4780]: I1205 12:31:34.917589 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw"] Dec 05 12:31:34.919834 master-0 kubenswrapper[4780]: I1205 12:31:34.918736 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:34.927327 master-0 kubenswrapper[4780]: I1205 12:31:34.923103 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-8649c48786-7xrk6"] Dec 05 12:31:34.927327 master-0 kubenswrapper[4780]: I1205 12:31:34.923725 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t"] Dec 05 12:31:34.927327 master-0 kubenswrapper[4780]: I1205 12:31:34.924097 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:34.927327 master-0 kubenswrapper[4780]: I1205 12:31:34.924356 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc"] Dec 05 12:31:34.927327 master-0 kubenswrapper[4780]: I1205 12:31:34.924729 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:34.927327 master-0 kubenswrapper[4780]: I1205 12:31:34.924867 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:34.930719 master-0 kubenswrapper[4780]: I1205 12:31:34.928890 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf"] Dec 05 12:31:34.930719 master-0 kubenswrapper[4780]: I1205 12:31:34.929804 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:31:34.930719 master-0 kubenswrapper[4780]: I1205 12:31:34.930556 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz"] Dec 05 12:31:34.931235 master-0 kubenswrapper[4780]: I1205 12:31:34.931109 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:34.932744 master-0 kubenswrapper[4780]: I1205 12:31:34.932549 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Dec 05 12:31:34.934910 master-0 kubenswrapper[4780]: I1205 12:31:34.932933 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq"] Dec 05 12:31:34.934910 master-0 kubenswrapper[4780]: I1205 12:31:34.933020 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 12:31:34.934910 master-0 kubenswrapper[4780]: I1205 12:31:34.933473 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:34.934910 master-0 kubenswrapper[4780]: I1205 12:31:34.933847 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 12:31:34.934910 master-0 kubenswrapper[4780]: I1205 12:31:34.933988 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 12:31:34.935332 master-0 kubenswrapper[4780]: I1205 12:31:34.935041 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Dec 05 12:31:34.935332 master-0 kubenswrapper[4780]: I1205 12:31:34.935295 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 12:31:34.935471 master-0 kubenswrapper[4780]: I1205 12:31:34.935407 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Dec 05 12:31:34.935566 master-0 kubenswrapper[4780]: I1205 12:31:34.935518 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Dec 05 12:31:34.935650 master-0 kubenswrapper[4780]: I1205 12:31:34.935595 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 12:31:34.935716 master-0 kubenswrapper[4780]: I1205 12:31:34.935690 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 12:31:34.935801 master-0 kubenswrapper[4780]: I1205 12:31:34.935773 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Dec 05 12:31:34.935801 master-0 kubenswrapper[4780]: I1205 12:31:34.935780 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 12:31:34.935946 master-0 kubenswrapper[4780]: I1205 12:31:34.935918 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Dec 05 12:31:34.936160 master-0 kubenswrapper[4780]: I1205 12:31:34.936093 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 12:31:34.936313 master-0 kubenswrapper[4780]: I1205 12:31:34.936174 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 12:31:34.936798 master-0 kubenswrapper[4780]: I1205 12:31:34.936749 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp"] Dec 05 12:31:34.936957 master-0 kubenswrapper[4780]: I1205 12:31:34.936911 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 12:31:34.937321 master-0 kubenswrapper[4780]: I1205 12:31:34.937285 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:34.937977 master-0 kubenswrapper[4780]: I1205 12:31:34.937936 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Dec 05 12:31:34.938063 master-0 kubenswrapper[4780]: I1205 12:31:34.938029 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 12:31:34.951208 master-0 kubenswrapper[4780]: I1205 12:31:34.950600 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv"] Dec 05 12:31:34.951506 master-0 kubenswrapper[4780]: I1205 12:31:34.951243 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24"] Dec 05 12:31:34.955795 master-0 kubenswrapper[4780]: I1205 12:31:34.951637 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:34.955795 master-0 kubenswrapper[4780]: I1205 12:31:34.952010 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:34.955795 master-0 kubenswrapper[4780]: I1205 12:31:34.954761 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 12:31:34.955795 master-0 kubenswrapper[4780]: I1205 12:31:34.955010 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 12:31:34.955795 master-0 kubenswrapper[4780]: I1205 12:31:34.955337 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 12:31:34.964160 master-0 kubenswrapper[4780]: I1205 12:31:34.964095 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp"] Dec 05 12:31:34.965207 master-0 kubenswrapper[4780]: I1205 12:31:34.964792 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c"] Dec 05 12:31:34.965207 master-0 kubenswrapper[4780]: I1205 12:31:34.965135 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 12:31:34.965318 master-0 kubenswrapper[4780]: I1205 12:31:34.965258 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv"] Dec 05 12:31:34.966838 master-0 kubenswrapper[4780]: I1205 12:31:34.965819 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:34.966838 master-0 kubenswrapper[4780]: I1205 12:31:34.965990 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 12:31:34.966838 master-0 kubenswrapper[4780]: I1205 12:31:34.966250 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 12:31:34.966838 master-0 kubenswrapper[4780]: I1205 12:31:34.966589 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 12:31:34.967020 master-0 kubenswrapper[4780]: I1205 12:31:34.966943 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 12:31:34.967686 master-0 kubenswrapper[4780]: I1205 12:31:34.967635 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 12:31:34.967838 master-0 kubenswrapper[4780]: I1205 12:31:34.967804 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv" Dec 05 12:31:34.968283 master-0 kubenswrapper[4780]: I1205 12:31:34.968261 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:34.968506 master-0 kubenswrapper[4780]: I1205 12:31:34.968459 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Dec 05 12:31:34.968778 master-0 kubenswrapper[4780]: I1205 12:31:34.968620 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 12:31:34.968778 master-0 kubenswrapper[4780]: I1205 12:31:34.968655 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 12:31:34.968859 master-0 kubenswrapper[4780]: I1205 12:31:34.968811 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 12:31:34.968859 master-0 kubenswrapper[4780]: I1205 12:31:34.968831 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 12:31:34.969012 master-0 kubenswrapper[4780]: I1205 12:31:34.968884 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 12:31:34.969012 master-0 kubenswrapper[4780]: I1205 12:31:34.968812 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 12:31:34.969012 master-0 kubenswrapper[4780]: I1205 12:31:34.968977 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 12:31:34.969095 master-0 kubenswrapper[4780]: I1205 12:31:34.969040 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq"] Dec 05 12:31:34.969207 master-0 kubenswrapper[4780]: I1205 12:31:34.969153 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 
12:31:34.969839 master-0 kubenswrapper[4780]: I1205 12:31:34.969805 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:34.970061 master-0 kubenswrapper[4780]: I1205 12:31:34.970034 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5"] Dec 05 12:31:34.970390 master-0 kubenswrapper[4780]: I1205 12:31:34.970362 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:31:34.970959 master-0 kubenswrapper[4780]: I1205 12:31:34.970934 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs"] Dec 05 12:31:34.971312 master-0 kubenswrapper[4780]: I1205 12:31:34.971289 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:34.972590 master-0 kubenswrapper[4780]: I1205 12:31:34.972559 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-f797b99b6-vwhxt"] Dec 05 12:31:34.972973 master-0 kubenswrapper[4780]: I1205 12:31:34.972941 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:34.973583 master-0 kubenswrapper[4780]: I1205 12:31:34.973517 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 12:31:34.973724 master-0 kubenswrapper[4780]: I1205 12:31:34.973693 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 12:31:34.973859 master-0 kubenswrapper[4780]: I1205 12:31:34.973826 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 12:31:34.974257 master-0 kubenswrapper[4780]: I1205 12:31:34.974172 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 12:31:34.974459 master-0 kubenswrapper[4780]: I1205 12:31:34.974428 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 12:31:34.974740 master-0 kubenswrapper[4780]: I1205 12:31:34.974706 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 12:31:34.974785 master-0 kubenswrapper[4780]: I1205 12:31:34.974736 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 12:31:34.975093 master-0 kubenswrapper[4780]: I1205 12:31:34.975062 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Dec 05 12:31:34.975299 master-0 kubenswrapper[4780]: I1205 12:31:34.975265 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z"] Dec 05 12:31:34.976989 master-0 kubenswrapper[4780]: I1205 12:31:34.976751 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 
12:31:34.976989 master-0 kubenswrapper[4780]: I1205 12:31:34.976816 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 12:31:34.977108 master-0 kubenswrapper[4780]: I1205 12:31:34.977029 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 12:31:34.977680 master-0 kubenswrapper[4780]: I1205 12:31:34.977315 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 12:31:34.977680 master-0 kubenswrapper[4780]: I1205 12:31:34.977375 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 12:31:34.977680 master-0 kubenswrapper[4780]: I1205 12:31:34.977536 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Dec 05 12:31:34.979391 master-0 kubenswrapper[4780]: I1205 12:31:34.978460 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt"] Dec 05 12:31:34.979391 master-0 kubenswrapper[4780]: I1205 12:31:34.978504 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-8649c48786-7xrk6"] Dec 05 12:31:34.979391 master-0 kubenswrapper[4780]: I1205 12:31:34.978522 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp"] Dec 05 12:31:34.979391 master-0 kubenswrapper[4780]: I1205 12:31:34.978633 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:34.979391 master-0 kubenswrapper[4780]: I1205 12:31:34.979437 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 12:31:34.980145 master-0 kubenswrapper[4780]: I1205 12:31:34.979458 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 12:31:34.980145 master-0 kubenswrapper[4780]: I1205 12:31:34.979677 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 12:31:34.980145 master-0 kubenswrapper[4780]: I1205 12:31:34.979772 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 12:31:34.980145 master-0 kubenswrapper[4780]: I1205 12:31:34.979857 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 12:31:34.980145 master-0 kubenswrapper[4780]: I1205 12:31:34.979939 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 12:31:34.980145 master-0 kubenswrapper[4780]: I1205 12:31:34.980017 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 12:31:34.980751 master-0 kubenswrapper[4780]: I1205 12:31:34.980717 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 
12:31:34.980876 master-0 kubenswrapper[4780]: I1205 12:31:34.980767 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t"] Dec 05 12:31:34.988495 master-0 kubenswrapper[4780]: I1205 12:31:34.985119 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 12:31:34.988495 master-0 kubenswrapper[4780]: I1205 12:31:34.986522 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq"] Dec 05 12:31:34.988495 master-0 kubenswrapper[4780]: I1205 12:31:34.986868 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 12:31:34.988495 master-0 kubenswrapper[4780]: I1205 12:31:34.986990 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 05 12:31:34.991141 master-0 kubenswrapper[4780]: I1205 12:31:34.991062 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24"] Dec 05 12:31:34.992408 master-0 kubenswrapper[4780]: I1205 12:31:34.991696 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq"] Dec 05 12:31:34.993618 master-0 kubenswrapper[4780]: I1205 12:31:34.993367 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 12:31:34.993618 master-0 kubenswrapper[4780]: I1205 12:31:34.993487 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv"] Dec 05 12:31:34.995577 master-0 kubenswrapper[4780]: I1205 12:31:34.994890 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz"] Dec 05 12:31:34.995793 master-0 kubenswrapper[4780]: I1205 12:31:34.995715 4780 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-nwplt"] Dec 05 12:31:35.003733 master-0 kubenswrapper[4780]: I1205 12:31:35.003465 4780 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 05 12:31:35.003733 master-0 kubenswrapper[4780]: I1205 12:31:35.003534 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 05 12:31:35.003900 master-0 kubenswrapper[4780]: I1205 12:31:35.003827 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 05 12:31:35.012397 master-0 kubenswrapper[4780]: I1205 12:31:35.012330 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc"] Dec 05 12:31:35.012574 master-0 kubenswrapper[4780]: I1205 12:31:35.012342 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 12:31:35.012574 master-0 kubenswrapper[4780]: I1205 12:31:35.012379 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5"] Dec 05 12:31:35.012574 master-0 kubenswrapper[4780]: I1205 12:31:35.012516 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:35.013021 master-0 kubenswrapper[4780]: I1205 12:31:35.012987 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp"] Dec 05 12:31:35.014707 master-0 kubenswrapper[4780]: I1205 12:31:35.014400 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 12:31:35.015565 master-0 kubenswrapper[4780]: I1205 12:31:35.014739 4780 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 12:31:35.015565 master-0 kubenswrapper[4780]: I1205 12:31:35.014697 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807d9093-aa67-4840-b5be-7f3abcc1beed-config\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:35.015565 master-0 kubenswrapper[4780]: I1205 12:31:35.014838 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tngh\" (UniqueName: \"kubernetes.io/projected/d53a4886-db25-43a1-825a-66a9a9a58590-kube-api-access-2tngh\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:35.015565 master-0 kubenswrapper[4780]: I1205 12:31:35.014879 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxxw7\" (UniqueName: \"kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-kube-api-access-fxxw7\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:35.015565 master-0 kubenswrapper[4780]: I1205 12:31:35.014955 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d53a4886-db25-43a1-825a-66a9a9a58590-config\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:35.015565 master-0 kubenswrapper[4780]: I1205 12:31:35.014982 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-bound-sa-token\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:35.015565 master-0 kubenswrapper[4780]: I1205 12:31:35.015257 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-config\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 
12:31:35.015565 master-0 kubenswrapper[4780]: I1205 12:31:35.015309 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:35.015565 master-0 kubenswrapper[4780]: I1205 12:31:35.015402 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:31:35.015565 master-0 kubenswrapper[4780]: I1205 12:31:35.015491 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3792522-fec6-4022-90ac-0b8467fcd625-config\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:35.015565 master-0 kubenswrapper[4780]: I1205 12:31:35.015549 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtvzs\" (UniqueName: \"kubernetes.io/projected/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-kube-api-access-dtvzs\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:31:35.015565 master-0 kubenswrapper[4780]: I1205 12:31:35.015575 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/807d9093-aa67-4840-b5be-7f3abcc1beed-kube-api-access\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:35.016841 master-0 kubenswrapper[4780]: I1205 12:31:35.015597 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-trusted-ca\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:35.016841 master-0 kubenswrapper[4780]: I1205 12:31:35.015645 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:35.016841 master-0 kubenswrapper[4780]: I1205 12:31:35.015666 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:35.016841 master-0 kubenswrapper[4780]: I1205 12:31:35.015717 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6mb6\" (UniqueName: \"kubernetes.io/projected/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-kube-api-access-z6mb6\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:35.016841 master-0 kubenswrapper[4780]: I1205 12:31:35.015740 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d53a4886-db25-43a1-825a-66a9a9a58590-serving-cert\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:35.016841 master-0 kubenswrapper[4780]: I1205 12:31:35.015774 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flxbg\" (UniqueName: \"kubernetes.io/projected/f3792522-fec6-4022-90ac-0b8467fcd625-kube-api-access-flxbg\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:35.016841 master-0 kubenswrapper[4780]: I1205 12:31:35.015794 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2acba71-b9dc-4b85-be35-c995b8be2f19-trusted-ca\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:35.016841 master-0 kubenswrapper[4780]: I1205 12:31:35.015816 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nml2g\" (UniqueName: \"kubernetes.io/projected/a2acba71-b9dc-4b85-be35-c995b8be2f19-kube-api-access-nml2g\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:35.016841 master-0 kubenswrapper[4780]: I1205 12:31:35.015838 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:35.016841 master-0 kubenswrapper[4780]: I1205 12:31:35.015899 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-operand-assets\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " 
pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:31:35.016841 master-0 kubenswrapper[4780]: I1205 12:31:35.015919 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3792522-fec6-4022-90ac-0b8467fcd625-serving-cert\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:35.016841 master-0 kubenswrapper[4780]: I1205 12:31:35.015944 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62nqj\" (UniqueName: \"kubernetes.io/projected/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-kube-api-access-62nqj\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:35.016841 master-0 kubenswrapper[4780]: I1205 12:31:35.015966 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-serving-cert\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:31:35.016841 master-0 kubenswrapper[4780]: I1205 12:31:35.016010 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph9w6\" (UniqueName: \"kubernetes.io/projected/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-kube-api-access-ph9w6\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:31:35.017528 master-0 kubenswrapper[4780]: I1205 12:31:35.016151 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:35.017528 master-0 kubenswrapper[4780]: I1205 12:31:35.016238 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807d9093-aa67-4840-b5be-7f3abcc1beed-serving-cert\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:35.021481 master-0 kubenswrapper[4780]: I1205 12:31:35.021442 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw"] Dec 05 12:31:35.022991 master-0 kubenswrapper[4780]: I1205 12:31:35.022940 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv"] Dec 05 12:31:35.025225 master-0 kubenswrapper[4780]: I1205 12:31:35.025202 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c"] 
Dec 05 12:31:35.026625 master-0 kubenswrapper[4780]: I1205 12:31:35.026555 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf"] Dec 05 12:31:35.027953 master-0 kubenswrapper[4780]: I1205 12:31:35.027925 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z"] Dec 05 12:31:35.029661 master-0 kubenswrapper[4780]: I1205 12:31:35.029396 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-f797b99b6-vwhxt"] Dec 05 12:31:35.031513 master-0 kubenswrapper[4780]: I1205 12:31:35.030936 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs"] Dec 05 12:31:35.116932 master-0 kubenswrapper[4780]: I1205 12:31:35.116849 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/594aaded-5615-4bed-87ee-6173059a73be-config\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:31:35.116932 master-0 kubenswrapper[4780]: I1205 12:31:35.116899 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26x2z\" (UniqueName: \"kubernetes.io/projected/49760d62-02e5-4882-b47f-663102b04946-kube-api-access-26x2z\") pod \"csi-snapshot-controller-operator-6bc8656fdc-zn7hv\" (UID: \"49760d62-02e5-4882-b47f-663102b04946\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv" Dec 05 12:31:35.116932 master-0 kubenswrapper[4780]: I1205 12:31:35.116929 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/58187662-b502-4d90-95ce-2aa91a81d256-telemetry-config\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:35.117167 master-0 kubenswrapper[4780]: I1205 12:31:35.117043 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1871a9d6-6369-4d08-816f-9c6310b61ddf-serving-cert\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:35.117167 master-0 kubenswrapper[4780]: I1205 12:31:35.117093 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:35.117167 master-0 kubenswrapper[4780]: I1205 12:31:35.117142 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807d9093-aa67-4840-b5be-7f3abcc1beed-config\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: 
\"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:35.118453 master-0 kubenswrapper[4780]: I1205 12:31:35.118393 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tngh\" (UniqueName: \"kubernetes.io/projected/d53a4886-db25-43a1-825a-66a9a9a58590-kube-api-access-2tngh\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:35.118517 master-0 kubenswrapper[4780]: I1205 12:31:35.118465 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxxw7\" (UniqueName: \"kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-kube-api-access-fxxw7\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:35.118517 master-0 kubenswrapper[4780]: I1205 12:31:35.118510 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-ca\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.118606 master-0 kubenswrapper[4780]: I1205 12:31:35.118557 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5hdg\" (UniqueName: \"kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-kube-api-access-t5hdg\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:35.118606 master-0 kubenswrapper[4780]: I1205 12:31:35.118587 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d53a4886-db25-43a1-825a-66a9a9a58590-config\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:35.118685 master-0 kubenswrapper[4780]: I1205 12:31:35.118615 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-service-ca-bundle\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:35.118685 master-0 kubenswrapper[4780]: I1205 12:31:35.118644 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-bound-sa-token\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:35.118685 master-0 kubenswrapper[4780]: I1205 12:31:35.118680 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-config\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:31:35.118809 master-0 kubenswrapper[4780]: I1205 12:31:35.118712 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807d9093-aa67-4840-b5be-7f3abcc1beed-config\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:35.118809 master-0 kubenswrapper[4780]: I1205 12:31:35.118716 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1871a9d6-6369-4d08-816f-9c6310b61ddf-config\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:35.118809 master-0 kubenswrapper[4780]: I1205 12:31:35.118768 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba095394-1873-4793-969d-3be979fa0771-serving-cert\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:35.118809 master-0 kubenswrapper[4780]: I1205 12:31:35.118802 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:35.119019 master-0 kubenswrapper[4780]: I1205 12:31:35.118826 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:35.119019 master-0 kubenswrapper[4780]: I1205 12:31:35.118847 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-iptables-alerter-script\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:35.119019 master-0 kubenswrapper[4780]: I1205 12:31:35.118867 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 
12:31:35.119019 master-0 kubenswrapper[4780]: I1205 12:31:35.118891 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-serving-cert\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:35.119019 master-0 kubenswrapper[4780]: I1205 12:31:35.118922 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3792522-fec6-4022-90ac-0b8467fcd625-config\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:35.119019 master-0 kubenswrapper[4780]: I1205 12:31:35.118967 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-host-slash\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:35.119019 master-0 kubenswrapper[4780]: I1205 12:31:35.118985 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55qpg\" (UniqueName: \"kubernetes.io/projected/ba095394-1873-4793-969d-3be979fa0771-kube-api-access-55qpg\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:35.119019 master-0 kubenswrapper[4780]: I1205 12:31:35.119007 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-service-ca\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.119019 master-0 kubenswrapper[4780]: I1205 12:31:35.119026 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtvzs\" (UniqueName: \"kubernetes.io/projected/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-kube-api-access-dtvzs\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:31:35.119384 master-0 kubenswrapper[4780]: I1205 12:31:35.119048 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-config\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:35.119384 master-0 kubenswrapper[4780]: I1205 12:31:35.119065 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/807d9093-aa67-4840-b5be-7f3abcc1beed-kube-api-access\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:35.119384 master-0 kubenswrapper[4780]: I1205 12:31:35.119082 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-config\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.119384 master-0 kubenswrapper[4780]: I1205 12:31:35.119099 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-serving-cert\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.119384 master-0 kubenswrapper[4780]: I1205 12:31:35.119140 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-trusted-ca\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:35.119575 master-0 kubenswrapper[4780]: I1205 12:31:35.119421 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-trusted-ca\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:35.119575 master-0 kubenswrapper[4780]: E1205 12:31:35.119468 4780 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 05 12:31:35.119575 master-0 kubenswrapper[4780]: E1205 12:31:35.119537 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert podName:0dda6d9b-cb3a-413a-85af-ef08f15ea42e nodeName:}" failed. No retries permitted until 2025-12-05 12:31:35.619517333 +0000 UTC m=+137.642258069 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-9vfxw" (UID: "0dda6d9b-cb3a-413a-85af-ef08f15ea42e") : secret "package-server-manager-serving-cert" not found Dec 05 12:31:35.119995 master-0 kubenswrapper[4780]: I1205 12:31:35.119470 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtjln\" (UniqueName: \"kubernetes.io/projected/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-kube-api-access-xtjln\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:35.122399 master-0 kubenswrapper[4780]: I1205 12:31:35.120353 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-config\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:31:35.122399 master-0 kubenswrapper[4780]: I1205 12:31:35.120638 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d53a4886-db25-43a1-825a-66a9a9a58590-config\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:35.122399 master-0 kubenswrapper[4780]: I1205 12:31:35.120911 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-trusted-ca\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:35.122612 master-0 kubenswrapper[4780]: I1205 12:31:35.122559 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3792522-fec6-4022-90ac-0b8467fcd625-config\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:35.123352 master-0 kubenswrapper[4780]: I1205 12:31:35.119947 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps4ws\" (UniqueName: \"kubernetes.io/projected/58187662-b502-4d90-95ce-2aa91a81d256-kube-api-access-ps4ws\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:35.123352 master-0 kubenswrapper[4780]: I1205 12:31:35.122755 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/594aaded-5615-4bed-87ee-6173059a73be-kube-api-access\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 
05 12:31:35.123352 master-0 kubenswrapper[4780]: I1205 12:31:35.122789 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38941513-e968-45f1-9cb2-b63d40338f36-trusted-ca\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:35.123352 master-0 kubenswrapper[4780]: I1205 12:31:35.122824 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:35.123352 master-0 kubenswrapper[4780]: I1205 12:31:35.122868 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2gd8\" (UniqueName: \"kubernetes.io/projected/1e6babfe-724a-4eab-bb3b-bc318bf57b70-kube-api-access-c2gd8\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:35.123352 master-0 kubenswrapper[4780]: I1205 12:31:35.122899 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:35.123352 master-0 kubenswrapper[4780]: I1205 12:31:35.122924 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6mb6\" (UniqueName: \"kubernetes.io/projected/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-kube-api-access-z6mb6\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:35.123352 master-0 kubenswrapper[4780]: I1205 12:31:35.122966 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d53a4886-db25-43a1-825a-66a9a9a58590-serving-cert\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:35.123352 master-0 kubenswrapper[4780]: I1205 12:31:35.122993 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-client\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.123352 master-0 kubenswrapper[4780]: I1205 12:31:35.123015 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/594aaded-5615-4bed-87ee-6173059a73be-serving-cert\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: 
\"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:31:35.123352 master-0 kubenswrapper[4780]: I1205 12:31:35.123056 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flxbg\" (UniqueName: \"kubernetes.io/projected/f3792522-fec6-4022-90ac-0b8467fcd625-kube-api-access-flxbg\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:35.123352 master-0 kubenswrapper[4780]: I1205 12:31:35.123085 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2acba71-b9dc-4b85-be35-c995b8be2f19-trusted-ca\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:35.123352 master-0 kubenswrapper[4780]: I1205 12:31:35.123135 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nml2g\" (UniqueName: \"kubernetes.io/projected/a2acba71-b9dc-4b85-be35-c995b8be2f19-kube-api-access-nml2g\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:35.123352 master-0 kubenswrapper[4780]: I1205 12:31:35.123172 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwcr9\" (UniqueName: \"kubernetes.io/projected/f119ffe4-16bd-49eb-916d-b18ba0d79b54-kube-api-access-wwcr9\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.123842 master-0 kubenswrapper[4780]: I1205 12:31:35.123229 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:35.123842 master-0 kubenswrapper[4780]: I1205 12:31:35.123279 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:35.123842 master-0 kubenswrapper[4780]: I1205 12:31:35.123307 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-operand-assets\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:31:35.123842 master-0 kubenswrapper[4780]: I1205 12:31:35.123327 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f3792522-fec6-4022-90ac-0b8467fcd625-serving-cert\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:35.123842 master-0 kubenswrapper[4780]: I1205 12:31:35.123350 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62nqj\" (UniqueName: \"kubernetes.io/projected/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-kube-api-access-62nqj\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:35.123842 master-0 kubenswrapper[4780]: I1205 12:31:35.123373 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-serving-cert\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:31:35.123842 master-0 kubenswrapper[4780]: I1205 12:31:35.123405 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlnqb\" (UniqueName: \"kubernetes.io/projected/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-kube-api-access-mlnqb\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:35.123842 master-0 kubenswrapper[4780]: I1205 12:31:35.123430 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-bound-sa-token\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:35.123842 master-0 kubenswrapper[4780]: I1205 12:31:35.123452 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph9w6\" (UniqueName: \"kubernetes.io/projected/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-kube-api-access-ph9w6\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:31:35.123842 master-0 kubenswrapper[4780]: I1205 12:31:35.123476 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:35.123842 master-0 kubenswrapper[4780]: I1205 12:31:35.123503 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-trusted-ca-bundle\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:35.123842 master-0 kubenswrapper[4780]: I1205 
12:31:35.123526 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-config\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:35.123842 master-0 kubenswrapper[4780]: I1205 12:31:35.123550 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:35.123842 master-0 kubenswrapper[4780]: I1205 12:31:35.123574 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5p5d\" (UniqueName: \"kubernetes.io/projected/5f0c6889-0739-48a3-99cd-6db9d1f83242-kube-api-access-p5p5d\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:35.123842 master-0 kubenswrapper[4780]: I1205 12:31:35.123599 4780 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1871a9d6-6369-4d08-816f-9c6310b61ddf-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:35.124261 master-0 kubenswrapper[4780]: I1205 12:31:35.123623 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807d9093-aa67-4840-b5be-7f3abcc1beed-serving-cert\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:35.125120 master-0 kubenswrapper[4780]: I1205 12:31:35.124712 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-operand-assets\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:31:35.125120 master-0 kubenswrapper[4780]: I1205 12:31:35.124842 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2acba71-b9dc-4b85-be35-c995b8be2f19-trusted-ca\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:35.125818 master-0 kubenswrapper[4780]: E1205 12:31:35.123549 4780 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 05 12:31:35.125818 master-0 kubenswrapper[4780]: E1205 12:31:35.125543 4780 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" 
not found Dec 05 12:31:35.125818 master-0 kubenswrapper[4780]: E1205 12:31:35.125566 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:35.62553995 +0000 UTC m=+137.648280866 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "node-tuning-operator-tls" not found Dec 05 12:31:35.125818 master-0 kubenswrapper[4780]: E1205 12:31:35.123660 4780 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:35.125818 master-0 kubenswrapper[4780]: E1205 12:31:35.124268 4780 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 05 12:31:35.125818 master-0 kubenswrapper[4780]: E1205 12:31:35.125599 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:35.625582082 +0000 UTC m=+137.648322808 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:35.125818 master-0 kubenswrapper[4780]: E1205 12:31:35.125753 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls podName:a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:35.625712925 +0000 UTC m=+137.648453871 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls") pod "ingress-operator-8649c48786-7xrk6" (UID: "a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7") : secret "metrics-tls" not found Dec 05 12:31:35.125818 master-0 kubenswrapper[4780]: E1205 12:31:35.125789 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs podName:cfc37275-4e59-4f73-8b08-c8ca8ec28bbb nodeName:}" failed. No retries permitted until 2025-12-05 12:31:35.625776477 +0000 UTC m=+137.648517213 (durationBeforeRetry 500ms). 
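
The "Couldn't get secret" errors above mean the referenced Secrets (node-tuning-operator-tls, performance-addon-operator-webhook-cert, metrics-tls, multus-admission-controller-secret) do not exist yet in their namespaces, so kubelet cannot finish MountVolume.SetUp and the affected operator pods stay in ContainerCreating until the owning operators publish them. A minimal diagnostic sketch, not part of the kubelet, that checks whether some of those Secrets have appeared; it assumes cluster access via the default kubeconfig, and the namespace/name pairs are copied from the log lines above:

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Load ~/.kube/config; inside a pod, rest.InClusterConfig() would be used instead.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Namespace -> Secret name pairs taken from the mount errors in this log.
	refs := map[string]string{
		"openshift-cluster-node-tuning-operator": "node-tuning-operator-tls",
		"openshift-ingress-operator":             "metrics-tls",
		"openshift-multus":                       "multus-admission-controller-secret",
	}
	for ns, name := range refs {
		if _, err := cs.CoreV1().Secrets(ns).Get(context.Background(), name, metav1.GetOptions{}); err != nil {
			fmt.Printf("%s/%s: %v\n", ns, name, err)
			continue
		}
		fmt.Printf("%s/%s: present\n", ns, name)
	}
}
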
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs") pod "multus-admission-controller-7dfc5b745f-xlrzq" (UID: "cfc37275-4e59-4f73-8b08-c8ca8ec28bbb") : secret "multus-admission-controller-secret" not found Dec 05 12:31:35.128876 master-0 kubenswrapper[4780]: I1205 12:31:35.128845 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-serving-cert\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:31:35.129249 master-0 kubenswrapper[4780]: I1205 12:31:35.129209 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3792522-fec6-4022-90ac-0b8467fcd625-serving-cert\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:35.131535 master-0 kubenswrapper[4780]: I1205 12:31:35.131483 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d53a4886-db25-43a1-825a-66a9a9a58590-serving-cert\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:35.133207 master-0 kubenswrapper[4780]: I1205 12:31:35.133142 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807d9093-aa67-4840-b5be-7f3abcc1beed-serving-cert\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:35.135161 master-0 kubenswrapper[4780]: I1205 12:31:35.135117 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:31:35.135980 master-0 kubenswrapper[4780]: I1205 12:31:35.135943 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tngh\" (UniqueName: \"kubernetes.io/projected/d53a4886-db25-43a1-825a-66a9a9a58590-kube-api-access-2tngh\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:35.138579 master-0 kubenswrapper[4780]: I1205 12:31:35.138361 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxxw7\" (UniqueName: \"kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-kube-api-access-fxxw7\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:35.145583 master-0 kubenswrapper[4780]: I1205 
12:31:35.144050 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-bound-sa-token\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:35.145583 master-0 kubenswrapper[4780]: I1205 12:31:35.144625 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/807d9093-aa67-4840-b5be-7f3abcc1beed-kube-api-access\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:35.145583 master-0 kubenswrapper[4780]: I1205 12:31:35.144869 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtvzs\" (UniqueName: \"kubernetes.io/projected/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-kube-api-access-dtvzs\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:31:35.149734 master-0 kubenswrapper[4780]: I1205 12:31:35.149671 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6mb6\" (UniqueName: \"kubernetes.io/projected/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-kube-api-access-z6mb6\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:35.151816 master-0 kubenswrapper[4780]: I1205 12:31:35.151763 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nml2g\" (UniqueName: \"kubernetes.io/projected/a2acba71-b9dc-4b85-be35-c995b8be2f19-kube-api-access-nml2g\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:35.153516 master-0 kubenswrapper[4780]: I1205 12:31:35.153452 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62nqj\" (UniqueName: \"kubernetes.io/projected/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-kube-api-access-62nqj\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:35.154321 master-0 kubenswrapper[4780]: I1205 12:31:35.154283 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flxbg\" (UniqueName: \"kubernetes.io/projected/f3792522-fec6-4022-90ac-0b8467fcd625-kube-api-access-flxbg\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:35.157173 master-0 kubenswrapper[4780]: I1205 12:31:35.157141 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph9w6\" (UniqueName: \"kubernetes.io/projected/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-kube-api-access-ph9w6\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " 
pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:31:35.224967 master-0 kubenswrapper[4780]: I1205 12:31:35.224375 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-ca\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.224967 master-0 kubenswrapper[4780]: I1205 12:31:35.224437 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-service-ca-bundle\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:35.224967 master-0 kubenswrapper[4780]: I1205 12:31:35.224464 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5hdg\" (UniqueName: \"kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-kube-api-access-t5hdg\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:35.225276 master-0 kubenswrapper[4780]: I1205 12:31:35.225089 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-ca\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.225276 master-0 kubenswrapper[4780]: I1205 12:31:35.225160 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1871a9d6-6369-4d08-816f-9c6310b61ddf-config\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:35.225276 master-0 kubenswrapper[4780]: I1205 12:31:35.225205 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba095394-1873-4793-969d-3be979fa0771-serving-cert\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:35.225374 master-0 kubenswrapper[4780]: I1205 12:31:35.225336 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-service-ca-bundle\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:35.225514 master-0 kubenswrapper[4780]: I1205 12:31:35.225464 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " 
pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:35.225704 master-0 kubenswrapper[4780]: I1205 12:31:35.225673 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-iptables-alerter-script\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:35.225740 master-0 kubenswrapper[4780]: I1205 12:31:35.225710 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-serving-cert\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:35.225775 master-0 kubenswrapper[4780]: I1205 12:31:35.225746 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-host-slash\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:35.225827 master-0 kubenswrapper[4780]: E1205 12:31:35.225807 4780 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 05 12:31:35.225872 master-0 kubenswrapper[4780]: I1205 12:31:35.225854 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55qpg\" (UniqueName: \"kubernetes.io/projected/ba095394-1873-4793-969d-3be979fa0771-kube-api-access-55qpg\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:35.225901 master-0 kubenswrapper[4780]: I1205 12:31:35.225892 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-service-ca\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.225929 master-0 kubenswrapper[4780]: I1205 12:31:35.225919 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-config\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.225960 master-0 kubenswrapper[4780]: I1205 12:31:35.225939 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-config\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:35.225960 master-0 kubenswrapper[4780]: I1205 12:31:35.225937 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-host-slash\") pod \"iptables-alerter-nwplt\" 
(UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:35.226026 master-0 kubenswrapper[4780]: I1205 12:31:35.225962 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-serving-cert\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.226026 master-0 kubenswrapper[4780]: I1205 12:31:35.226011 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-trusted-ca\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:35.226119 master-0 kubenswrapper[4780]: E1205 12:31:35.226030 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics podName:1e6babfe-724a-4eab-bb3b-bc318bf57b70 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:35.72601611 +0000 UTC m=+137.748756826 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-vwhxt" (UID: "1e6babfe-724a-4eab-bb3b-bc318bf57b70") : secret "marketplace-operator-metrics" not found Dec 05 12:31:35.226119 master-0 kubenswrapper[4780]: I1205 12:31:35.226072 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtjln\" (UniqueName: \"kubernetes.io/projected/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-kube-api-access-xtjln\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:35.226119 master-0 kubenswrapper[4780]: I1205 12:31:35.226094 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps4ws\" (UniqueName: \"kubernetes.io/projected/58187662-b502-4d90-95ce-2aa91a81d256-kube-api-access-ps4ws\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:35.226348 master-0 kubenswrapper[4780]: I1205 12:31:35.226145 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38941513-e968-45f1-9cb2-b63d40338f36-trusted-ca\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:35.226517 master-0 kubenswrapper[4780]: I1205 12:31:35.226368 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/594aaded-5615-4bed-87ee-6173059a73be-kube-api-access\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:31:35.226517 
master-0 kubenswrapper[4780]: I1205 12:31:35.226476 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1871a9d6-6369-4d08-816f-9c6310b61ddf-config\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:35.227126 master-0 kubenswrapper[4780]: I1205 12:31:35.227095 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-service-ca\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.227126 master-0 kubenswrapper[4780]: I1205 12:31:35.227113 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-iptables-alerter-script\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:35.227405 master-0 kubenswrapper[4780]: I1205 12:31:35.227164 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2gd8\" (UniqueName: \"kubernetes.io/projected/1e6babfe-724a-4eab-bb3b-bc318bf57b70-kube-api-access-c2gd8\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:35.227471 master-0 kubenswrapper[4780]: I1205 12:31:35.227398 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-config\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.227471 master-0 kubenswrapper[4780]: I1205 12:31:35.227436 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-client\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.227471 master-0 kubenswrapper[4780]: I1205 12:31:35.227345 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-config\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:35.227592 master-0 kubenswrapper[4780]: I1205 12:31:35.227470 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/594aaded-5615-4bed-87ee-6173059a73be-serving-cert\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:31:35.227592 master-0 kubenswrapper[4780]: I1205 12:31:35.227556 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wwcr9\" (UniqueName: \"kubernetes.io/projected/f119ffe4-16bd-49eb-916d-b18ba0d79b54-kube-api-access-wwcr9\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.227715 master-0 kubenswrapper[4780]: I1205 12:31:35.227649 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:35.227771 master-0 kubenswrapper[4780]: I1205 12:31:35.227714 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlnqb\" (UniqueName: \"kubernetes.io/projected/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-kube-api-access-mlnqb\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:35.227771 master-0 kubenswrapper[4780]: I1205 12:31:35.227752 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-bound-sa-token\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:35.227844 master-0 kubenswrapper[4780]: E1205 12:31:35.227794 4780 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:35.227878 master-0 kubenswrapper[4780]: E1205 12:31:35.227845 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls podName:58187662-b502-4d90-95ce-2aa91a81d256 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:35.727828028 +0000 UTC m=+137.750568754 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-lgc7z" (UID: "58187662-b502-4d90-95ce-2aa91a81d256") : secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:35.227878 master-0 kubenswrapper[4780]: I1205 12:31:35.227796 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-trusted-ca-bundle\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:35.227967 master-0 kubenswrapper[4780]: I1205 12:31:35.227880 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-config\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:35.227967 master-0 kubenswrapper[4780]: I1205 12:31:35.227905 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5p5d\" (UniqueName: \"kubernetes.io/projected/5f0c6889-0739-48a3-99cd-6db9d1f83242-kube-api-access-p5p5d\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:35.227967 master-0 kubenswrapper[4780]: I1205 12:31:35.227929 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1871a9d6-6369-4d08-816f-9c6310b61ddf-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:35.227967 master-0 kubenswrapper[4780]: I1205 12:31:35.227959 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:35.228122 master-0 kubenswrapper[4780]: I1205 12:31:35.227980 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26x2z\" (UniqueName: \"kubernetes.io/projected/49760d62-02e5-4882-b47f-663102b04946-kube-api-access-26x2z\") pod \"csi-snapshot-controller-operator-6bc8656fdc-zn7hv\" (UID: \"49760d62-02e5-4882-b47f-663102b04946\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv" Dec 05 12:31:35.228122 master-0 kubenswrapper[4780]: I1205 12:31:35.227999 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38941513-e968-45f1-9cb2-b63d40338f36-trusted-ca\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:35.228122 master-0 kubenswrapper[4780]: I1205 12:31:35.228039 4780 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/594aaded-5615-4bed-87ee-6173059a73be-config\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:31:35.228122 master-0 kubenswrapper[4780]: I1205 12:31:35.228070 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1871a9d6-6369-4d08-816f-9c6310b61ddf-serving-cert\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:35.228122 master-0 kubenswrapper[4780]: I1205 12:31:35.228098 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:35.228482 master-0 kubenswrapper[4780]: I1205 12:31:35.228131 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/58187662-b502-4d90-95ce-2aa91a81d256-telemetry-config\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:35.228482 master-0 kubenswrapper[4780]: E1205 12:31:35.228353 4780 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 05 12:31:35.228482 master-0 kubenswrapper[4780]: E1205 12:31:35.228404 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls podName:38941513-e968-45f1-9cb2-b63d40338f36 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:35.728387783 +0000 UTC m=+137.751128509 (durationBeforeRetry 500ms). 
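
The records in this unit follow the klog header convention inside a journald prefix: severity letter (I/W/E/F), MMDD, wall-clock time, PID, source file:line, then the message. A small triage sketch, assuming that layout, which keeps only the error-level records from a capture like this one (for example, output piped in from journalctl for the kubelet unit; the exact unit name depends on the host):

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches headers such as: E1205 12:31:35.639091 4780 secret.go:189] Couldn't get secret ...
var klogHeader = regexp.MustCompile(`([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+)\s+([\w.]+:\d+)\] (.*)`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journald lines can be long
	for sc.Scan() {
		m := klogHeader.FindStringSubmatch(sc.Text())
		if m == nil || m[1] != "E" {
			continue // keep error-level records only
		}
		// m[2]=MMDD, m[3]=time, m[5]=source file:line, m[6]=message
		fmt.Printf("%s %s %s %s\n", m[2], m[3], m[5], m[6])
	}
}
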
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-sxxpq" (UID: "38941513-e968-45f1-9cb2-b63d40338f36") : secret "image-registry-operator-tls" not found Dec 05 12:31:35.228760 master-0 kubenswrapper[4780]: I1205 12:31:35.228709 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-trusted-ca\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:35.228926 master-0 kubenswrapper[4780]: I1205 12:31:35.228901 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-config\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:35.228988 master-0 kubenswrapper[4780]: E1205 12:31:35.228931 4780 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:35.229035 master-0 kubenswrapper[4780]: E1205 12:31:35.228995 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls podName:5f0c6889-0739-48a3-99cd-6db9d1f83242 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:35.728977108 +0000 UTC m=+137.751717854 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls") pod "dns-operator-7c56cf9b74-z9g7c" (UID: "5f0c6889-0739-48a3-99cd-6db9d1f83242") : secret "metrics-tls" not found Dec 05 12:31:35.229307 master-0 kubenswrapper[4780]: I1205 12:31:35.229278 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-trusted-ca-bundle\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:35.229376 master-0 kubenswrapper[4780]: I1205 12:31:35.229314 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/58187662-b502-4d90-95ce-2aa91a81d256-telemetry-config\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:35.229424 master-0 kubenswrapper[4780]: I1205 12:31:35.229369 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/594aaded-5615-4bed-87ee-6173059a73be-config\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:31:35.229683 master-0 kubenswrapper[4780]: I1205 12:31:35.229632 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ba095394-1873-4793-969d-3be979fa0771-serving-cert\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:35.230613 master-0 kubenswrapper[4780]: I1205 12:31:35.230577 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-serving-cert\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.231770 master-0 kubenswrapper[4780]: I1205 12:31:35.231745 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1871a9d6-6369-4d08-816f-9c6310b61ddf-serving-cert\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:35.231983 master-0 kubenswrapper[4780]: I1205 12:31:35.231949 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-client\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.232731 master-0 kubenswrapper[4780]: I1205 12:31:35.232681 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-serving-cert\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:35.233071 master-0 kubenswrapper[4780]: I1205 12:31:35.232991 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/594aaded-5615-4bed-87ee-6173059a73be-serving-cert\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:31:35.237254 master-0 kubenswrapper[4780]: I1205 12:31:35.237215 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:31:35.243052 master-0 kubenswrapper[4780]: I1205 12:31:35.243007 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtjln\" (UniqueName: \"kubernetes.io/projected/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-kube-api-access-xtjln\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:35.243241 master-0 kubenswrapper[4780]: I1205 12:31:35.243194 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5hdg\" (UniqueName: \"kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-kube-api-access-t5hdg\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:35.245059 master-0 kubenswrapper[4780]: I1205 12:31:35.245019 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps4ws\" (UniqueName: \"kubernetes.io/projected/58187662-b502-4d90-95ce-2aa91a81d256-kube-api-access-ps4ws\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:35.265817 master-0 kubenswrapper[4780]: I1205 12:31:35.265737 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55qpg\" (UniqueName: \"kubernetes.io/projected/ba095394-1873-4793-969d-3be979fa0771-kube-api-access-55qpg\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:35.285512 master-0 kubenswrapper[4780]: I1205 12:31:35.285455 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/594aaded-5615-4bed-87ee-6173059a73be-kube-api-access\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:31:35.289070 master-0 kubenswrapper[4780]: I1205 12:31:35.289028 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:35.306411 master-0 kubenswrapper[4780]: I1205 12:31:35.306365 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2gd8\" (UniqueName: \"kubernetes.io/projected/1e6babfe-724a-4eab-bb3b-bc318bf57b70-kube-api-access-c2gd8\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:35.321721 master-0 kubenswrapper[4780]: I1205 12:31:35.321679 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:31:35.328916 master-0 kubenswrapper[4780]: I1205 12:31:35.328877 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:35.329784 master-0 kubenswrapper[4780]: I1205 12:31:35.329759 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwcr9\" (UniqueName: \"kubernetes.io/projected/f119ffe4-16bd-49eb-916d-b18ba0d79b54-kube-api-access-wwcr9\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.342959 master-0 kubenswrapper[4780]: I1205 12:31:35.342906 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:35.348407 master-0 kubenswrapper[4780]: I1205 12:31:35.348360 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1871a9d6-6369-4d08-816f-9c6310b61ddf-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:35.351392 master-0 kubenswrapper[4780]: I1205 12:31:35.350513 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:35.357672 master-0 kubenswrapper[4780]: I1205 12:31:35.357639 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:35.367741 master-0 kubenswrapper[4780]: I1205 12:31:35.367670 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:35.378052 master-0 kubenswrapper[4780]: I1205 12:31:35.377984 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5p5d\" (UniqueName: \"kubernetes.io/projected/5f0c6889-0739-48a3-99cd-6db9d1f83242-kube-api-access-p5p5d\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:35.401684 master-0 kubenswrapper[4780]: I1205 12:31:35.401403 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:31:35.411222 master-0 kubenswrapper[4780]: I1205 12:31:35.411138 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:35.416052 master-0 kubenswrapper[4780]: I1205 12:31:35.414996 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26x2z\" (UniqueName: \"kubernetes.io/projected/49760d62-02e5-4882-b47f-663102b04946-kube-api-access-26x2z\") pod \"csi-snapshot-controller-operator-6bc8656fdc-zn7hv\" (UID: \"49760d62-02e5-4882-b47f-663102b04946\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv" Dec 05 12:31:35.417521 master-0 kubenswrapper[4780]: I1205 12:31:35.417485 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt"] Dec 05 12:31:35.429623 master-0 kubenswrapper[4780]: I1205 12:31:35.429244 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-bound-sa-token\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:35.429623 master-0 kubenswrapper[4780]: W1205 12:31:35.429300 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7889e61e_b7ae_4ab6_a7d3_a1c5c83243b9.slice/crio-df3031001bb8ce6924d98db7ed12f84815ddd5de33ab7d2a19bcefd503d510dd WatchSource:0}: Error finding container df3031001bb8ce6924d98db7ed12f84815ddd5de33ab7d2a19bcefd503d510dd: Status 404 returned error can't find the container with id df3031001bb8ce6924d98db7ed12f84815ddd5de33ab7d2a19bcefd503d510dd Dec 05 12:31:35.433764 master-0 kubenswrapper[4780]: I1205 12:31:35.433707 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69z2l\" (UniqueName: \"kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l\") pod \"network-check-target-qsggt\" (UID: \"f4a70855-80b5-4d6a-bed1-b42364940de0\") " pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:35.452883 master-0 kubenswrapper[4780]: I1205 12:31:35.452785 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69z2l\" (UniqueName: \"kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l\") pod \"network-check-target-qsggt\" (UID: \"f4a70855-80b5-4d6a-bed1-b42364940de0\") " pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:35.456708 master-0 kubenswrapper[4780]: I1205 12:31:35.456667 4780 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlnqb\" (UniqueName: \"kubernetes.io/projected/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-kube-api-access-mlnqb\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:35.486464 master-0 kubenswrapper[4780]: I1205 12:31:35.486417 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:35.523065 master-0 kubenswrapper[4780]: I1205 12:31:35.523009 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t"] Dec 05 12:31:35.558010 master-0 kubenswrapper[4780]: I1205 12:31:35.556564 4780 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:35.561779 master-0 kubenswrapper[4780]: W1205 12:31:35.558457 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod807d9093_aa67_4840_b5be_7f3abcc1beed.slice/crio-47731386c0cb9aab3894731b6143775966f36286ae6b54927bb926129b389c33 WatchSource:0}: Error finding container 47731386c0cb9aab3894731b6143775966f36286ae6b54927bb926129b389c33: Status 404 returned error can't find the container with id 47731386c0cb9aab3894731b6143775966f36286ae6b54927bb926129b389c33 Dec 05 12:31:35.569489 master-0 kubenswrapper[4780]: I1205 12:31:35.569322 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz"] Dec 05 12:31:35.637412 master-0 kubenswrapper[4780]: I1205 12:31:35.634513 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf"] Dec 05 12:31:35.642248 master-0 kubenswrapper[4780]: I1205 12:31:35.637609 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:35.642248 master-0 kubenswrapper[4780]: I1205 12:31:35.637667 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:35.642248 master-0 kubenswrapper[4780]: I1205 12:31:35.637689 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:35.642248 master-0 kubenswrapper[4780]: I1205 12:31:35.637709 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:35.642248 master-0 kubenswrapper[4780]: I1205 12:31:35.637736 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:35.642248 master-0 kubenswrapper[4780]: E1205 12:31:35.637859 4780 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:35.642248 master-0 kubenswrapper[4780]: E1205 12:31:35.637904 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:36.63789074 +0000 UTC m=+138.660631466 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:35.642248 master-0 kubenswrapper[4780]: E1205 12:31:35.639091 4780 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 05 12:31:35.642248 master-0 kubenswrapper[4780]: E1205 12:31:35.639127 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert podName:0dda6d9b-cb3a-413a-85af-ef08f15ea42e nodeName:}" failed. No retries permitted until 2025-12-05 12:31:36.639117053 +0000 UTC m=+138.661857779 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-9vfxw" (UID: "0dda6d9b-cb3a-413a-85af-ef08f15ea42e") : secret "package-server-manager-serving-cert" not found Dec 05 12:31:35.642248 master-0 kubenswrapper[4780]: E1205 12:31:35.639201 4780 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 05 12:31:35.642248 master-0 kubenswrapper[4780]: E1205 12:31:35.639230 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:36.639220836 +0000 UTC m=+138.661961562 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "node-tuning-operator-tls" not found Dec 05 12:31:35.642248 master-0 kubenswrapper[4780]: E1205 12:31:35.639289 4780 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:35.642248 master-0 kubenswrapper[4780]: E1205 12:31:35.639314 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls podName:a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:36.639304968 +0000 UTC m=+138.662045694 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls") pod "ingress-operator-8649c48786-7xrk6" (UID: "a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7") : secret "metrics-tls" not found Dec 05 12:31:35.642248 master-0 kubenswrapper[4780]: E1205 12:31:35.639388 4780 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 05 12:31:35.642248 master-0 kubenswrapper[4780]: E1205 12:31:35.639415 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs podName:cfc37275-4e59-4f73-8b08-c8ca8ec28bbb nodeName:}" failed. No retries permitted until 2025-12-05 12:31:36.63940734 +0000 UTC m=+138.662148066 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs") pod "multus-admission-controller-7dfc5b745f-xlrzq" (UID: "cfc37275-4e59-4f73-8b08-c8ca8ec28bbb") : secret "multus-admission-controller-secret" not found Dec 05 12:31:35.644302 master-0 kubenswrapper[4780]: W1205 12:31:35.644170 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce3d73c1_f4bd_4c91_936a_086dfa5e3460.slice/crio-3d66257a9a5cc16c308a04623948fb3eceefd2f34694e08267e4f17ec43d3782 WatchSource:0}: Error finding container 3d66257a9a5cc16c308a04623948fb3eceefd2f34694e08267e4f17ec43d3782: Status 404 returned error can't find the container with id 3d66257a9a5cc16c308a04623948fb3eceefd2f34694e08267e4f17ec43d3782 Dec 05 12:31:35.681700 master-0 kubenswrapper[4780]: I1205 12:31:35.678114 4780 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv" Dec 05 12:31:35.686996 master-0 kubenswrapper[4780]: I1205 12:31:35.686581 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv"] Dec 05 12:31:35.690866 master-0 kubenswrapper[4780]: I1205 12:31:35.689978 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp"] Dec 05 12:31:35.699029 master-0 kubenswrapper[4780]: W1205 12:31:35.698260 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1871a9d6_6369_4d08_816f_9c6310b61ddf.slice/crio-2a325da0f7b2c285fc4bf3a467e693950dfc8948d49a5740a004f6101e748cc4 WatchSource:0}: Error finding container 2a325da0f7b2c285fc4bf3a467e693950dfc8948d49a5740a004f6101e748cc4: Status 404 returned error can't find the container with id 2a325da0f7b2c285fc4bf3a467e693950dfc8948d49a5740a004f6101e748cc4 Dec 05 12:31:35.704284 master-0 kubenswrapper[4780]: I1205 12:31:35.704220 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5"] Dec 05 12:31:35.706235 master-0 kubenswrapper[4780]: I1205 12:31:35.706150 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp"] Dec 05 12:31:35.709904 master-0 kubenswrapper[4780]: I1205 12:31:35.709833 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs"] Dec 05 12:31:35.718955 master-0 kubenswrapper[4780]: I1205 12:31:35.718904 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24"] Dec 05 12:31:35.736691 master-0 kubenswrapper[4780]: W1205 12:31:35.736426 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf119ffe4_16bd_49eb_916d_b18ba0d79b54.slice/crio-b36190e4cf6d5a6244899784eca2665872c2f9d60ae3d454ea48fd9aa2aa3bab WatchSource:0}: Error finding container b36190e4cf6d5a6244899784eca2665872c2f9d60ae3d454ea48fd9aa2aa3bab: Status 404 returned error can't find the container with id b36190e4cf6d5a6244899784eca2665872c2f9d60ae3d454ea48fd9aa2aa3bab Dec 05 12:31:35.738504 master-0 kubenswrapper[4780]: I1205 12:31:35.738468 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:35.738563 master-0 kubenswrapper[4780]: I1205 12:31:35.738524 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:35.738692 master-0 kubenswrapper[4780]: I1205 12:31:35.738668 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:35.738737 master-0 kubenswrapper[4780]: E1205 12:31:35.738681 4780 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:35.738772 master-0 kubenswrapper[4780]: E1205 12:31:35.738759 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls podName:58187662-b502-4d90-95ce-2aa91a81d256 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:36.73873748 +0000 UTC m=+138.761478206 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-lgc7z" (UID: "58187662-b502-4d90-95ce-2aa91a81d256") : secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:35.738881 master-0 kubenswrapper[4780]: E1205 12:31:35.738841 4780 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:35.739099 master-0 kubenswrapper[4780]: I1205 12:31:35.738861 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:35.739099 master-0 kubenswrapper[4780]: E1205 12:31:35.738934 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls podName:5f0c6889-0739-48a3-99cd-6db9d1f83242 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:36.738908634 +0000 UTC m=+138.761649530 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls") pod "dns-operator-7c56cf9b74-z9g7c" (UID: "5f0c6889-0739-48a3-99cd-6db9d1f83242") : secret "metrics-tls" not found Dec 05 12:31:35.739099 master-0 kubenswrapper[4780]: E1205 12:31:35.738959 4780 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 05 12:31:35.739099 master-0 kubenswrapper[4780]: E1205 12:31:35.738962 4780 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 05 12:31:35.739099 master-0 kubenswrapper[4780]: E1205 12:31:35.739061 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls podName:38941513-e968-45f1-9cb2-b63d40338f36 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:36.739034017 +0000 UTC m=+138.761774893 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-sxxpq" (UID: "38941513-e968-45f1-9cb2-b63d40338f36") : secret "image-registry-operator-tls" not found Dec 05 12:31:35.739099 master-0 kubenswrapper[4780]: E1205 12:31:35.739083 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics podName:1e6babfe-724a-4eab-bb3b-bc318bf57b70 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:36.739074058 +0000 UTC m=+138.761814784 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-vwhxt" (UID: "1e6babfe-724a-4eab-bb3b-bc318bf57b70") : secret "marketplace-operator-metrics" not found Dec 05 12:31:35.774211 master-0 kubenswrapper[4780]: I1205 12:31:35.774110 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qsggt"] Dec 05 12:31:35.784766 master-0 kubenswrapper[4780]: W1205 12:31:35.784678 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4a70855_80b5_4d6a_bed1_b42364940de0.slice/crio-c9238078b14a694c40b63db5c3f18b28faafcb8ecbd14ef862a7acac34f2ffa6 WatchSource:0}: Error finding container c9238078b14a694c40b63db5c3f18b28faafcb8ecbd14ef862a7acac34f2ffa6: Status 404 returned error can't find the container with id c9238078b14a694c40b63db5c3f18b28faafcb8ecbd14ef862a7acac34f2ffa6 Dec 05 12:31:35.882103 master-0 kubenswrapper[4780]: I1205 12:31:35.882057 4780 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv"] Dec 05 12:31:35.889988 master-0 kubenswrapper[4780]: W1205 12:31:35.889916 4780 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49760d62_02e5_4882_b47f_663102b04946.slice/crio-6d7e84b5ce96cc743bb3392588c9efdf14f4afe467d9a7be36705ddbb090197e WatchSource:0}: Error finding container 6d7e84b5ce96cc743bb3392588c9efdf14f4afe467d9a7be36705ddbb090197e: Status 404 returned error can't find the container with id 6d7e84b5ce96cc743bb3392588c9efdf14f4afe467d9a7be36705ddbb090197e Dec 05 12:31:36.158417 master-0 kubenswrapper[4780]: I1205 12:31:36.158337 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" event={"ID":"1871a9d6-6369-4d08-816f-9c6310b61ddf","Type":"ContainerStarted","Data":"2a325da0f7b2c285fc4bf3a467e693950dfc8948d49a5740a004f6101e748cc4"} Dec 05 12:31:36.159507 master-0 kubenswrapper[4780]: I1205 12:31:36.159466 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" event={"ID":"f119ffe4-16bd-49eb-916d-b18ba0d79b54","Type":"ContainerStarted","Data":"b36190e4cf6d5a6244899784eca2665872c2f9d60ae3d454ea48fd9aa2aa3bab"} Dec 05 12:31:36.160738 master-0 kubenswrapper[4780]: I1205 12:31:36.160684 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" 
event={"ID":"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9","Type":"ContainerStarted","Data":"df3031001bb8ce6924d98db7ed12f84815ddd5de33ab7d2a19bcefd503d510dd"} Dec 05 12:31:36.162340 master-0 kubenswrapper[4780]: I1205 12:31:36.162296 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" event={"ID":"ce3d73c1-f4bd-4c91-936a-086dfa5e3460","Type":"ContainerStarted","Data":"3d66257a9a5cc16c308a04623948fb3eceefd2f34694e08267e4f17ec43d3782"} Dec 05 12:31:36.164615 master-0 kubenswrapper[4780]: I1205 12:31:36.163923 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qsggt" event={"ID":"f4a70855-80b5-4d6a-bed1-b42364940de0","Type":"ContainerStarted","Data":"91f93abed058375a2f9d971d7119339c27c4857eb8ea956d8ecc7aeb14fabe54"} Dec 05 12:31:36.164615 master-0 kubenswrapper[4780]: I1205 12:31:36.163953 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qsggt" event={"ID":"f4a70855-80b5-4d6a-bed1-b42364940de0","Type":"ContainerStarted","Data":"c9238078b14a694c40b63db5c3f18b28faafcb8ecbd14ef862a7acac34f2ffa6"} Dec 05 12:31:36.164615 master-0 kubenswrapper[4780]: I1205 12:31:36.164091 4780 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:36.165040 master-0 kubenswrapper[4780]: I1205 12:31:36.164995 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" event={"ID":"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e","Type":"ContainerStarted","Data":"9fd6db41eb8dc90e6efffc25bb3c93739722e6824dad0dcb9a786720bc6514c4"} Dec 05 12:31:36.166168 master-0 kubenswrapper[4780]: I1205 12:31:36.166114 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" event={"ID":"594aaded-5615-4bed-87ee-6173059a73be","Type":"ContainerStarted","Data":"44e741be030df14b7e9e415d32f4095c562d693609b8dc4bd8ec51c21503bbca"} Dec 05 12:31:36.167618 master-0 kubenswrapper[4780]: I1205 12:31:36.167555 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nwplt" event={"ID":"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6","Type":"ContainerStarted","Data":"3a9d8373a41ae93e2045d1c0300d43339b0c915de4cad9048741918269853b51"} Dec 05 12:31:36.168665 master-0 kubenswrapper[4780]: I1205 12:31:36.168625 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" event={"ID":"d53a4886-db25-43a1-825a-66a9a9a58590","Type":"ContainerStarted","Data":"bba3aa271baddd92ed5881d6af79fb82b3a45fce07083a5cd051cbeeb1a01428"} Dec 05 12:31:36.170684 master-0 kubenswrapper[4780]: I1205 12:31:36.170459 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" event={"ID":"807d9093-aa67-4840-b5be-7f3abcc1beed","Type":"ContainerStarted","Data":"f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0"} Dec 05 12:31:36.170684 master-0 kubenswrapper[4780]: I1205 12:31:36.170491 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" 
event={"ID":"807d9093-aa67-4840-b5be-7f3abcc1beed","Type":"ContainerStarted","Data":"47731386c0cb9aab3894731b6143775966f36286ae6b54927bb926129b389c33"} Dec 05 12:31:36.171770 master-0 kubenswrapper[4780]: I1205 12:31:36.171290 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv" event={"ID":"49760d62-02e5-4882-b47f-663102b04946","Type":"ContainerStarted","Data":"6d7e84b5ce96cc743bb3392588c9efdf14f4afe467d9a7be36705ddbb090197e"} Dec 05 12:31:36.172849 master-0 kubenswrapper[4780]: I1205 12:31:36.172766 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" event={"ID":"ba095394-1873-4793-969d-3be979fa0771","Type":"ContainerStarted","Data":"ecdffd0c2fc8d747077d4ca5dcb541da82682f6d035455ac42566e8514bfadc3"} Dec 05 12:31:36.174094 master-0 kubenswrapper[4780]: I1205 12:31:36.174036 4780 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" event={"ID":"f3792522-fec6-4022-90ac-0b8467fcd625","Type":"ContainerStarted","Data":"49f2f301b501743d7a4254bc3eeb040151fb199e2a4d9ec64ddce3a74ce66f5b"} Dec 05 12:31:36.204256 master-0 kubenswrapper[4780]: I1205 12:31:36.202564 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" podStartSLOduration=102.202536224 podStartE2EDuration="1m42.202536224s" podCreationTimestamp="2025-12-05 12:29:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:31:36.201918628 +0000 UTC m=+138.224659374" watchObservedRunningTime="2025-12-05 12:31:36.202536224 +0000 UTC m=+138.225276950" Dec 05 12:31:36.204256 master-0 kubenswrapper[4780]: I1205 12:31:36.202692 4780 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qsggt" podStartSLOduration=66.202685538 podStartE2EDuration="1m6.202685538s" podCreationTimestamp="2025-12-05 12:30:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:31:36.181677366 +0000 UTC m=+138.204418112" watchObservedRunningTime="2025-12-05 12:31:36.202685538 +0000 UTC m=+138.225426294" Dec 05 12:31:36.650405 master-0 kubenswrapper[4780]: I1205 12:31:36.650331 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:36.650405 master-0 kubenswrapper[4780]: I1205 12:31:36.650393 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:36.650405 master-0 kubenswrapper[4780]: I1205 12:31:36.650421 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:36.651560 master-0 kubenswrapper[4780]: I1205 12:31:36.650458 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:36.651560 master-0 kubenswrapper[4780]: I1205 12:31:36.650515 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:36.651560 master-0 kubenswrapper[4780]: E1205 12:31:36.650678 4780 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 05 12:31:36.651560 master-0 kubenswrapper[4780]: E1205 12:31:36.650741 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert podName:0dda6d9b-cb3a-413a-85af-ef08f15ea42e nodeName:}" failed. No retries permitted until 2025-12-05 12:31:38.650721478 +0000 UTC m=+140.673462204 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-9vfxw" (UID: "0dda6d9b-cb3a-413a-85af-ef08f15ea42e") : secret "package-server-manager-serving-cert" not found Dec 05 12:31:36.651560 master-0 kubenswrapper[4780]: E1205 12:31:36.650810 4780 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 05 12:31:36.651560 master-0 kubenswrapper[4780]: E1205 12:31:36.650872 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:38.650862393 +0000 UTC m=+140.673603119 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "node-tuning-operator-tls" not found Dec 05 12:31:36.651560 master-0 kubenswrapper[4780]: E1205 12:31:36.650932 4780 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:36.651560 master-0 kubenswrapper[4780]: E1205 12:31:36.650962 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls podName:a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:38.650952395 +0000 UTC m=+140.673693121 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls") pod "ingress-operator-8649c48786-7xrk6" (UID: "a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7") : secret "metrics-tls" not found Dec 05 12:31:36.651560 master-0 kubenswrapper[4780]: E1205 12:31:36.651007 4780 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 05 12:31:36.651560 master-0 kubenswrapper[4780]: E1205 12:31:36.651027 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs podName:cfc37275-4e59-4f73-8b08-c8ca8ec28bbb nodeName:}" failed. No retries permitted until 2025-12-05 12:31:38.651021107 +0000 UTC m=+140.673761833 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs") pod "multus-admission-controller-7dfc5b745f-xlrzq" (UID: "cfc37275-4e59-4f73-8b08-c8ca8ec28bbb") : secret "multus-admission-controller-secret" not found Dec 05 12:31:36.651560 master-0 kubenswrapper[4780]: E1205 12:31:36.651062 4780 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:36.651560 master-0 kubenswrapper[4780]: E1205 12:31:36.651080 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:38.651073928 +0000 UTC m=+140.673814654 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:36.751882 master-0 kubenswrapper[4780]: I1205 12:31:36.751811 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:36.752446 master-0 kubenswrapper[4780]: I1205 12:31:36.751921 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:36.752446 master-0 kubenswrapper[4780]: I1205 12:31:36.751953 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:36.752446 master-0 kubenswrapper[4780]: E1205 12:31:36.752136 4780 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 05 12:31:36.752446 master-0 kubenswrapper[4780]: E1205 12:31:36.752251 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics podName:1e6babfe-724a-4eab-bb3b-bc318bf57b70 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:38.752227025 +0000 UTC m=+140.774967751 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-vwhxt" (UID: "1e6babfe-724a-4eab-bb3b-bc318bf57b70") : secret "marketplace-operator-metrics" not found Dec 05 12:31:36.752625 master-0 kubenswrapper[4780]: I1205 12:31:36.752574 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:36.752680 master-0 kubenswrapper[4780]: E1205 12:31:36.752657 4780 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:36.753466 master-0 kubenswrapper[4780]: E1205 12:31:36.752716 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls podName:5f0c6889-0739-48a3-99cd-6db9d1f83242 nodeName:}" failed. 
No retries permitted until 2025-12-05 12:31:38.752698617 +0000 UTC m=+140.775439343 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls") pod "dns-operator-7c56cf9b74-z9g7c" (UID: "5f0c6889-0739-48a3-99cd-6db9d1f83242") : secret "metrics-tls" not found Dec 05 12:31:36.753466 master-0 kubenswrapper[4780]: E1205 12:31:36.752877 4780 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:36.753466 master-0 kubenswrapper[4780]: E1205 12:31:36.752961 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls podName:58187662-b502-4d90-95ce-2aa91a81d256 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:38.752935354 +0000 UTC m=+140.775676250 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-lgc7z" (UID: "58187662-b502-4d90-95ce-2aa91a81d256") : secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:36.753466 master-0 kubenswrapper[4780]: E1205 12:31:36.753165 4780 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 05 12:31:36.753466 master-0 kubenswrapper[4780]: E1205 12:31:36.753342 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls podName:38941513-e968-45f1-9cb2-b63d40338f36 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:38.753306374 +0000 UTC m=+140.776047250 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-sxxpq" (UID: "38941513-e968-45f1-9cb2-b63d40338f36") : secret "image-registry-operator-tls" not found Dec 05 12:31:38.698122 master-0 kubenswrapper[4780]: I1205 12:31:38.698048 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:38.698629 master-0 kubenswrapper[4780]: I1205 12:31:38.698141 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:38.698629 master-0 kubenswrapper[4780]: E1205 12:31:38.698290 4780 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 05 12:31:38.698629 master-0 kubenswrapper[4780]: I1205 12:31:38.698398 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:38.698629 master-0 kubenswrapper[4780]: I1205 12:31:38.698496 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:38.698629 master-0 kubenswrapper[4780]: E1205 12:31:38.698499 4780 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 05 12:31:38.698629 master-0 kubenswrapper[4780]: E1205 12:31:38.698531 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert podName:0dda6d9b-cb3a-413a-85af-ef08f15ea42e nodeName:}" failed. No retries permitted until 2025-12-05 12:31:42.698510496 +0000 UTC m=+144.721251222 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-9vfxw" (UID: "0dda6d9b-cb3a-413a-85af-ef08f15ea42e") : secret "package-server-manager-serving-cert" not found Dec 05 12:31:38.698629 master-0 kubenswrapper[4780]: E1205 12:31:38.698583 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:42.698562218 +0000 UTC m=+144.721302944 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "node-tuning-operator-tls" not found Dec 05 12:31:38.698838 master-0 kubenswrapper[4780]: E1205 12:31:38.698718 4780 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 05 12:31:38.699018 master-0 kubenswrapper[4780]: I1205 12:31:38.698989 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:38.699079 master-0 kubenswrapper[4780]: E1205 12:31:38.699063 4780 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:38.699150 master-0 kubenswrapper[4780]: E1205 12:31:38.699134 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:42.699111982 +0000 UTC m=+144.721852708 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:38.699242 master-0 kubenswrapper[4780]: E1205 12:31:38.699159 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs podName:cfc37275-4e59-4f73-8b08-c8ca8ec28bbb nodeName:}" failed. No retries permitted until 2025-12-05 12:31:42.699148513 +0000 UTC m=+144.721889239 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs") pod "multus-admission-controller-7dfc5b745f-xlrzq" (UID: "cfc37275-4e59-4f73-8b08-c8ca8ec28bbb") : secret "multus-admission-controller-secret" not found Dec 05 12:31:38.699467 master-0 kubenswrapper[4780]: E1205 12:31:38.699394 4780 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:38.699569 master-0 kubenswrapper[4780]: E1205 12:31:38.699523 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls podName:a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:42.699437141 +0000 UTC m=+144.722178087 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls") pod "ingress-operator-8649c48786-7xrk6" (UID: "a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7") : secret "metrics-tls" not found Dec 05 12:31:38.800441 master-0 kubenswrapper[4780]: I1205 12:31:38.800363 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:38.800670 master-0 kubenswrapper[4780]: I1205 12:31:38.800471 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:38.800670 master-0 kubenswrapper[4780]: E1205 12:31:38.800624 4780 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 05 12:31:38.800670 master-0 kubenswrapper[4780]: E1205 12:31:38.800630 4780 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 05 12:31:38.800814 master-0 kubenswrapper[4780]: E1205 12:31:38.800698 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics podName:1e6babfe-724a-4eab-bb3b-bc318bf57b70 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:42.80067605 +0000 UTC m=+144.823416776 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-vwhxt" (UID: "1e6babfe-724a-4eab-bb3b-bc318bf57b70") : secret "marketplace-operator-metrics" not found Dec 05 12:31:38.800814 master-0 kubenswrapper[4780]: E1205 12:31:38.800737 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls podName:38941513-e968-45f1-9cb2-b63d40338f36 nodeName:}" failed. 
No retries permitted until 2025-12-05 12:31:42.800711701 +0000 UTC m=+144.823452617 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-sxxpq" (UID: "38941513-e968-45f1-9cb2-b63d40338f36") : secret "image-registry-operator-tls" not found Dec 05 12:31:38.800902 master-0 kubenswrapper[4780]: I1205 12:31:38.800856 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:38.800944 master-0 kubenswrapper[4780]: I1205 12:31:38.800930 4780 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:38.801049 master-0 kubenswrapper[4780]: E1205 12:31:38.801008 4780 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:38.801093 master-0 kubenswrapper[4780]: E1205 12:31:38.801063 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls podName:58187662-b502-4d90-95ce-2aa91a81d256 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:42.80105088 +0000 UTC m=+144.823791796 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-lgc7z" (UID: "58187662-b502-4d90-95ce-2aa91a81d256") : secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:38.801134 master-0 kubenswrapper[4780]: E1205 12:31:38.801087 4780 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:38.801134 master-0 kubenswrapper[4780]: E1205 12:31:38.801129 4780 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls podName:5f0c6889-0739-48a3-99cd-6db9d1f83242 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:42.801117092 +0000 UTC m=+144.823857828 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls") pod "dns-operator-7c56cf9b74-z9g7c" (UID: "5f0c6889-0739-48a3-99cd-6db9d1f83242") : secret "metrics-tls" not found Dec 05 12:31:41.587779 master-0 systemd[1]: Stopping Kubernetes Kubelet... Dec 05 12:31:41.612387 master-0 systemd[1]: kubelet.service: Deactivated successfully. Dec 05 12:31:41.612752 master-0 systemd[1]: Stopped Kubernetes Kubelet. Dec 05 12:31:41.615170 master-0 systemd[1]: kubelet.service: Consumed 11.862s CPU time. Dec 05 12:31:41.641956 master-0 systemd[1]: Starting Kubernetes Kubelet... 
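The volume failures above follow a visible pattern: each MountVolume.SetUp retry for the missing secrets is re-queued with a durationBeforeRetry that doubles per attempt (1s, then 2s, then 4s for the same volumes). The "secret ... not found" errors are usually transient during bring-up and clear once the owning operators create their serving certificates; the backoff only spaces out the retries. A minimal Go sketch of that doubling delay follows; the cap value is an arbitrary choice for illustration and is not taken from the kubelet source.

// backoffsketch.go - illustrates the doubling retry delay visible in the
// nestedpendingoperations messages above (durationBeforeRetry 1s, 2s, 4s).
// This is a sketch of the pattern only, not the kubelet's implementation.
package main

import (
	"fmt"
	"time"
)

// nextDelay doubles the previous delay up to maxDelay, mirroring the
// 1s -> 2s -> 4s progression in the log. The cap is assumed, not sourced.
func nextDelay(prev, maxDelay time.Duration) time.Duration {
	if prev <= 0 {
		return time.Second
	}
	next := prev * 2
	if next > maxDelay {
		return maxDelay
	}
	return next
}

func main() {
	var delay time.Duration
	for attempt := 1; attempt <= 5; attempt++ {
		delay = nextDelay(delay, 2*time.Minute)
		fmt.Printf("attempt %d: no retries permitted for %s\n", attempt, delay)
	}
}

Running the sketch prints 1s, 2s, 4s, 8s, 16s; the first three steps match the progression recorded for the apiservice-cert, node-tuning-operator-tls, metrics-tls, webhook-certs and related volumes above.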
Dec 05 12:31:41.751134 master-0 kubenswrapper[8731]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 12:31:41.751134 master-0 kubenswrapper[8731]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 05 12:31:41.751134 master-0 kubenswrapper[8731]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 12:31:41.751134 master-0 kubenswrapper[8731]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 12:31:41.751134 master-0 kubenswrapper[8731]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 05 12:31:41.751134 master-0 kubenswrapper[8731]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 12:31:41.752896 master-0 kubenswrapper[8731]: I1205 12:31:41.751269 8731 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754575 8731 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754596 8731 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754603 8731 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754609 8731 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754615 8731 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754622 8731 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754628 8731 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754633 8731 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754639 8731 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754645 8731 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754651 8731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754656 
8731 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754662 8731 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754675 8731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754681 8731 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754686 8731 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754692 8731 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754698 8731 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754704 8731 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 12:31:41.754681 master-0 kubenswrapper[8731]: W1205 12:31:41.754710 8731 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754716 8731 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754723 8731 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754728 8731 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754735 8731 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754741 8731 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754746 8731 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754751 8731 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754757 8731 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754763 8731 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754768 8731 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754773 8731 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754780 8731 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
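The long run of feature_gate.go:330 warnings after the restart comes from gate names that are set cluster-wide but are not registered in the kubelet's own gate table, so it logs each one at warning level and keeps starting; gates it does recognize are reported separately with "Setting GA feature gate ..." or "Setting deprecated feature gate ..." messages. A short sketch that tallies the unrecognized gates from a kubelet journal dump (for example, journalctl -u kubelet piped into it), assuming only the message text visible in this log:

// gatecount.go - tallies "unrecognized feature gate" warnings from a
// kubelet journal dump read on stdin. The parsing assumes only the
// message text shown in the log above.
package main

import (
	"bufio"
	"fmt"
	"os"
	"sort"
	"strings"
)

func main() {
	const marker = "unrecognized feature gate: "
	counts := map[string]int{}

	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		line := sc.Text()
		if i := strings.Index(line, marker); i >= 0 {
			if fields := strings.Fields(line[i+len(marker):]); len(fields) > 0 {
				counts[fields[0]]++
			}
		}
	}

	names := make([]string, 0, len(counts))
	for name := range counts {
		names = append(names, name)
	}
	sort.Strings(names)
	for _, name := range names {
		fmt.Printf("%-50s %d\n", name, counts[name])
	}
}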
Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754788 8731 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754794 8731 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754801 8731 feature_gate.go:330] unrecognized feature gate: Example Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754806 8731 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754812 8731 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754817 8731 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754822 8731 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 12:31:41.755850 master-0 kubenswrapper[8731]: W1205 12:31:41.754828 8731 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754833 8731 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754838 8731 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754844 8731 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754850 8731 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754855 8731 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754860 8731 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754866 8731 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754872 8731 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754881 8731 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754888 8731 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754894 8731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754900 8731 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754906 8731 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754915 8731 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754921 8731 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754926 8731 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754931 8731 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754937 8731 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 12:31:41.757066 master-0 kubenswrapper[8731]: W1205 12:31:41.754943 8731 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: W1205 12:31:41.754948 8731 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: W1205 12:31:41.754955 8731 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: W1205 12:31:41.754961 8731 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: W1205 12:31:41.754967 8731 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: W1205 12:31:41.754972 8731 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: W1205 12:31:41.754978 8731 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: W1205 12:31:41.754985 8731 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: W1205 12:31:41.754991 8731 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: W1205 12:31:41.754996 8731 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: W1205 12:31:41.755002 8731 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: W1205 12:31:41.755009 8731 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
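Once the gate warnings finish, the restarted kubelet echoes each of its flags with the value it parsed (defaults included) as flags.go:64 "FLAG: --name=value" lines, which makes it easy to recover the effective invocation from the journal. A small sketch that collects those pairs into a map, assuming only the formatting shown below; the deprecation notices at startup already point out that several of these settings are meant to come from the file given by --config rather than from flags.

// flagdump.go - pulls the kubelet's FLAG echo (flags.go:64 lines below)
// out of a journal dump on stdin. The regular expression assumes only the
// FLAG: --name="value" formatting visible in this log.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches e.g.: flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
var flagRe = regexp.MustCompile(`FLAG: --([A-Za-z0-9-]+)="?(.*?)"?$`)

func main() {
	flags := map[string]string{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		if m := flagRe.FindStringSubmatch(sc.Text()); m != nil {
			flags[m[1]] = m[2]
		}
	}
	// Print a few of the flags echoed below; these are the parsed
	// command-line values, which the --config file may still supersede.
	for _, name := range []string{"config", "container-runtime-endpoint", "cgroup-driver"} {
		if v, ok := flags[name]; ok {
			fmt.Printf("--%s = %s\n", name, v)
		}
	}
}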
Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: W1205 12:31:41.755016 8731 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: W1205 12:31:41.755022 8731 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: I1205 12:31:41.755165 8731 flags.go:64] FLAG: --address="0.0.0.0" Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: I1205 12:31:41.755196 8731 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: I1205 12:31:41.755207 8731 flags.go:64] FLAG: --anonymous-auth="true" Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: I1205 12:31:41.755215 8731 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: I1205 12:31:41.755222 8731 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: I1205 12:31:41.755229 8731 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: I1205 12:31:41.755238 8731 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 05 12:31:41.758114 master-0 kubenswrapper[8731]: I1205 12:31:41.755246 8731 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755253 8731 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755259 8731 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755266 8731 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755272 8731 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755279 8731 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755295 8731 flags.go:64] FLAG: --cgroup-root="" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755302 8731 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755308 8731 flags.go:64] FLAG: --client-ca-file="" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755315 8731 flags.go:64] FLAG: --cloud-config="" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755322 8731 flags.go:64] FLAG: --cloud-provider="" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755328 8731 flags.go:64] FLAG: --cluster-dns="[]" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755337 8731 flags.go:64] FLAG: --cluster-domain="" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755343 8731 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755349 8731 flags.go:64] FLAG: --config-dir="" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755356 8731 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755363 8731 flags.go:64] FLAG: --container-log-max-files="5" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755371 8731 
flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755377 8731 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755383 8731 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755390 8731 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755397 8731 flags.go:64] FLAG: --contention-profiling="false" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755404 8731 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755409 8731 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755416 8731 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 05 12:31:41.759345 master-0 kubenswrapper[8731]: I1205 12:31:41.755422 8731 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755429 8731 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755435 8731 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755441 8731 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755447 8731 flags.go:64] FLAG: --enable-load-reader="false" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755454 8731 flags.go:64] FLAG: --enable-server="true" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755460 8731 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755468 8731 flags.go:64] FLAG: --event-burst="100" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755474 8731 flags.go:64] FLAG: --event-qps="50" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755480 8731 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755486 8731 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755493 8731 flags.go:64] FLAG: --eviction-hard="" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755500 8731 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755511 8731 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755518 8731 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755524 8731 flags.go:64] FLAG: --eviction-soft="" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755530 8731 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755537 8731 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755543 8731 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 
12:31:41.755549 8731 flags.go:64] FLAG: --experimental-mounter-path="" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755555 8731 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755561 8731 flags.go:64] FLAG: --fail-swap-on="true" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755567 8731 flags.go:64] FLAG: --feature-gates="" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755575 8731 flags.go:64] FLAG: --file-check-frequency="20s" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755581 8731 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 05 12:31:41.760740 master-0 kubenswrapper[8731]: I1205 12:31:41.755588 8731 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755594 8731 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755602 8731 flags.go:64] FLAG: --healthz-port="10248" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755608 8731 flags.go:64] FLAG: --help="false" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755614 8731 flags.go:64] FLAG: --hostname-override="" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755620 8731 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755627 8731 flags.go:64] FLAG: --http-check-frequency="20s" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755634 8731 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755640 8731 flags.go:64] FLAG: --image-credential-provider-config="" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755646 8731 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755652 8731 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755658 8731 flags.go:64] FLAG: --image-service-endpoint="" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755664 8731 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755671 8731 flags.go:64] FLAG: --kube-api-burst="100" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755678 8731 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755684 8731 flags.go:64] FLAG: --kube-api-qps="50" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755691 8731 flags.go:64] FLAG: --kube-reserved="" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755697 8731 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755704 8731 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755710 8731 flags.go:64] FLAG: --kubelet-cgroups="" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755719 8731 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755725 8731 flags.go:64] FLAG: --lock-file="" Dec 05 
12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755731 8731 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755737 8731 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755744 8731 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755754 8731 flags.go:64] FLAG: --log-json-split-stream="false" Dec 05 12:31:41.762293 master-0 kubenswrapper[8731]: I1205 12:31:41.755760 8731 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755766 8731 flags.go:64] FLAG: --log-text-split-stream="false" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755772 8731 flags.go:64] FLAG: --logging-format="text" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755778 8731 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755785 8731 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755791 8731 flags.go:64] FLAG: --manifest-url="" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755797 8731 flags.go:64] FLAG: --manifest-url-header="" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755810 8731 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755816 8731 flags.go:64] FLAG: --max-open-files="1000000" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755824 8731 flags.go:64] FLAG: --max-pods="110" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755830 8731 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755836 8731 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755843 8731 flags.go:64] FLAG: --memory-manager-policy="None" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755849 8731 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755856 8731 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755863 8731 flags.go:64] FLAG: --node-ip="192.168.32.10" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755869 8731 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755883 8731 flags.go:64] FLAG: --node-status-max-images="50" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755890 8731 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755897 8731 flags.go:64] FLAG: --oom-score-adj="-999" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755904 8731 flags.go:64] FLAG: --pod-cidr="" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755909 8731 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a70b2a95140d1e90978f36cc9889013ae34bd232662c5424002274385669ed9" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755923 8731 flags.go:64] FLAG: --pod-manifest-path="" Dec 05 12:31:41.763609 master-0 kubenswrapper[8731]: I1205 12:31:41.755929 8731 flags.go:64] FLAG: --pod-max-pids="-1" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.755935 8731 flags.go:64] FLAG: --pods-per-core="0" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.755941 8731 flags.go:64] FLAG: --port="10250" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.755947 8731 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.755956 8731 flags.go:64] FLAG: --provider-id="" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.755962 8731 flags.go:64] FLAG: --qos-reserved="" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.755969 8731 flags.go:64] FLAG: --read-only-port="10255" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.755975 8731 flags.go:64] FLAG: --register-node="true" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.755981 8731 flags.go:64] FLAG: --register-schedulable="true" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.755987 8731 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.755998 8731 flags.go:64] FLAG: --registry-burst="10" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.756004 8731 flags.go:64] FLAG: --registry-qps="5" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.756010 8731 flags.go:64] FLAG: --reserved-cpus="" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.756016 8731 flags.go:64] FLAG: --reserved-memory="" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.756023 8731 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.756031 8731 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.756038 8731 flags.go:64] FLAG: --rotate-certificates="false" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.756044 8731 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.756050 8731 flags.go:64] FLAG: --runonce="false" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.756056 8731 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.756064 8731 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.756070 8731 flags.go:64] FLAG: --seccomp-default="false" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.756076 8731 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.756082 8731 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.756089 8731 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.756095 8731 flags.go:64] FLAG: 
--storage-driver-host="localhost:8086" Dec 05 12:31:41.765130 master-0 kubenswrapper[8731]: I1205 12:31:41.756101 8731 flags.go:64] FLAG: --storage-driver-password="root" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756108 8731 flags.go:64] FLAG: --storage-driver-secure="false" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756114 8731 flags.go:64] FLAG: --storage-driver-table="stats" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756120 8731 flags.go:64] FLAG: --storage-driver-user="root" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756127 8731 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756134 8731 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756140 8731 flags.go:64] FLAG: --system-cgroups="" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756146 8731 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756156 8731 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756162 8731 flags.go:64] FLAG: --tls-cert-file="" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756170 8731 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756195 8731 flags.go:64] FLAG: --tls-min-version="" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756201 8731 flags.go:64] FLAG: --tls-private-key-file="" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756207 8731 flags.go:64] FLAG: --topology-manager-policy="none" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756213 8731 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756220 8731 flags.go:64] FLAG: --topology-manager-scope="container" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756225 8731 flags.go:64] FLAG: --v="2" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756234 8731 flags.go:64] FLAG: --version="false" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756241 8731 flags.go:64] FLAG: --vmodule="" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756249 8731 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: I1205 12:31:41.756256 8731 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: W1205 12:31:41.756423 8731 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
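Note: the `flags.go:64] FLAG: ...` dump above records the effective value of every kubelet command-line flag, mostly defaults plus the handful set explicitly on this node (for example --config="/etc/kubernetes/kubelet.conf", --node-ip, --register-with-taints, --system-reserved and --v=2). A quick way to review them is to extract the name/value pairs from the captured journal text; the `journalctl -u kubelet -o cat` invocation mentioned in the comment is an assumption, not something shown in this log:

```go
// Sketch: pull the effective flag values out of the "flags.go:64] FLAG: ..." dump.
// Feed journal text on stdin, e.g. (assumed) `journalctl -u kubelet -o cat`.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches entries of the form: FLAG: --name="value"
var flagRe = regexp.MustCompile(`FLAG: (--[\w-]+)="(.*?)"`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // captured journal lines can be very long
	for sc.Scan() {
		// A single captured line may hold several FLAG entries, so match all of them.
		for _, m := range flagRe.FindAllStringSubmatch(sc.Text(), -1) {
			fmt.Printf("%-45s %s\n", m[1], m[2])
		}
	}
}
```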
Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: W1205 12:31:41.756431 8731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: W1205 12:31:41.756438 8731 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 12:31:41.766805 master-0 kubenswrapper[8731]: W1205 12:31:41.756443 8731 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756449 8731 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756455 8731 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756460 8731 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756467 8731 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756476 8731 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756481 8731 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756487 8731 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756493 8731 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756500 8731 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756507 8731 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756513 8731 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756519 8731 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756525 8731 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756530 8731 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756536 8731 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756542 8731 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756547 8731 feature_gate.go:330] unrecognized feature gate: Example Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756555 8731 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756562 8731 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 12:31:41.768126 master-0 kubenswrapper[8731]: W1205 12:31:41.756569 8731 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756575 8731 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756581 8731 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756586 8731 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756592 8731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756597 8731 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756602 8731 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756607 8731 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756612 8731 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756618 8731 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756623 8731 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756628 8731 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756633 8731 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756639 8731 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756645 8731 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756650 8731 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756656 8731 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756664 8731 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756669 8731 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756675 8731 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 12:31:41.769663 master-0 kubenswrapper[8731]: W1205 12:31:41.756680 8731 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756685 8731 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756690 8731 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756696 8731 
feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756701 8731 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756706 8731 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756711 8731 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756716 8731 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756722 8731 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756727 8731 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756735 8731 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756740 8731 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756746 8731 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756752 8731 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756757 8731 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756763 8731 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756768 8731 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756774 8731 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756779 8731 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756785 8731 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 12:31:41.772563 master-0 kubenswrapper[8731]: W1205 12:31:41.756790 8731 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 12:31:41.773150 master-0 kubenswrapper[8731]: W1205 12:31:41.756796 8731 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 12:31:41.773150 master-0 kubenswrapper[8731]: W1205 12:31:41.756801 8731 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 12:31:41.773150 master-0 kubenswrapper[8731]: W1205 12:31:41.756806 8731 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 12:31:41.773150 master-0 kubenswrapper[8731]: W1205 12:31:41.756812 8731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 12:31:41.773150 master-0 kubenswrapper[8731]: W1205 12:31:41.756817 8731 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 12:31:41.773150 master-0 kubenswrapper[8731]: W1205 12:31:41.756824 8731 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Dec 05 12:31:41.773150 master-0 kubenswrapper[8731]: W1205 12:31:41.756830 8731 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 12:31:41.773150 master-0 kubenswrapper[8731]: W1205 12:31:41.756836 8731 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 12:31:41.773150 master-0 kubenswrapper[8731]: I1205 12:31:41.756858 8731 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 12:31:41.773150 master-0 kubenswrapper[8731]: I1205 12:31:41.772281 8731 server.go:491] "Kubelet version" kubeletVersion="v1.31.13" Dec 05 12:31:41.773150 master-0 kubenswrapper[8731]: I1205 12:31:41.772349 8731 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 05 12:31:41.773150 master-0 kubenswrapper[8731]: W1205 12:31:41.772501 8731 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 12:31:41.773150 master-0 kubenswrapper[8731]: W1205 12:31:41.772514 8731 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 12:31:41.773150 master-0 kubenswrapper[8731]: W1205 12:31:41.772526 8731 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772537 8731 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772547 8731 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772558 8731 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772567 8731 feature_gate.go:330] unrecognized feature gate: Example Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772575 8731 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772584 8731 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772593 8731 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772601 8731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772609 8731 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772617 8731 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772625 8731 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772633 8731 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 
12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772645 8731 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772654 8731 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772663 8731 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772670 8731 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772679 8731 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772687 8731 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 12:31:41.773516 master-0 kubenswrapper[8731]: W1205 12:31:41.772695 8731 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772703 8731 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772711 8731 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772718 8731 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772729 8731 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772741 8731 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772751 8731 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772759 8731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772768 8731 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772777 8731 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772790 8731 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772799 8731 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772808 8731 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772817 8731 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772825 8731 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772834 8731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772843 8731 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772851 8731 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772859 8731 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 12:31:41.773962 master-0 kubenswrapper[8731]: W1205 12:31:41.772868 8731 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.772876 8731 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.772884 8731 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.772892 8731 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.772900 8731 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.772907 8731 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.772915 8731 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.772923 8731 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.772930 8731 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.772939 8731 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.772947 8731 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.772955 8731 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.772962 8731 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.772970 8731 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.772978 8731 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.772986 8731 feature_gate.go:330] unrecognized feature gate: 
SigstoreImageVerification Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.772993 8731 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.773001 8731 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.773009 8731 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.773017 8731 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 12:31:41.774470 master-0 kubenswrapper[8731]: W1205 12:31:41.773025 8731 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 05 12:31:41.774942 master-0 kubenswrapper[8731]: W1205 12:31:41.773033 8731 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 12:31:41.774942 master-0 kubenswrapper[8731]: W1205 12:31:41.773042 8731 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 12:31:41.774942 master-0 kubenswrapper[8731]: W1205 12:31:41.773052 8731 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 12:31:41.774942 master-0 kubenswrapper[8731]: W1205 12:31:41.773060 8731 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 12:31:41.774942 master-0 kubenswrapper[8731]: W1205 12:31:41.773069 8731 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 12:31:41.774942 master-0 kubenswrapper[8731]: W1205 12:31:41.773077 8731 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 12:31:41.774942 master-0 kubenswrapper[8731]: W1205 12:31:41.773086 8731 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 12:31:41.774942 master-0 kubenswrapper[8731]: W1205 12:31:41.773094 8731 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 12:31:41.774942 master-0 kubenswrapper[8731]: W1205 12:31:41.773103 8731 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 12:31:41.774942 master-0 kubenswrapper[8731]: W1205 12:31:41.773111 8731 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 12:31:41.774942 master-0 kubenswrapper[8731]: W1205 12:31:41.773119 8731 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 12:31:41.774942 master-0 kubenswrapper[8731]: I1205 12:31:41.773133 8731 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 12:31:41.774942 master-0 kubenswrapper[8731]: W1205 12:31:41.773561 8731 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 12:31:41.774942 master-0 kubenswrapper[8731]: W1205 12:31:41.773582 8731 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 12:31:41.774942 master-0 kubenswrapper[8731]: W1205 12:31:41.773595 8731 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 12:31:41.775394 
master-0 kubenswrapper[8731]: W1205 12:31:41.773606 8731 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773619 8731 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773630 8731 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773641 8731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773651 8731 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773661 8731 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773672 8731 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773682 8731 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773692 8731 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773701 8731 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773711 8731 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773737 8731 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773745 8731 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773753 8731 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773764 8731 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773773 8731 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773782 8731 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773792 8731 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
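Note: the `feature_gate.go:386] feature gates: {map[...]}` summaries in this section look like Go's fmt package rendering a struct that wraps a map[string]bool; fmt prints map keys in sorted order, which is why each dump lists the same gates alphabetically. The wrapper type below is purely illustrative, not the kubelet's actual featuregate type:

```go
// Sketch: reproduce the shape of the "feature gates: {map[...]}" summary line.
// fmt prints map keys in sorted order, so the listing is always alphabetical.
package main

import "fmt"

type featureGates struct {
	enabled map[string]bool // assumed wrapper for illustration only
}

func main() {
	fg := featureGates{enabled: map[string]bool{
		"ValidatingAdmissionPolicy":              true,
		"CloudDualStackNodeIPs":                  true,
		"KMSv1":                                  true,
		"NodeSwap":                               false,
		"DisableKubeletCloudCredentialProviders": true,
	}}
	// Prints: feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true KMSv1:true NodeSwap:false ValidatingAdmissionPolicy:true]}
	fmt.Printf("feature gates: %v\n", fg)
}
```

As the journal shows, only Kubernetes-level gates appear in that effective map; the unrecognized OpenShift gate names warned about above do not show up in it.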
Dec 05 12:31:41.775394 master-0 kubenswrapper[8731]: W1205 12:31:41.773803 8731 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773811 8731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773822 8731 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773830 8731 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773839 8731 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773846 8731 feature_gate.go:330] unrecognized feature gate: Example Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773854 8731 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773862 8731 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773879 8731 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773887 8731 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773895 8731 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773903 8731 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773911 8731 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773919 8731 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773927 8731 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773934 8731 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773943 8731 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773950 8731 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773962 8731 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773971 8731 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 12:31:41.775822 master-0 kubenswrapper[8731]: W1205 12:31:41.773980 8731 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.773988 8731 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.773998 8731 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774006 8731 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774014 8731 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774022 8731 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774029 8731 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774037 8731 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774045 8731 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774053 8731 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774061 8731 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774069 8731 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774077 8731 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774084 8731 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774092 8731 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774100 8731 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774108 8731 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774116 8731 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774125 8731 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774133 8731 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 12:31:41.776298 master-0 kubenswrapper[8731]: W1205 12:31:41.774143 8731 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 12:31:41.776854 master-0 kubenswrapper[8731]: W1205 12:31:41.774152 8731 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 05 12:31:41.776854 master-0 kubenswrapper[8731]: W1205 12:31:41.774160 8731 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 12:31:41.776854 master-0 kubenswrapper[8731]: W1205 12:31:41.774170 8731 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 12:31:41.776854 master-0 kubenswrapper[8731]: W1205 12:31:41.774203 8731 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 12:31:41.776854 master-0 kubenswrapper[8731]: W1205 12:31:41.774211 8731 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 12:31:41.776854 master-0 kubenswrapper[8731]: W1205 12:31:41.774219 8731 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 12:31:41.776854 master-0 kubenswrapper[8731]: W1205 12:31:41.774227 8731 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 12:31:41.776854 master-0 kubenswrapper[8731]: W1205 12:31:41.774234 8731 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 12:31:41.776854 master-0 kubenswrapper[8731]: W1205 12:31:41.774243 8731 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 12:31:41.776854 master-0 kubenswrapper[8731]: W1205 12:31:41.774250 8731 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 12:31:41.776854 master-0 kubenswrapper[8731]: I1205 12:31:41.774264 8731 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 12:31:41.776854 master-0 kubenswrapper[8731]: I1205 12:31:41.774660 8731 server.go:940] "Client rotation is on, will bootstrap in background" Dec 05 12:31:41.778010 master-0 kubenswrapper[8731]: I1205 12:31:41.777972 8731 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 05 12:31:41.778187 master-0 kubenswrapper[8731]: I1205 12:31:41.778152 8731 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
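Note: the entries here show the kubelet loading its client cert/key pair from /var/lib/kubelet/pki/kubelet-client-current.pem, and the entries just below compute an expiration time and a rotation deadline for it. A minimal sketch that reads the same PEM and reports time to expiry; this is not the kubelet's certificate_manager, which additionally chooses a jittered rotation deadline ahead of NotAfter:

```go
// Sketch: inspect the client certificate the kubelet just loaded and report
// how long until it expires. The PEM file holds both the certificate and the
// private key, so non-certificate blocks are skipped.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	data, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
	if err != nil {
		log.Fatal(err)
	}
	for block, rest := pem.Decode(data); block != nil; block, rest = pem.Decode(rest) {
		if block.Type != "CERTIFICATE" {
			continue // skip the private-key block
		}
		cert, err := x509.ParseCertificate(block.Bytes)
		if err != nil {
			log.Fatal(err)
		}
		fmt.Printf("subject=%s notAfter=%s (%.1fh left)\n",
			cert.Subject.CommonName, cert.NotAfter.UTC(), time.Until(cert.NotAfter).Hours())
	}
}
```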
Dec 05 12:31:41.778723 master-0 kubenswrapper[8731]: I1205 12:31:41.778696 8731 server.go:997] "Starting client certificate rotation" Dec 05 12:31:41.778723 master-0 kubenswrapper[8731]: I1205 12:31:41.778720 8731 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 05 12:31:41.779020 master-0 kubenswrapper[8731]: I1205 12:31:41.778887 8731 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2025-12-06 12:20:41 +0000 UTC, rotation deadline is 2025-12-06 06:29:47.960475891 +0000 UTC Dec 05 12:31:41.779020 master-0 kubenswrapper[8731]: I1205 12:31:41.779013 8731 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 17h58m6.181467047s for next certificate rotation Dec 05 12:31:41.780259 master-0 kubenswrapper[8731]: I1205 12:31:41.780218 8731 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 12:31:41.782105 master-0 kubenswrapper[8731]: I1205 12:31:41.782063 8731 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 12:31:41.785191 master-0 kubenswrapper[8731]: I1205 12:31:41.785143 8731 log.go:25] "Validated CRI v1 runtime API" Dec 05 12:31:41.789429 master-0 kubenswrapper[8731]: I1205 12:31:41.789389 8731 log.go:25] "Validated CRI v1 image API" Dec 05 12:31:41.790869 master-0 kubenswrapper[8731]: I1205 12:31:41.790846 8731 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 05 12:31:41.794289 master-0 kubenswrapper[8731]: I1205 12:31:41.794256 8731 fs.go:135] Filesystem UUIDs: map[4623d87d-4611-48ee-a0ce-68b00f5d84bd:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Dec 05 12:31:41.794755 master-0 kubenswrapper[8731]: I1205 12:31:41.794285 8731 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1bd06826dd54922214ff0bdf4dd49e3e4fb5917fe2431fd30da1ce39eb71cae2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1bd06826dd54922214ff0bdf4dd49e3e4fb5917fe2431fd30da1ce39eb71cae2/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2a325da0f7b2c285fc4bf3a467e693950dfc8948d49a5740a004f6101e748cc4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2a325da0f7b2c285fc4bf3a467e693950dfc8948d49a5740a004f6101e748cc4/userdata/shm major:0 minor:328 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/34418050489e8b48781fa5128a0548228f5bdb58f7e6a5f88226bbd7dacf7bb5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/34418050489e8b48781fa5128a0548228f5bdb58f7e6a5f88226bbd7dacf7bb5/userdata/shm major:0 minor:46 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/38012800baf13255ee676c8bd3688f9cc8eb6dcd0e296ee14ea80782e75670a8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/38012800baf13255ee676c8bd3688f9cc8eb6dcd0e296ee14ea80782e75670a8/userdata/shm major:0 minor:122 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/3a9d8373a41ae93e2045d1c0300d43339b0c915de4cad9048741918269853b51/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3a9d8373a41ae93e2045d1c0300d43339b0c915de4cad9048741918269853b51/userdata/shm major:0 minor:344 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3d66257a9a5cc16c308a04623948fb3eceefd2f34694e08267e4f17ec43d3782/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3d66257a9a5cc16c308a04623948fb3eceefd2f34694e08267e4f17ec43d3782/userdata/shm major:0 minor:320 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/44e741be030df14b7e9e415d32f4095c562d693609b8dc4bd8ec51c21503bbca/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/44e741be030df14b7e9e415d32f4095c562d693609b8dc4bd8ec51c21503bbca/userdata/shm major:0 minor:334 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/47731386c0cb9aab3894731b6143775966f36286ae6b54927bb926129b389c33/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/47731386c0cb9aab3894731b6143775966f36286ae6b54927bb926129b389c33/userdata/shm major:0 minor:316 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/49f2f301b501743d7a4254bc3eeb040151fb199e2a4d9ec64ddce3a74ce66f5b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/49f2f301b501743d7a4254bc3eeb040151fb199e2a4d9ec64ddce3a74ce66f5b/userdata/shm major:0 minor:325 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5a89fdcb31a57b509eb73373840f305ff5d3039dc4adac822b9b40350179af76/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5a89fdcb31a57b509eb73373840f305ff5d3039dc4adac822b9b40350179af76/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6d7e84b5ce96cc743bb3392588c9efdf14f4afe467d9a7be36705ddbb090197e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6d7e84b5ce96cc743bb3392588c9efdf14f4afe467d9a7be36705ddbb090197e/userdata/shm major:0 minor:360 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7fe4976a702070d88ebc0b91a8c147521b2f0d81e1e2131e752211b96529d448/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7fe4976a702070d88ebc0b91a8c147521b2f0d81e1e2131e752211b96529d448/userdata/shm major:0 minor:142 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9fd6db41eb8dc90e6efffc25bb3c93739722e6824dad0dcb9a786720bc6514c4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9fd6db41eb8dc90e6efffc25bb3c93739722e6824dad0dcb9a786720bc6514c4/userdata/shm major:0 minor:337 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a34af96221abd2b9bf387305f2624222004ffa4b53496a2a4e5584e580bd9733/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a34af96221abd2b9bf387305f2624222004ffa4b53496a2a4e5584e580bd9733/userdata/shm major:0 minor:160 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b36190e4cf6d5a6244899784eca2665872c2f9d60ae3d454ea48fd9aa2aa3bab/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b36190e4cf6d5a6244899784eca2665872c2f9d60ae3d454ea48fd9aa2aa3bab/userdata/shm major:0 minor:327 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b852dfb0ed7374453aa61f11c0df40cc142ce70b6943ce06b264cc249753a13b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b852dfb0ed7374453aa61f11c0df40cc142ce70b6943ce06b264cc249753a13b/userdata/shm major:0 minor:149 
fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bba3aa271baddd92ed5881d6af79fb82b3a45fce07083a5cd051cbeeb1a01428/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bba3aa271baddd92ed5881d6af79fb82b3a45fce07083a5cd051cbeeb1a01428/userdata/shm major:0 minor:321 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c9238078b14a694c40b63db5c3f18b28faafcb8ecbd14ef862a7acac34f2ffa6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c9238078b14a694c40b63db5c3f18b28faafcb8ecbd14ef862a7acac34f2ffa6/userdata/shm major:0 minor:350 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/df3031001bb8ce6924d98db7ed12f84815ddd5de33ab7d2a19bcefd503d510dd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/df3031001bb8ce6924d98db7ed12f84815ddd5de33ab7d2a19bcefd503d510dd/userdata/shm major:0 minor:311 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e0816ccdefc3d19a555c704cf7914804a097b5a95e2655805ebd92880ab7a03f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e0816ccdefc3d19a555c704cf7914804a097b5a95e2655805ebd92880ab7a03f/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e5863ee594cde86aeedce8416be9b249f569b2f49267eb70875c7f8a2e451e4e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e5863ee594cde86aeedce8416be9b249f569b2f49267eb70875c7f8a2e451e4e/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ecdffd0c2fc8d747077d4ca5dcb541da82682f6d035455ac42566e8514bfadc3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ecdffd0c2fc8d747077d4ca5dcb541da82682f6d035455ac42566e8514bfadc3/userdata/shm major:0 minor:332 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ed1704f4a6522faa5c439c3ffd85686d7bb1d3595d2d60cd653dbed071367134/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ed1704f4a6522faa5c439c3ffd85686d7bb1d3595d2d60cd653dbed071367134/userdata/shm major:0 minor:164 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ee402b16b01951f980b833d7daf2d0304b91018363304b2cfe0e79874029cf9d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ee402b16b01951f980b833d7daf2d0304b91018363304b2cfe0e79874029cf9d/userdata/shm major:0 minor:187 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0dda6d9b-cb3a-413a-85af-ef08f15ea42e/volumes/kubernetes.io~projected/kube-api-access-62nqj:{mountpoint:/var/lib/kubelet/pods/0dda6d9b-cb3a-413a-85af-ef08f15ea42e/volumes/kubernetes.io~projected/kube-api-access-62nqj major:0 minor:300 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1871a9d6-6369-4d08-816f-9c6310b61ddf/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/1871a9d6-6369-4d08-816f-9c6310b61ddf/volumes/kubernetes.io~projected/kube-api-access major:0 minor:324 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1871a9d6-6369-4d08-816f-9c6310b61ddf/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/1871a9d6-6369-4d08-816f-9c6310b61ddf/volumes/kubernetes.io~secret/serving-cert major:0 minor:305 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1e6babfe-724a-4eab-bb3b-bc318bf57b70/volumes/kubernetes.io~projected/kube-api-access-c2gd8:{mountpoint:/var/lib/kubelet/pods/1e6babfe-724a-4eab-bb3b-bc318bf57b70/volumes/kubernetes.io~projected/kube-api-access-c2gd8 major:0 minor:318 fsType:tmpfs 
blockSize:0} /var/lib/kubelet/pods/29812c4b-48ac-488c-863c-1d52e39ea2ae/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/29812c4b-48ac-488c-863c-1d52e39ea2ae/volumes/kubernetes.io~projected/kube-api-access major:0 minor:77 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/38941513-e968-45f1-9cb2-b63d40338f36/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/38941513-e968-45f1-9cb2-b63d40338f36/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:341 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/38941513-e968-45f1-9cb2-b63d40338f36/volumes/kubernetes.io~projected/kube-api-access-t5hdg:{mountpoint:/var/lib/kubelet/pods/38941513-e968-45f1-9cb2-b63d40338f36/volumes/kubernetes.io~projected/kube-api-access-t5hdg major:0 minor:309 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volumes/kubernetes.io~projected/kube-api-access-dmq98:{mountpoint:/var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volumes/kubernetes.io~projected/kube-api-access-dmq98 major:0 minor:159 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:158 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/49760d62-02e5-4882-b47f-663102b04946/volumes/kubernetes.io~projected/kube-api-access-26x2z:{mountpoint:/var/lib/kubelet/pods/49760d62-02e5-4882-b47f-663102b04946/volumes/kubernetes.io~projected/kube-api-access-26x2z major:0 minor:335 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/volumes/kubernetes.io~projected/kube-api-access-xtjln:{mountpoint:/var/lib/kubelet/pods/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/volumes/kubernetes.io~projected/kube-api-access-xtjln major:0 minor:310 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/volumes/kubernetes.io~secret/serving-cert major:0 minor:308 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58187662-b502-4d90-95ce-2aa91a81d256/volumes/kubernetes.io~projected/kube-api-access-ps4ws:{mountpoint:/var/lib/kubelet/pods/58187662-b502-4d90-95ce-2aa91a81d256/volumes/kubernetes.io~projected/kube-api-access-ps4ws major:0 minor:312 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/594aaded-5615-4bed-87ee-6173059a73be/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/594aaded-5615-4bed-87ee-6173059a73be/volumes/kubernetes.io~projected/kube-api-access major:0 minor:315 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/594aaded-5615-4bed-87ee-6173059a73be/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/594aaded-5615-4bed-87ee-6173059a73be/volumes/kubernetes.io~secret/serving-cert major:0 minor:306 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5efad170-c154-42ec-a7c0-b36a98d2bfcc/volumes/kubernetes.io~projected/kube-api-access-996h9:{mountpoint:/var/lib/kubelet/pods/5efad170-c154-42ec-a7c0-b36a98d2bfcc/volumes/kubernetes.io~projected/kube-api-access-996h9 major:0 minor:76 
fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5efad170-c154-42ec-a7c0-b36a98d2bfcc/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/5efad170-c154-42ec-a7c0-b36a98d2bfcc/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f0c6889-0739-48a3-99cd-6db9d1f83242/volumes/kubernetes.io~projected/kube-api-access-p5p5d:{mountpoint:/var/lib/kubelet/pods/5f0c6889-0739-48a3-99cd-6db9d1f83242/volumes/kubernetes.io~projected/kube-api-access-p5p5d major:0 minor:329 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/708bf629-9949-4b79-a88a-c73ba033475b/volumes/kubernetes.io~projected/kube-api-access-6vx2z:{mountpoint:/var/lib/kubelet/pods/708bf629-9949-4b79-a88a-c73ba033475b/volumes/kubernetes.io~projected/kube-api-access-6vx2z major:0 minor:148 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/volumes/kubernetes.io~projected/kube-api-access-dtvzs:{mountpoint:/var/lib/kubelet/pods/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/volumes/kubernetes.io~projected/kube-api-access-dtvzs major:0 minor:295 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/volumes/kubernetes.io~secret/serving-cert major:0 minor:289 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/807d9093-aa67-4840-b5be-7f3abcc1beed/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/807d9093-aa67-4840-b5be-7f3abcc1beed/volumes/kubernetes.io~projected/kube-api-access major:0 minor:297 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/807d9093-aa67-4840-b5be-7f3abcc1beed/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/807d9093-aa67-4840-b5be-7f3abcc1beed/volumes/kubernetes.io~secret/serving-cert major:0 minor:288 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a2acba71-b9dc-4b85-be35-c995b8be2f19/volumes/kubernetes.io~projected/kube-api-access-nml2g:{mountpoint:/var/lib/kubelet/pods/a2acba71-b9dc-4b85-be35-c995b8be2f19/volumes/kubernetes.io~projected/kube-api-access-nml2g major:0 minor:299 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:296 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/volumes/kubernetes.io~projected/kube-api-access-fxxw7:{mountpoint:/var/lib/kubelet/pods/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/volumes/kubernetes.io~projected/kube-api-access-fxxw7 major:0 minor:294 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a757f807-e1bf-4f1e-9787-6b4acc8d09cf/volumes/kubernetes.io~projected/kube-api-access-9z8h9:{mountpoint:/var/lib/kubelet/pods/a757f807-e1bf-4f1e-9787-6b4acc8d09cf/volumes/kubernetes.io~projected/kube-api-access-9z8h9 major:0 minor:157 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a757f807-e1bf-4f1e-9787-6b4acc8d09cf/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/a757f807-e1bf-4f1e-9787-6b4acc8d09cf/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:156 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b8233dad-bd19-4842-a4d5-cfa84f1feb83/volumes/kubernetes.io~projected/kube-api-access-mvbfq:{mountpoint:/var/lib/kubelet/pods/b8233dad-bd19-4842-a4d5-cfa84f1feb83/volumes/kubernetes.io~projected/kube-api-access-mvbfq major:0 minor:185 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b8233dad-bd19-4842-a4d5-cfa84f1feb83/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/b8233dad-bd19-4842-a4d5-cfa84f1feb83/volumes/kubernetes.io~secret/webhook-cert major:0 minor:186 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ba095394-1873-4793-969d-3be979fa0771/volumes/kubernetes.io~projected/kube-api-access-55qpg:{mountpoint:/var/lib/kubelet/pods/ba095394-1873-4793-969d-3be979fa0771/volumes/kubernetes.io~projected/kube-api-access-55qpg major:0 minor:314 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ba095394-1873-4793-969d-3be979fa0771/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ba095394-1873-4793-969d-3be979fa0771/volumes/kubernetes.io~secret/serving-cert major:0 minor:303 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6/volumes/kubernetes.io~projected/kube-api-access-mlnqb:{mountpoint:/var/lib/kubelet/pods/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6/volumes/kubernetes.io~projected/kube-api-access-mlnqb major:0 minor:343 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ce3d73c1-f4bd-4c91-936a-086dfa5e3460/volumes/kubernetes.io~projected/kube-api-access-ph9w6:{mountpoint:/var/lib/kubelet/pods/ce3d73c1-f4bd-4c91-936a-086dfa5e3460/volumes/kubernetes.io~projected/kube-api-access-ph9w6 major:0 minor:302 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ce3d73c1-f4bd-4c91-936a-086dfa5e3460/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/ce3d73c1-f4bd-4c91-936a-086dfa5e3460/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:292 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb/volumes/kubernetes.io~projected/kube-api-access-z6mb6:{mountpoint:/var/lib/kubelet/pods/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb/volumes/kubernetes.io~projected/kube-api-access-z6mb6 major:0 minor:298 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d53a4886-db25-43a1-825a-66a9a9a58590/volumes/kubernetes.io~projected/kube-api-access-2tngh:{mountpoint:/var/lib/kubelet/pods/d53a4886-db25-43a1-825a-66a9a9a58590/volumes/kubernetes.io~projected/kube-api-access-2tngh major:0 minor:293 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d53a4886-db25-43a1-825a-66a9a9a58590/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/d53a4886-db25-43a1-825a-66a9a9a58590/volumes/kubernetes.io~secret/serving-cert major:0 minor:291 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~projected/kube-api-access-wwcr9:{mountpoint:/var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~projected/kube-api-access-wwcr9 major:0 minor:319 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~secret/etcd-client major:0 minor:307 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~secret/serving-cert major:0 minor:304 fsType:tmpfs 
blockSize:0} /var/lib/kubelet/pods/f3792522-fec6-4022-90ac-0b8467fcd625/volumes/kubernetes.io~projected/kube-api-access-flxbg:{mountpoint:/var/lib/kubelet/pods/f3792522-fec6-4022-90ac-0b8467fcd625/volumes/kubernetes.io~projected/kube-api-access-flxbg major:0 minor:301 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f3792522-fec6-4022-90ac-0b8467fcd625/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f3792522-fec6-4022-90ac-0b8467fcd625/volumes/kubernetes.io~secret/serving-cert major:0 minor:290 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f4a70855-80b5-4d6a-bed1-b42364940de0/volumes/kubernetes.io~projected/kube-api-access-69z2l:{mountpoint:/var/lib/kubelet/pods/f4a70855-80b5-4d6a-bed1-b42364940de0/volumes/kubernetes.io~projected/kube-api-access-69z2l major:0 minor:342 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1/volumes/kubernetes.io~projected/kube-api-access-x59kd:{mountpoint:/var/lib/kubelet/pods/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1/volumes/kubernetes.io~projected/kube-api-access-x59kd major:0 minor:141 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fb7003a6-4341-49eb-bec3-76ba8610fa12/volumes/kubernetes.io~projected/kube-api-access-69n5s:{mountpoint:/var/lib/kubelet/pods/fb7003a6-4341-49eb-bec3-76ba8610fa12/volumes/kubernetes.io~projected/kube-api-access-69n5s major:0 minor:153 fsType:tmpfs blockSize:0} overlay_0-104:{mountpoint:/var/lib/containers/storage/overlay/361654af092d60dbc2c37ad00d56d53f895715ea33aff3696ba19ff31c9a32cd/merged major:0 minor:104 fsType:overlay blockSize:0} overlay_0-112:{mountpoint:/var/lib/containers/storage/overlay/28d2aa47d3a6de9da5336c33e0eb8e6d279750fdb3baec0c6b351c1b33cf8a7a/merged major:0 minor:112 fsType:overlay blockSize:0} overlay_0-120:{mountpoint:/var/lib/containers/storage/overlay/78dd4e03cf7b36ee1c3c9c4f3ca9b9425e1567b6fee5c91e1890cff7148f32f4/merged major:0 minor:120 fsType:overlay blockSize:0} overlay_0-124:{mountpoint:/var/lib/containers/storage/overlay/41a63c485b6f01daf711198e34ae3bd8919d0c4b03a87f453aba290d7e667c4b/merged major:0 minor:124 fsType:overlay blockSize:0} overlay_0-126:{mountpoint:/var/lib/containers/storage/overlay/c3bb86d274f22c380bfdb563eb386b6ddec88c1c7c4c9e3efd95417f98a20314/merged major:0 minor:126 fsType:overlay blockSize:0} overlay_0-131:{mountpoint:/var/lib/containers/storage/overlay/db1eb56baf8c4609a9a00fa74b13bf2216d31f89bf2fd724863bf184115fac99/merged major:0 minor:131 fsType:overlay blockSize:0} overlay_0-139:{mountpoint:/var/lib/containers/storage/overlay/7c9eed8dbadc7d21bfe259abb0e0b78eb49b24a4ef20fa11f8b16f9f112edcf0/merged major:0 minor:139 fsType:overlay blockSize:0} overlay_0-144:{mountpoint:/var/lib/containers/storage/overlay/cb9b117a6e3f0d02be5147ac9286575ef896871d113013e8381ee3ad2b0b427a/merged major:0 minor:144 fsType:overlay blockSize:0} overlay_0-146:{mountpoint:/var/lib/containers/storage/overlay/5e603ed284b633834925b11b73da59dcfd563d9ccef07b75d7211a6c0e836831/merged major:0 minor:146 fsType:overlay blockSize:0} overlay_0-151:{mountpoint:/var/lib/containers/storage/overlay/caca56b5d27ec29990282d0668116750acbba2d7e3d9779738bc16e5c448ab9a/merged major:0 minor:151 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/9fd902f996c13fe0242574cdafad3b2c5ff3681aa7b0fc7635f8f860481968c3/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-162:{mountpoint:/var/lib/containers/storage/overlay/5cd3bf058edf49ff3ffbbd533611e3e6f91d8142b72269ad7b85352f5944ef79/merged major:0 minor:162 fsType:overlay 
blockSize:0} overlay_0-166:{mountpoint:/var/lib/containers/storage/overlay/bca31ded97068415413387593851c0977265e7e7169339d6e791b5ff6aa32b7f/merged major:0 minor:166 fsType:overlay blockSize:0} overlay_0-168:{mountpoint:/var/lib/containers/storage/overlay/27fa107044f8e2b4bf52ef0af9b888461d16231f2467d98248fd7170e24645fc/merged major:0 minor:168 fsType:overlay blockSize:0} overlay_0-170:{mountpoint:/var/lib/containers/storage/overlay/f396a5163e890309cd50cbe125aa426b841dfdb0dcc6080f06f52b23f5be551b/merged major:0 minor:170 fsType:overlay blockSize:0} overlay_0-172:{mountpoint:/var/lib/containers/storage/overlay/5c65662a322b4ae50a6c0cdb0aaf7519ed441eab69704037b551581159fd5f64/merged major:0 minor:172 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/ac62cd12baf8c7a5d2ed3c5aa1f2735c50dcc3076bf74dcfdfbad53a68ac998e/merged major:0 minor:174 fsType:overlay blockSize:0} overlay_0-183:{mountpoint:/var/lib/containers/storage/overlay/939c3753384f1514c3b853d68e4b24a4a2a080bb9f674fdc6e890c86e8c37f2b/merged major:0 minor:183 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/269c002f4c930e7db768dc98adf0f0a333ee354a44fa1614a670ce6df48d0ad0/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-191:{mountpoint:/var/lib/containers/storage/overlay/409b9a9c36edf4083b8c0c2dc75f654720588680463712b46fcec148cb1a912d/merged major:0 minor:191 fsType:overlay blockSize:0} overlay_0-193:{mountpoint:/var/lib/containers/storage/overlay/f712703810076067635ff08101d4fb0010dd2cebceeb9318a17e00faee44a766/merged major:0 minor:193 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/d4f62d75aa34a729b0c5990c9ee6f5fb30bb2b7bb8acb6b598d048c68ee66b5a/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-197:{mountpoint:/var/lib/containers/storage/overlay/f65afd34ddd49773a2c4ff9b7affb041aff8c6067349ccac6ff871b1b710454d/merged major:0 minor:197 fsType:overlay blockSize:0} overlay_0-199:{mountpoint:/var/lib/containers/storage/overlay/9e3cadddfe6201f0c6f29135f52d4a2ca63642665652cd9a587cf54141709b6f/merged major:0 minor:199 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/182ac96263feee5bca0f0e2143f00e777790c4e5735ff3ba8ddbe112d125ddff/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-206:{mountpoint:/var/lib/containers/storage/overlay/8659c156470e53fe5c973827ee699001769b2242553bea709e756b4e67b3d9d5/merged major:0 minor:206 fsType:overlay blockSize:0} overlay_0-219:{mountpoint:/var/lib/containers/storage/overlay/9f0d41721ea4552b6a12a46145ecad8cac4f76cf25d816924b26e624cbd7fe1d/merged major:0 minor:219 fsType:overlay blockSize:0} overlay_0-228:{mountpoint:/var/lib/containers/storage/overlay/297ace21ddd432d81d7b585c1f45b6cfd386564184684f1e7ee048b441e78a40/merged major:0 minor:228 fsType:overlay blockSize:0} overlay_0-236:{mountpoint:/var/lib/containers/storage/overlay/76738f80f3bd0288fddd5a2c1ddfac2cb8aa863f26e8cff9a0e8faca265722ab/merged major:0 minor:236 fsType:overlay blockSize:0} overlay_0-244:{mountpoint:/var/lib/containers/storage/overlay/6deec249017972248057f2fad13f5c9fc72fb5ba9980da2625467e86de5b8aa0/merged major:0 minor:244 fsType:overlay blockSize:0} overlay_0-252:{mountpoint:/var/lib/containers/storage/overlay/e3b3474e3c26528abebded8429a3051db3fe95979c71ae59152deb3ae0668d40/merged major:0 minor:252 fsType:overlay blockSize:0} 
overlay_0-260:{mountpoint:/var/lib/containers/storage/overlay/4ac9ba4f6247f04e187d7fe8a1d8377f91bf4b8c55f8b24bd6ee401f089bdb01/merged major:0 minor:260 fsType:overlay blockSize:0} overlay_0-268:{mountpoint:/var/lib/containers/storage/overlay/3e822920d6ed6f2b76af28ff2be7525444850654efc38e57b420fbca87d9864f/merged major:0 minor:268 fsType:overlay blockSize:0} overlay_0-273:{mountpoint:/var/lib/containers/storage/overlay/3224cce41c1fa6ac4bc26df67118122dc4d84a179332d3e14c569bc9639f3262/merged major:0 minor:273 fsType:overlay blockSize:0} overlay_0-283:{mountpoint:/var/lib/containers/storage/overlay/2ec937be02c006e8ea30f0c26d738878ef91fcb8b10f853675598f257c084c85/merged major:0 minor:283 fsType:overlay blockSize:0} overlay_0-338:{mountpoint:/var/lib/containers/storage/overlay/4a664c51113d273b7d013c6c40831a5c27a717b27b9e9258f466a9e8218e1f42/merged major:0 minor:338 fsType:overlay blockSize:0} overlay_0-346:{mountpoint:/var/lib/containers/storage/overlay/95f70ee53f3561b38690bac06c2e6778e48d100ce469a85b8db0f5aa99e21b1d/merged major:0 minor:346 fsType:overlay blockSize:0} overlay_0-348:{mountpoint:/var/lib/containers/storage/overlay/6d67683af6fd1ed1094663b55adecef20a4d8f64a540ba84705b4b98152e82f1/merged major:0 minor:348 fsType:overlay blockSize:0} overlay_0-351:{mountpoint:/var/lib/containers/storage/overlay/8749c62c3a7fb854198a5e327d0b7160f22eeadd093d63723505af98920f563e/merged major:0 minor:351 fsType:overlay blockSize:0} overlay_0-354:{mountpoint:/var/lib/containers/storage/overlay/7d48f4a394638040cf7c0e538fa3e6b388b18cdc16a8d2fe2922d33fe20a7ed4/merged major:0 minor:354 fsType:overlay blockSize:0} overlay_0-356:{mountpoint:/var/lib/containers/storage/overlay/180f64707d36d3a3a449455105de1db58d613f89875b7dd3055e3eaa2e4cd837/merged major:0 minor:356 fsType:overlay blockSize:0} overlay_0-358:{mountpoint:/var/lib/containers/storage/overlay/8fd0db57a1b6addcec7e68a35f0aa1c280da87f0c23e452949f20eddb658417c/merged major:0 minor:358 fsType:overlay blockSize:0} overlay_0-362:{mountpoint:/var/lib/containers/storage/overlay/be87fd787b02484e30a261a48a388c79c4042d4175481763c6feb4044fd6db8b/merged major:0 minor:362 fsType:overlay blockSize:0} overlay_0-364:{mountpoint:/var/lib/containers/storage/overlay/9546388ec8993231c68c32ca39851d2ed8dcc814833d9fa999526d11ff3a9cde/merged major:0 minor:364 fsType:overlay blockSize:0} overlay_0-366:{mountpoint:/var/lib/containers/storage/overlay/4e55472085cd5241a0d7ac0f8b0557312fd2ed995a41d253dacacd625f35afe4/merged major:0 minor:366 fsType:overlay blockSize:0} overlay_0-368:{mountpoint:/var/lib/containers/storage/overlay/fae14a26a50255f69a0e16428a56449ac4183e03e41b9a663b5409bb10e95950/merged major:0 minor:368 fsType:overlay blockSize:0} overlay_0-370:{mountpoint:/var/lib/containers/storage/overlay/a3bfc79070e9d4b9ec3df7498044adc650bb6467643cf465e47146211d906477/merged major:0 minor:370 fsType:overlay blockSize:0} overlay_0-372:{mountpoint:/var/lib/containers/storage/overlay/68a7f71e3b40e9782a03553f85ef7bdc14c259064c6296ddd229c55a2612e3ca/merged major:0 minor:372 fsType:overlay blockSize:0} overlay_0-374:{mountpoint:/var/lib/containers/storage/overlay/2a0150cc2c7e3c6d8d319702844184974ac344e27d05464cdbee4621a5f950dc/merged major:0 minor:374 fsType:overlay blockSize:0} overlay_0-390:{mountpoint:/var/lib/containers/storage/overlay/db1d3852c9bb1aa6e01fcbf5f1bd27200b172b147e54dc0ecf076c4a9440dc0a/merged major:0 minor:390 fsType:overlay blockSize:0} overlay_0-44:{mountpoint:/var/lib/containers/storage/overlay/5be720ed0f07c6e4ec1a71bfefb06cfa7d394c5774fad1940d52b4c27d2fb47a/merged 
major:0 minor:44 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/8216183a9a6560247b8a9b605cb6b0afb9384e9a33d361b1b0bb474a3d9953c5/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/be06c02eedfdab80d2f321980daffb39ce8a9aa0b2d7ac55f42d39601a30e3f3/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/0f208c6e81feb4f53230a27a0d62529133cdcbd6c08905b510683d1d5507146f/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/636aebaf211fc059b0d012f129027563cd2c8906d50ca9688ca02507dafa327a/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/e3923e8cce784e95881e391276cd0c1646d15a40caeabff733b9e204e0c03e21/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/be83a66558b076e29911f2cf7fb0a1a795c03cdbf204ab0e3675cc6199d7ec48/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/4534b2d32eaa35406dbc51ec4b2997821f02f056ae94638decc852cb1b60b5fb/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/8848b32588f76fbf279754edd9aeb50ca4201d5b3e405a230cb1f42415b26b8b/merged major:0 minor:74 fsType:overlay blockSize:0} overlay_0-78:{mountpoint:/var/lib/containers/storage/overlay/d6fc155bdb658e853e615788af0fea6d39109152a16c2f818e1f70b5820749fa/merged major:0 minor:78 fsType:overlay blockSize:0} overlay_0-80:{mountpoint:/var/lib/containers/storage/overlay/ba4e9bc11020c8e87ebbca254973a95f1b3d13be91906624dda5716dbf08971e/merged major:0 minor:80 fsType:overlay blockSize:0} overlay_0-88:{mountpoint:/var/lib/containers/storage/overlay/9c69591bcfb954d42008989af47e724c67ef4a131032af9e3c819cd261d9569f/merged major:0 minor:88 fsType:overlay blockSize:0} overlay_0-96:{mountpoint:/var/lib/containers/storage/overlay/92bdfb2e4303f64303686cbed93e23c5f9449f0c8850534171784b36e5083546/merged major:0 minor:96 fsType:overlay blockSize:0}] Dec 05 12:31:41.850954 master-0 kubenswrapper[8731]: I1205 12:31:41.849296 8731 manager.go:217] Machine: {Timestamp:2025-12-05 12:31:41.84585524 +0000 UTC m=+0.149839437 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2799998 MemoryCapacity:50514145280 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:7ed1cb80ed224980aa762c96e2471f55 SystemUUID:7ed1cb80-ed22-4980-aa76-2c96e2471f55 BootID:195a1d65-51c2-44ad-9194-26630da59f9f Filesystems:[{Device:/run/containers/storage/overlay-containers/b852dfb0ed7374453aa61f11c0df40cc142ce70b6943ce06b264cc249753a13b/userdata/shm DeviceMajor:0 DeviceMinor:149 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c9238078b14a694c40b63db5c3f18b28faafcb8ecbd14ef862a7acac34f2ffa6/userdata/shm DeviceMajor:0 DeviceMinor:350 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-139 DeviceMajor:0 DeviceMinor:139 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a757f807-e1bf-4f1e-9787-6b4acc8d09cf/volumes/kubernetes.io~projected/kube-api-access-9z8h9 DeviceMajor:0 DeviceMinor:157 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} 
{Device:overlay_0-219 DeviceMajor:0 DeviceMinor:219 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-252 DeviceMajor:0 DeviceMinor:252 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:158 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ed1704f4a6522faa5c439c3ffd85686d7bb1d3595d2d60cd653dbed071367134/userdata/shm DeviceMajor:0 DeviceMinor:164 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-354 DeviceMajor:0 DeviceMinor:354 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-80 DeviceMajor:0 DeviceMinor:80 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/volumes/kubernetes.io~projected/kube-api-access-fxxw7 DeviceMajor:0 DeviceMinor:294 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-112 DeviceMajor:0 DeviceMinor:112 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1/volumes/kubernetes.io~projected/kube-api-access-x59kd DeviceMajor:0 DeviceMinor:141 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d53a4886-db25-43a1-825a-66a9a9a58590/volumes/kubernetes.io~projected/kube-api-access-2tngh DeviceMajor:0 DeviceMinor:293 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-370 DeviceMajor:0 DeviceMinor:370 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/47731386c0cb9aab3894731b6143775966f36286ae6b54927bb926129b389c33/userdata/shm DeviceMajor:0 DeviceMinor:316 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b36190e4cf6d5a6244899784eca2665872c2f9d60ae3d454ea48fd9aa2aa3bab/userdata/shm DeviceMajor:0 DeviceMinor:327 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102829056 Type:vfs Inodes:819200 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5a89fdcb31a57b509eb73373840f305ff5d3039dc4adac822b9b40350179af76/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-268 DeviceMajor:0 DeviceMinor:268 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-197 DeviceMajor:0 DeviceMinor:197 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a2acba71-b9dc-4b85-be35-c995b8be2f19/volumes/kubernetes.io~projected/kube-api-access-nml2g DeviceMajor:0 DeviceMinor:299 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/38941513-e968-45f1-9cb2-b63d40338f36/volumes/kubernetes.io~projected/kube-api-access-t5hdg DeviceMajor:0 DeviceMinor:309 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3a9d8373a41ae93e2045d1c0300d43339b0c915de4cad9048741918269853b51/userdata/shm DeviceMajor:0 DeviceMinor:344 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-78 DeviceMajor:0 DeviceMinor:78 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-88 DeviceMajor:0 DeviceMinor:88 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a34af96221abd2b9bf387305f2624222004ffa4b53496a2a4e5584e580bd9733/userdata/shm DeviceMajor:0 DeviceMinor:160 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/58187662-b502-4d90-95ce-2aa91a81d256/volumes/kubernetes.io~projected/kube-api-access-ps4ws DeviceMajor:0 DeviceMinor:312 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/kubelet/pods/5efad170-c154-42ec-a7c0-b36a98d2bfcc/volumes/kubernetes.io~projected/kube-api-access-996h9 DeviceMajor:0 DeviceMinor:76 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-131 DeviceMajor:0 DeviceMinor:131 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-151 DeviceMajor:0 DeviceMinor:151 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-368 DeviceMajor:0 DeviceMinor:368 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-374 DeviceMajor:0 DeviceMinor:374 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9fd6db41eb8dc90e6efffc25bb3c93739722e6824dad0dcb9a786720bc6514c4/userdata/shm DeviceMajor:0 DeviceMinor:337 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/5efad170-c154-42ec-a7c0-b36a98d2bfcc/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-260 DeviceMajor:0 DeviceMinor:260 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:289 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:296 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/594aaded-5615-4bed-87ee-6173059a73be/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:306 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bba3aa271baddd92ed5881d6af79fb82b3a45fce07083a5cd051cbeeb1a01428/userdata/shm DeviceMajor:0 DeviceMinor:321 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257070592 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-244 DeviceMajor:0 DeviceMinor:244 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:308 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-366 DeviceMajor:0 DeviceMinor:366 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-191 DeviceMajor:0 DeviceMinor:191 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6/volumes/kubernetes.io~projected/kube-api-access-mlnqb DeviceMajor:0 DeviceMinor:343 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-228 DeviceMajor:0 DeviceMinor:228 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f3792522-fec6-4022-90ac-0b8467fcd625/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:290 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-162 DeviceMajor:0 DeviceMinor:162 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-168 DeviceMajor:0 DeviceMinor:168 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ecdffd0c2fc8d747077d4ca5dcb541da82682f6d035455ac42566e8514bfadc3/userdata/shm DeviceMajor:0 DeviceMinor:332 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-348 DeviceMajor:0 DeviceMinor:348 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-351 DeviceMajor:0 DeviceMinor:351 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6d7e84b5ce96cc743bb3392588c9efdf14f4afe467d9a7be36705ddbb090197e/userdata/shm DeviceMajor:0 DeviceMinor:360 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volumes/kubernetes.io~projected/kube-api-access-dmq98 DeviceMajor:0 DeviceMinor:159 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:307 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-372 DeviceMajor:0 DeviceMinor:372 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-166 DeviceMajor:0 DeviceMinor:166 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/volumes/kubernetes.io~projected/kube-api-access-dtvzs DeviceMajor:0 DeviceMinor:295 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/49760d62-02e5-4882-b47f-663102b04946/volumes/kubernetes.io~projected/kube-api-access-26x2z DeviceMajor:0 DeviceMinor:335 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-96 DeviceMajor:0 DeviceMinor:96 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-104 DeviceMajor:0 DeviceMinor:104 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ee402b16b01951f980b833d7daf2d0304b91018363304b2cfe0e79874029cf9d/userdata/shm DeviceMajor:0 DeviceMinor:187 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/807d9093-aa67-4840-b5be-7f3abcc1beed/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:288 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} 
{Device:/var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:304 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-44 DeviceMajor:0 DeviceMinor:44 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ce3d73c1-f4bd-4c91-936a-086dfa5e3460/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:292 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7fe4976a702070d88ebc0b91a8c147521b2f0d81e1e2131e752211b96529d448/userdata/shm DeviceMajor:0 DeviceMinor:142 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-199 DeviceMajor:0 DeviceMinor:199 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/44e741be030df14b7e9e415d32f4095c562d693609b8dc4bd8ec51c21503bbca/userdata/shm DeviceMajor:0 DeviceMinor:334 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-358 DeviceMajor:0 DeviceMinor:358 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ce3d73c1-f4bd-4c91-936a-086dfa5e3460/volumes/kubernetes.io~projected/kube-api-access-ph9w6 DeviceMajor:0 DeviceMinor:302 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-120 DeviceMajor:0 DeviceMinor:120 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a757f807-e1bf-4f1e-9787-6b4acc8d09cf/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:156 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-183 DeviceMajor:0 DeviceMinor:183 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fb7003a6-4341-49eb-bec3-76ba8610fa12/volumes/kubernetes.io~projected/kube-api-access-69n5s DeviceMajor:0 DeviceMinor:153 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-236 DeviceMajor:0 DeviceMinor:236 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ba095394-1873-4793-969d-3be979fa0771/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:303 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~projected/kube-api-access-wwcr9 DeviceMajor:0 DeviceMinor:319 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/38941513-e968-45f1-9cb2-b63d40338f36/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:341 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-346 DeviceMajor:0 DeviceMinor:346 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/1bd06826dd54922214ff0bdf4dd49e3e4fb5917fe2431fd30da1ce39eb71cae2/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/29812c4b-48ac-488c-863c-1d52e39ea2ae/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:77 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/38012800baf13255ee676c8bd3688f9cc8eb6dcd0e296ee14ea80782e75670a8/userdata/shm DeviceMajor:0 DeviceMinor:122 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-283 DeviceMajor:0 DeviceMinor:283 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1871a9d6-6369-4d08-816f-9c6310b61ddf/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:305 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2a325da0f7b2c285fc4bf3a467e693950dfc8948d49a5740a004f6101e748cc4/userdata/shm DeviceMajor:0 DeviceMinor:328 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-146 DeviceMajor:0 DeviceMinor:146 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d53a4886-db25-43a1-825a-66a9a9a58590/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:291 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/594aaded-5615-4bed-87ee-6173059a73be/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:315 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3d66257a9a5cc16c308a04623948fb3eceefd2f34694e08267e4f17ec43d3782/userdata/shm DeviceMajor:0 DeviceMinor:320 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-390 DeviceMajor:0 DeviceMinor:390 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0dda6d9b-cb3a-413a-85af-ef08f15ea42e/volumes/kubernetes.io~projected/kube-api-access-62nqj DeviceMajor:0 DeviceMinor:300 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/ba095394-1873-4793-969d-3be979fa0771/volumes/kubernetes.io~projected/kube-api-access-55qpg DeviceMajor:0 DeviceMinor:314 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e0816ccdefc3d19a555c704cf7914804a097b5a95e2655805ebd92880ab7a03f/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/b8233dad-bd19-4842-a4d5-cfa84f1feb83/volumes/kubernetes.io~projected/kube-api-access-mvbfq DeviceMajor:0 DeviceMinor:185 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/b8233dad-bd19-4842-a4d5-cfa84f1feb83/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:186 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-193 DeviceMajor:0 DeviceMinor:193 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/807d9093-aa67-4840-b5be-7f3abcc1beed/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 
DeviceMinor:297 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/5f0c6889-0739-48a3-99cd-6db9d1f83242/volumes/kubernetes.io~projected/kube-api-access-p5p5d DeviceMajor:0 DeviceMinor:329 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-124 DeviceMajor:0 DeviceMinor:124 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-126 DeviceMajor:0 DeviceMinor:126 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-144 DeviceMajor:0 DeviceMinor:144 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/708bf629-9949-4b79-a88a-c73ba033475b/volumes/kubernetes.io~projected/kube-api-access-6vx2z DeviceMajor:0 DeviceMinor:148 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb/volumes/kubernetes.io~projected/kube-api-access-z6mb6 DeviceMajor:0 DeviceMinor:298 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/df3031001bb8ce6924d98db7ed12f84815ddd5de33ab7d2a19bcefd503d510dd/userdata/shm DeviceMajor:0 DeviceMinor:311 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257074688 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-273 DeviceMajor:0 DeviceMinor:273 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-172 DeviceMajor:0 DeviceMinor:172 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/volumes/kubernetes.io~projected/kube-api-access-xtjln DeviceMajor:0 DeviceMinor:310 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/49f2f301b501743d7a4254bc3eeb040151fb199e2a4d9ec64ddce3a74ce66f5b/userdata/shm DeviceMajor:0 DeviceMinor:325 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-338 DeviceMajor:0 DeviceMinor:338 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-362 DeviceMajor:0 DeviceMinor:362 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-170 DeviceMajor:0 DeviceMinor:170 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f3792522-fec6-4022-90ac-0b8467fcd625/volumes/kubernetes.io~projected/kube-api-access-flxbg DeviceMajor:0 DeviceMinor:301 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/1871a9d6-6369-4d08-816f-9c6310b61ddf/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:324 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/f4a70855-80b5-4d6a-bed1-b42364940de0/volumes/kubernetes.io~projected/kube-api-access-69z2l DeviceMajor:0 DeviceMinor:342 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-364 DeviceMajor:0 DeviceMinor:364 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e5863ee594cde86aeedce8416be9b249f569b2f49267eb70875c7f8a2e451e4e/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/34418050489e8b48781fa5128a0548228f5bdb58f7e6a5f88226bbd7dacf7bb5/userdata/shm DeviceMajor:0 DeviceMinor:46 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-206 DeviceMajor:0 DeviceMinor:206 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1e6babfe-724a-4eab-bb3b-bc318bf57b70/volumes/kubernetes.io~projected/kube-api-access-c2gd8 DeviceMajor:0 DeviceMinor:318 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-356 DeviceMajor:0 DeviceMinor:356 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:10102829056 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:2a325da0f7b2c28 MacAddress:2e:8f:82:c2:9f:83 Speed:10000 Mtu:8900} {Name:3d66257a9a5cc16 MacAddress:ae:64:c3:cd:c0:34 Speed:10000 Mtu:8900} {Name:44e741be030df14 MacAddress:72:9d:94:0f:f7:9d Speed:10000 Mtu:8900} {Name:47731386c0cb9aa MacAddress:16:17:6f:15:8e:f5 Speed:10000 Mtu:8900} {Name:49f2f301b501743 MacAddress:b2:55:6c:0e:58:85 Speed:10000 Mtu:8900} {Name:6d7e84b5ce96cc7 MacAddress:76:ac:27:f2:08:72 Speed:10000 Mtu:8900} {Name:9fd6db41eb8dc90 MacAddress:92:e8:6e:43:d8:9f Speed:10000 Mtu:8900} {Name:b36190e4cf6d5a6 MacAddress:ce:9c:3a:9d:42:61 Speed:10000 Mtu:8900} {Name:bba3aa271baddd9 MacAddress:3a:66:f2:3c:04:f8 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:b2:49:0d:80:cf:b8 Speed:0 Mtu:8900} {Name:c9238078b14a694 MacAddress:7e:5d:7f:f4:3d:54 Speed:10000 Mtu:8900} {Name:df3031001bb8ce6 MacAddress:9e:ed:54:fe:dd:08 Speed:10000 Mtu:8900} {Name:ecdffd0c2fc8d74 MacAddress:22:c6:19:c7:5f:68 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:27:b3:a6 Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:d3:8e:e6 Speed:-1 Mtu:9000} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:8a:28:cb:43:ed:9b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514145280 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 
Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 05 12:31:41.851910 master-0 kubenswrapper[8731]: I1205 12:31:41.851689 8731 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
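Every capacity in the cAdvisor Machine record above is a raw byte count. A throwaway conversion helper, illustrative only and not part of any tooling referenced in this log, makes the figures easier to read:

```python
# Illustrative helper only: convert the raw byte counts reported in the Machine
# record (MemoryCapacity, filesystem Capacity, disk Size) to binary units.
def human_bytes(n: float) -> str:
    for unit in ("B", "KiB", "MiB", "GiB", "TiB"):
        if abs(n) < 1024 or unit == "TiB":
            return f"{n:.1f} {unit}"
        n /= 1024
    return f"{n:.1f} TiB"  # unreachable; keeps the function total


print(human_bytes(50514145280))   # MemoryCapacity   -> 47.0 GiB
print(human_bytes(214143315968))  # /dev/vda4 (/var) -> 199.4 GiB
print(human_bytes(214748364800))  # disk vda Size    -> 200.0 GiB
```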
Dec 05 12:31:41.852284 master-0 kubenswrapper[8731]: I1205 12:31:41.852263 8731 manager.go:233] Version: {KernelVersion:5.14.0-427.100.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202511170715-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 05 12:31:41.852951 master-0 kubenswrapper[8731]: I1205 12:31:41.852934 8731 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 05 12:31:41.853468 master-0 kubenswrapper[8731]: I1205 12:31:41.853426 8731 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 05 12:31:41.854058 master-0 kubenswrapper[8731]: I1205 12:31:41.853597 8731 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 05 12:31:41.854385 master-0 kubenswrapper[8731]: I1205 12:31:41.854367 8731 topology_manager.go:138] "Creating topology manager with none policy" Dec 05 12:31:41.854646 master-0 kubenswrapper[8731]: I1205 12:31:41.854592 8731 container_manager_linux.go:303] "Creating device plugin manager" Dec 05 12:31:41.854912 master-0 kubenswrapper[8731]: I1205 12:31:41.854852 8731 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 05 12:31:41.855199 master-0 kubenswrapper[8731]: I1205 12:31:41.855162 8731 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 05 12:31:41.855761 master-0 kubenswrapper[8731]: I1205 12:31:41.855704 8731 state_mem.go:36] "Initialized new in-memory state store" Dec 05 12:31:41.856260 master-0 kubenswrapper[8731]: I1205 12:31:41.856201 8731 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 05 12:31:41.856647 master-0 kubenswrapper[8731]: I1205 12:31:41.856634 8731 kubelet.go:418] "Attempting to sync node with API server" Dec 05 
12:31:41.856920 master-0 kubenswrapper[8731]: I1205 12:31:41.856865 8731 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 05 12:31:41.857322 master-0 kubenswrapper[8731]: I1205 12:31:41.857263 8731 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 05 12:31:41.857506 master-0 kubenswrapper[8731]: I1205 12:31:41.857494 8731 kubelet.go:324] "Adding apiserver pod source" Dec 05 12:31:41.857582 master-0 kubenswrapper[8731]: I1205 12:31:41.857571 8731 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 05 12:31:41.862344 master-0 kubenswrapper[8731]: I1205 12:31:41.862126 8731 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-2.rhaos4.18.git15789b8.el9" apiVersion="v1" Dec 05 12:31:41.866240 master-0 kubenswrapper[8731]: I1205 12:31:41.865974 8731 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 05 12:31:41.869260 master-0 kubenswrapper[8731]: I1205 12:31:41.869213 8731 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 05 12:31:41.869690 master-0 kubenswrapper[8731]: I1205 12:31:41.869620 8731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 05 12:31:41.869690 master-0 kubenswrapper[8731]: I1205 12:31:41.869652 8731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 05 12:31:41.869690 master-0 kubenswrapper[8731]: I1205 12:31:41.869663 8731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 05 12:31:41.869690 master-0 kubenswrapper[8731]: I1205 12:31:41.869679 8731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 05 12:31:41.869690 master-0 kubenswrapper[8731]: I1205 12:31:41.869687 8731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 05 12:31:41.869690 master-0 kubenswrapper[8731]: I1205 12:31:41.869698 8731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 05 12:31:41.869880 master-0 kubenswrapper[8731]: I1205 12:31:41.869710 8731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 05 12:31:41.869880 master-0 kubenswrapper[8731]: I1205 12:31:41.869720 8731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 05 12:31:41.869880 master-0 kubenswrapper[8731]: I1205 12:31:41.869741 8731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 05 12:31:41.869880 master-0 kubenswrapper[8731]: I1205 12:31:41.869751 8731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 05 12:31:41.869880 master-0 kubenswrapper[8731]: I1205 12:31:41.869767 8731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 05 12:31:41.869880 master-0 kubenswrapper[8731]: I1205 12:31:41.869785 8731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 05 12:31:41.869880 master-0 kubenswrapper[8731]: I1205 12:31:41.869825 8731 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 05 12:31:41.871574 master-0 kubenswrapper[8731]: I1205 12:31:41.871527 8731 server.go:1280] "Started kubelet" Dec 05 12:31:41.873106 master-0 kubenswrapper[8731]: I1205 12:31:41.871863 8731 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 05 12:31:41.873106 master-0 kubenswrapper[8731]: I1205 12:31:41.872046 8731 
server_v1.go:47] "podresources" method="list" useActivePods=true Dec 05 12:31:41.873106 master-0 kubenswrapper[8731]: I1205 12:31:41.872124 8731 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 05 12:31:41.873106 master-0 kubenswrapper[8731]: I1205 12:31:41.872944 8731 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 05 12:31:41.874530 master-0 kubenswrapper[8731]: I1205 12:31:41.874231 8731 server.go:449] "Adding debug handlers to kubelet server" Dec 05 12:31:41.875643 master-0 kubenswrapper[8731]: I1205 12:31:41.875580 8731 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 12:31:41.876207 master-0 kubenswrapper[8731]: I1205 12:31:41.876148 8731 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 12:31:41.876558 master-0 systemd[1]: Started Kubernetes Kubelet. Dec 05 12:31:41.880040 master-0 kubenswrapper[8731]: I1205 12:31:41.879605 8731 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 05 12:31:41.880040 master-0 kubenswrapper[8731]: I1205 12:31:41.880039 8731 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 05 12:31:41.881366 master-0 kubenswrapper[8731]: I1205 12:31:41.881143 8731 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 05 12:31:41.881781 master-0 kubenswrapper[8731]: I1205 12:31:41.881733 8731 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 05 12:31:41.881979 master-0 kubenswrapper[8731]: I1205 12:31:41.881489 8731 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Dec 05 12:31:41.882025 master-0 kubenswrapper[8731]: I1205 12:31:41.881630 8731 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-12-06 12:20:41 +0000 UTC, rotation deadline is 2025-12-06 09:15:07.677042077 +0000 UTC Dec 05 12:31:41.882705 master-0 kubenswrapper[8731]: I1205 12:31:41.882026 8731 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 20h43m25.795032187s for next certificate rotation Dec 05 12:31:41.883533 master-0 kubenswrapper[8731]: I1205 12:31:41.883484 8731 factory.go:55] Registering systemd factory Dec 05 12:31:41.883533 master-0 kubenswrapper[8731]: I1205 12:31:41.883508 8731 factory.go:221] Registration of the systemd container factory successfully Dec 05 12:31:41.883944 master-0 kubenswrapper[8731]: I1205 12:31:41.883893 8731 factory.go:153] Registering CRI-O factory Dec 05 12:31:41.883944 master-0 kubenswrapper[8731]: I1205 12:31:41.883914 8731 factory.go:221] Registration of the crio container factory successfully Dec 05 12:31:41.884071 master-0 kubenswrapper[8731]: I1205 12:31:41.884035 8731 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 05 12:31:41.884117 master-0 kubenswrapper[8731]: I1205 12:31:41.884073 8731 factory.go:103] Registering Raw factory Dec 05 12:31:41.884117 master-0 kubenswrapper[8731]: I1205 12:31:41.884101 8731 manager.go:1196] Started watching for new ooms in manager Dec 05 12:31:41.884850 master-0 kubenswrapper[8731]: I1205 12:31:41.884812 8731 manager.go:319] Starting recovery of all containers Dec 05 12:31:41.889093 master-0 kubenswrapper[8731]: I1205 12:31:41.889012 8731 reflector.go:368] Caches 
populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 12:31:41.890339 master-0 kubenswrapper[8731]: I1205 12:31:41.890257 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6" volumeName="kubernetes.io/configmap/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-iptables-alerter-script" seLinuxMountContext="" Dec 05 12:31:41.890339 master-0 kubenswrapper[8731]: I1205 12:31:41.890330 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f119ffe4-16bd-49eb-916d-b18ba0d79b54" volumeName="kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-service-ca" seLinuxMountContext="" Dec 05 12:31:41.890429 master-0 kubenswrapper[8731]: I1205 12:31:41.890344 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f3792522-fec6-4022-90ac-0b8467fcd625" volumeName="kubernetes.io/secret/f3792522-fec6-4022-90ac-0b8467fcd625-serving-cert" seLinuxMountContext="" Dec 05 12:31:41.890429 master-0 kubenswrapper[8731]: I1205 12:31:41.890357 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a2acba71-b9dc-4b85-be35-c995b8be2f19" volumeName="kubernetes.io/projected/a2acba71-b9dc-4b85-be35-c995b8be2f19-kube-api-access-nml2g" seLinuxMountContext="" Dec 05 12:31:41.890429 master-0 kubenswrapper[8731]: I1205 12:31:41.890369 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58187662-b502-4d90-95ce-2aa91a81d256" volumeName="kubernetes.io/projected/58187662-b502-4d90-95ce-2aa91a81d256-kube-api-access-ps4ws" seLinuxMountContext="" Dec 05 12:31:41.890429 master-0 kubenswrapper[8731]: I1205 12:31:41.890379 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" volumeName="kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-kube-api-access-fxxw7" seLinuxMountContext="" Dec 05 12:31:41.890429 master-0 kubenswrapper[8731]: I1205 12:31:41.890390 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8233dad-bd19-4842-a4d5-cfa84f1feb83" volumeName="kubernetes.io/projected/b8233dad-bd19-4842-a4d5-cfa84f1feb83-kube-api-access-mvbfq" seLinuxMountContext="" Dec 05 12:31:41.890429 master-0 kubenswrapper[8731]: I1205 12:31:41.890400 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f119ffe4-16bd-49eb-916d-b18ba0d79b54" volumeName="kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-ca" seLinuxMountContext="" Dec 05 12:31:41.890429 master-0 kubenswrapper[8731]: I1205 12:31:41.890432 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f119ffe4-16bd-49eb-916d-b18ba0d79b54" volumeName="kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-client" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890443 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f3792522-fec6-4022-90ac-0b8467fcd625" volumeName="kubernetes.io/projected/f3792522-fec6-4022-90ac-0b8467fcd625-kube-api-access-flxbg" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890455 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="49760d62-02e5-4882-b47f-663102b04946" volumeName="kubernetes.io/projected/49760d62-02e5-4882-b47f-663102b04946-kube-api-access-26x2z" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890467 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9" volumeName="kubernetes.io/projected/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-kube-api-access-dtvzs" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890476 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="807d9093-aa67-4840-b5be-7f3abcc1beed" volumeName="kubernetes.io/configmap/807d9093-aa67-4840-b5be-7f3abcc1beed-config" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890488 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce3d73c1-f4bd-4c91-936a-086dfa5e3460" volumeName="kubernetes.io/secret/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-cluster-olm-operator-serving-cert" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890499 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f119ffe4-16bd-49eb-916d-b18ba0d79b54" volumeName="kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-config" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890510 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9" volumeName="kubernetes.io/configmap/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-config" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890523 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4492c55f-701b-4ec8-ada1-0a5dc126d405" volumeName="kubernetes.io/projected/4492c55f-701b-4ec8-ada1-0a5dc126d405-kube-api-access-dmq98" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890533 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1871a9d6-6369-4d08-816f-9c6310b61ddf" volumeName="kubernetes.io/projected/1871a9d6-6369-4d08-816f-9c6310b61ddf-kube-api-access" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890543 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58187662-b502-4d90-95ce-2aa91a81d256" volumeName="kubernetes.io/configmap/58187662-b502-4d90-95ce-2aa91a81d256-telemetry-config" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890554 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="708bf629-9949-4b79-a88a-c73ba033475b" volumeName="kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-sysctl-allowlist" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890564 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a757f807-e1bf-4f1e-9787-6b4acc8d09cf" volumeName="kubernetes.io/secret/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 
kubenswrapper[8731]: I1205 12:31:41.890574 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d53a4886-db25-43a1-825a-66a9a9a58590" volumeName="kubernetes.io/secret/d53a4886-db25-43a1-825a-66a9a9a58590-serving-cert" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890609 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1e6babfe-724a-4eab-bb3b-bc318bf57b70" volumeName="kubernetes.io/projected/1e6babfe-724a-4eab-bb3b-bc318bf57b70-kube-api-access-c2gd8" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890618 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4492c55f-701b-4ec8-ada1-0a5dc126d405" volumeName="kubernetes.io/secret/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovn-node-metrics-cert" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890627 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="708bf629-9949-4b79-a88a-c73ba033475b" volumeName="kubernetes.io/projected/708bf629-9949-4b79-a88a-c73ba033475b-kube-api-access-6vx2z" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890637 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="807d9093-aa67-4840-b5be-7f3abcc1beed" volumeName="kubernetes.io/secret/807d9093-aa67-4840-b5be-7f3abcc1beed-serving-cert" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890651 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba095394-1873-4793-969d-3be979fa0771" volumeName="kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-config" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890662 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f119ffe4-16bd-49eb-916d-b18ba0d79b54" volumeName="kubernetes.io/projected/f119ffe4-16bd-49eb-916d-b18ba0d79b54-kube-api-access-wwcr9" seLinuxMountContext="" Dec 05 12:31:41.890669 master-0 kubenswrapper[8731]: I1205 12:31:41.890673 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1871a9d6-6369-4d08-816f-9c6310b61ddf" volumeName="kubernetes.io/secret/1871a9d6-6369-4d08-816f-9c6310b61ddf-serving-cert" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890686 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f119ffe4-16bd-49eb-916d-b18ba0d79b54" volumeName="kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-serving-cert" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890718 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce3d73c1-f4bd-4c91-936a-086dfa5e3460" volumeName="kubernetes.io/projected/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-kube-api-access-ph9w6" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890730 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4b7f0d8d-a2bf-4550-b6e6-1c56adae827e" 
volumeName="kubernetes.io/projected/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-kube-api-access-xtjln" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890744 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="594aaded-5615-4bed-87ee-6173059a73be" volumeName="kubernetes.io/secret/594aaded-5615-4bed-87ee-6173059a73be-serving-cert" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890756 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="807d9093-aa67-4840-b5be-7f3abcc1beed" volumeName="kubernetes.io/projected/807d9093-aa67-4840-b5be-7f3abcc1beed-kube-api-access" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890768 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce3d73c1-f4bd-4c91-936a-086dfa5e3460" volumeName="kubernetes.io/empty-dir/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-operand-assets" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890779 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d53a4886-db25-43a1-825a-66a9a9a58590" volumeName="kubernetes.io/projected/d53a4886-db25-43a1-825a-66a9a9a58590-kube-api-access-2tngh" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890791 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0dda6d9b-cb3a-413a-85af-ef08f15ea42e" volumeName="kubernetes.io/projected/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-kube-api-access-62nqj" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890803 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4492c55f-701b-4ec8-ada1-0a5dc126d405" volumeName="kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-script-lib" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890814 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" volumeName="kubernetes.io/configmap/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-trusted-ca" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890826 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba095394-1873-4793-969d-3be979fa0771" volumeName="kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-trusted-ca-bundle" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890837 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29812c4b-48ac-488c-863c-1d52e39ea2ae" volumeName="kubernetes.io/projected/29812c4b-48ac-488c-863c-1d52e39ea2ae-kube-api-access" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890849 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8233dad-bd19-4842-a4d5-cfa84f1feb83" volumeName="kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-env-overrides" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890861 8731 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="f725fa37-ef11-479a-8cf9-f4b90fe5e7a1" volumeName="kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cni-binary-copy" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890873 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29812c4b-48ac-488c-863c-1d52e39ea2ae" volumeName="kubernetes.io/configmap/29812c4b-48ac-488c-863c-1d52e39ea2ae-service-ca" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890886 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4492c55f-701b-4ec8-ada1-0a5dc126d405" volumeName="kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-env-overrides" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890897 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4b7f0d8d-a2bf-4550-b6e6-1c56adae827e" volumeName="kubernetes.io/configmap/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-config" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890908 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="594aaded-5615-4bed-87ee-6173059a73be" volumeName="kubernetes.io/projected/594aaded-5615-4bed-87ee-6173059a73be-kube-api-access" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890919 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d53a4886-db25-43a1-825a-66a9a9a58590" volumeName="kubernetes.io/configmap/d53a4886-db25-43a1-825a-66a9a9a58590-config" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890931 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f725fa37-ef11-479a-8cf9-f4b90fe5e7a1" volumeName="kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-daemon-config" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890943 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb7003a6-4341-49eb-bec3-76ba8610fa12" volumeName="kubernetes.io/projected/fb7003a6-4341-49eb-bec3-76ba8610fa12-kube-api-access-69n5s" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890955 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4492c55f-701b-4ec8-ada1-0a5dc126d405" volumeName="kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-config" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890966 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="38941513-e968-45f1-9cb2-b63d40338f36" volumeName="kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-kube-api-access-t5hdg" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.890982 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f0c6889-0739-48a3-99cd-6db9d1f83242" volumeName="kubernetes.io/projected/5f0c6889-0739-48a3-99cd-6db9d1f83242-kube-api-access-p5p5d" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 
kubenswrapper[8731]: I1205 12:31:41.891033 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="708bf629-9949-4b79-a88a-c73ba033475b" volumeName="kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-binary-copy" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.891054 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" volumeName="kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-bound-sa-token" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.891068 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a757f807-e1bf-4f1e-9787-6b4acc8d09cf" volumeName="kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovnkube-config" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.891085 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8233dad-bd19-4842-a4d5-cfa84f1feb83" volumeName="kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-ovnkube-identity-cm" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.891097 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cfc37275-4e59-4f73-8b08-c8ca8ec28bbb" volumeName="kubernetes.io/projected/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-kube-api-access-z6mb6" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.891107 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="38941513-e968-45f1-9cb2-b63d40338f36" volumeName="kubernetes.io/configmap/38941513-e968-45f1-9cb2-b63d40338f36-trusted-ca" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.891120 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4b7f0d8d-a2bf-4550-b6e6-1c56adae827e" volumeName="kubernetes.io/secret/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-serving-cert" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.891135 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="708bf629-9949-4b79-a88a-c73ba033475b" volumeName="kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-whereabouts-configmap" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.891150 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a757f807-e1bf-4f1e-9787-6b4acc8d09cf" volumeName="kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-env-overrides" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.891163 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a757f807-e1bf-4f1e-9787-6b4acc8d09cf" volumeName="kubernetes.io/projected/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-kube-api-access-9z8h9" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.891190 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8233dad-bd19-4842-a4d5-cfa84f1feb83" 
volumeName="kubernetes.io/secret/b8233dad-bd19-4842-a4d5-cfa84f1feb83-webhook-cert" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.891205 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba095394-1873-4793-969d-3be979fa0771" volumeName="kubernetes.io/projected/ba095394-1873-4793-969d-3be979fa0771-kube-api-access-55qpg" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.891216 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1e6babfe-724a-4eab-bb3b-bc318bf57b70" volumeName="kubernetes.io/configmap/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-trusted-ca" seLinuxMountContext="" Dec 05 12:31:41.891502 master-0 kubenswrapper[8731]: I1205 12:31:41.891227 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="38941513-e968-45f1-9cb2-b63d40338f36" volumeName="kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-bound-sa-token" seLinuxMountContext="" Dec 05 12:31:41.892494 master-0 kubenswrapper[8731]: I1205 12:31:41.891244 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a2acba71-b9dc-4b85-be35-c995b8be2f19" volumeName="kubernetes.io/configmap/a2acba71-b9dc-4b85-be35-c995b8be2f19-trusted-ca" seLinuxMountContext="" Dec 05 12:31:41.892494 master-0 kubenswrapper[8731]: I1205 12:31:41.891585 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6" volumeName="kubernetes.io/projected/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-kube-api-access-mlnqb" seLinuxMountContext="" Dec 05 12:31:41.892494 master-0 kubenswrapper[8731]: I1205 12:31:41.891597 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f3792522-fec6-4022-90ac-0b8467fcd625" volumeName="kubernetes.io/configmap/f3792522-fec6-4022-90ac-0b8467fcd625-config" seLinuxMountContext="" Dec 05 12:31:41.892494 master-0 kubenswrapper[8731]: I1205 12:31:41.891608 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1871a9d6-6369-4d08-816f-9c6310b61ddf" volumeName="kubernetes.io/configmap/1871a9d6-6369-4d08-816f-9c6310b61ddf-config" seLinuxMountContext="" Dec 05 12:31:41.892494 master-0 kubenswrapper[8731]: I1205 12:31:41.891620 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f4a70855-80b5-4d6a-bed1-b42364940de0" volumeName="kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l" seLinuxMountContext="" Dec 05 12:31:41.892494 master-0 kubenswrapper[8731]: I1205 12:31:41.891637 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba095394-1873-4793-969d-3be979fa0771" volumeName="kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-service-ca-bundle" seLinuxMountContext="" Dec 05 12:31:41.892494 master-0 kubenswrapper[8731]: I1205 12:31:41.891649 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5efad170-c154-42ec-a7c0-b36a98d2bfcc" volumeName="kubernetes.io/projected/5efad170-c154-42ec-a7c0-b36a98d2bfcc-kube-api-access-996h9" seLinuxMountContext="" Dec 05 12:31:41.892494 master-0 kubenswrapper[8731]: I1205 12:31:41.891661 8731 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="5efad170-c154-42ec-a7c0-b36a98d2bfcc" volumeName="kubernetes.io/secret/5efad170-c154-42ec-a7c0-b36a98d2bfcc-metrics-tls" seLinuxMountContext="" Dec 05 12:31:41.892494 master-0 kubenswrapper[8731]: I1205 12:31:41.891690 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9" volumeName="kubernetes.io/secret/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-serving-cert" seLinuxMountContext="" Dec 05 12:31:41.892494 master-0 kubenswrapper[8731]: I1205 12:31:41.891702 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba095394-1873-4793-969d-3be979fa0771" volumeName="kubernetes.io/secret/ba095394-1873-4793-969d-3be979fa0771-serving-cert" seLinuxMountContext="" Dec 05 12:31:41.892494 master-0 kubenswrapper[8731]: I1205 12:31:41.891713 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f725fa37-ef11-479a-8cf9-f4b90fe5e7a1" volumeName="kubernetes.io/projected/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-kube-api-access-x59kd" seLinuxMountContext="" Dec 05 12:31:41.892494 master-0 kubenswrapper[8731]: I1205 12:31:41.891757 8731 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="594aaded-5615-4bed-87ee-6173059a73be" volumeName="kubernetes.io/configmap/594aaded-5615-4bed-87ee-6173059a73be-config" seLinuxMountContext="" Dec 05 12:31:41.892494 master-0 kubenswrapper[8731]: I1205 12:31:41.891766 8731 reconstruct.go:97] "Volume reconstruction finished" Dec 05 12:31:41.892494 master-0 kubenswrapper[8731]: I1205 12:31:41.891773 8731 reconciler.go:26] "Reconciler: start to sync state" Dec 05 12:31:41.894897 master-0 kubenswrapper[8731]: I1205 12:31:41.894835 8731 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 05 12:31:41.929404 master-0 kubenswrapper[8731]: I1205 12:31:41.929322 8731 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 05 12:31:41.933306 master-0 kubenswrapper[8731]: I1205 12:31:41.933245 8731 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 05 12:31:41.933306 master-0 kubenswrapper[8731]: I1205 12:31:41.933310 8731 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 05 12:31:41.933418 master-0 kubenswrapper[8731]: I1205 12:31:41.933343 8731 kubelet.go:2335] "Starting kubelet main sync loop" Dec 05 12:31:41.933559 master-0 kubenswrapper[8731]: E1205 12:31:41.933503 8731 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 05 12:31:41.935810 master-0 kubenswrapper[8731]: I1205 12:31:41.935749 8731 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 12:31:41.946996 master-0 kubenswrapper[8731]: I1205 12:31:41.946939 8731 generic.go:334] "Generic (PLEG): container finished" podID="d75143d9bc4a2dc15781dc51ccff632a" containerID="697d3c24c504f4edabe923e2993cba7e7017b70ed34b4cb71d455e86377b9334" exitCode=0 Dec 05 12:31:41.950984 master-0 kubenswrapper[8731]: I1205 12:31:41.950932 8731 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="123ca114b6002ab3cd24848ba210c8015d871a3bf5c2f6653a7daa022e0dea48" exitCode=1 Dec 05 12:31:41.957416 master-0 kubenswrapper[8731]: I1205 12:31:41.957353 8731 generic.go:334] "Generic (PLEG): container finished" podID="b58aa15d-cdea-4a90-ba40-706d6a85735e" containerID="87e2f0751f7349d9f2700480abbb17089facf86a7329bd4aecf04d7f2bed205a" exitCode=0 Dec 05 12:31:41.961963 master-0 kubenswrapper[8731]: I1205 12:31:41.961926 8731 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="48cc412fc0495a9b989b3163afe32a67e585bd82e370a59d4690f30fe1abc9dc" exitCode=0 Dec 05 12:31:41.962057 master-0 kubenswrapper[8731]: I1205 12:31:41.961964 8731 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="d98d05970b7b2ac04c6af16edb9c07e4ea790e687fa82b42828f83752f9655a5" exitCode=0 Dec 05 12:31:41.962057 master-0 kubenswrapper[8731]: I1205 12:31:41.961978 8731 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="678a3e3b29045fc802f2f4ea9939ca067adfe6ff12b24bb2dd5f895390e55a41" exitCode=0 Dec 05 12:31:41.962057 master-0 kubenswrapper[8731]: I1205 12:31:41.961987 8731 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="46d777da61d52678086a53c15e814977a05f1e509e1945fa53a5e65cac047f51" exitCode=0 Dec 05 12:31:41.962057 master-0 kubenswrapper[8731]: I1205 12:31:41.961998 8731 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="d81a6813a03e38c556e737371d737471f12aa2c77281926715e2cfe7ffc056aa" exitCode=0 Dec 05 12:31:41.962057 master-0 kubenswrapper[8731]: I1205 12:31:41.962008 8731 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="503a0b99be77d72f51d7afcf8403bc7d040b77fef62f126cd910c2ff4b520892" exitCode=0 Dec 05 12:31:41.963846 master-0 kubenswrapper[8731]: I1205 12:31:41.963800 8731 generic.go:334] "Generic (PLEG): container finished" podID="e7807b90-1059-4c0d-9224-a0d57a572bfc" containerID="ded126662555b11ef5f6022975feef5471a12cb6870d5933adf38dcb51422cc7" exitCode=0 Dec 05 12:31:41.986541 master-0 kubenswrapper[8731]: I1205 12:31:41.986477 8731 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/2.log" Dec 05 12:31:41.987049 master-0 kubenswrapper[8731]: I1205 12:31:41.986981 8731 generic.go:334] "Generic (PLEG): container finished" podID="3169f44496ed8a28c6d6a15511ab0eec" containerID="1071666e1f17d7cf23c890a67d65947ba5ea19368b6f26e80669ec0f695e375b" exitCode=1 Dec 05 12:31:41.987049 master-0 kubenswrapper[8731]: I1205 12:31:41.987035 8731 generic.go:334] "Generic (PLEG): container finished" podID="3169f44496ed8a28c6d6a15511ab0eec" containerID="419ef08e96de1310d58b89d9dc91be12123ef06b7cf2f7b293589e349077e04c" exitCode=0 Dec 05 12:31:41.994873 master-0 kubenswrapper[8731]: I1205 12:31:41.994839 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5nqhk_f725fa37-ef11-479a-8cf9-f4b90fe5e7a1/kube-multus/0.log" Dec 05 12:31:41.994991 master-0 kubenswrapper[8731]: I1205 12:31:41.994887 8731 generic.go:334] "Generic (PLEG): container finished" podID="f725fa37-ef11-479a-8cf9-f4b90fe5e7a1" containerID="60d32869d5d76c04555375fdfd9ab0f008a07a41f85b96737cde09fadc0deeb4" exitCode=1 Dec 05 12:31:42.004068 master-0 kubenswrapper[8731]: I1205 12:31:42.004022 8731 generic.go:334] "Generic (PLEG): container finished" podID="4492c55f-701b-4ec8-ada1-0a5dc126d405" containerID="1e1ba9d3a2cd6fc3c76c6b40cc81f5a9fa8707214a43505b547185529870eae9" exitCode=0 Dec 05 12:31:42.034523 master-0 kubenswrapper[8731]: E1205 12:31:42.034457 8731 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 05 12:31:42.047958 master-0 kubenswrapper[8731]: I1205 12:31:42.047906 8731 manager.go:324] Recovery completed Dec 05 12:31:42.092746 master-0 kubenswrapper[8731]: I1205 12:31:42.092690 8731 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 05 12:31:42.092746 master-0 kubenswrapper[8731]: I1205 12:31:42.092738 8731 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 05 12:31:42.092930 master-0 kubenswrapper[8731]: I1205 12:31:42.092823 8731 state_mem.go:36] "Initialized new in-memory state store" Dec 05 12:31:42.093164 master-0 kubenswrapper[8731]: I1205 12:31:42.093139 8731 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 05 12:31:42.093228 master-0 kubenswrapper[8731]: I1205 12:31:42.093161 8731 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 05 12:31:42.093228 master-0 kubenswrapper[8731]: I1205 12:31:42.093219 8731 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Dec 05 12:31:42.093228 master-0 kubenswrapper[8731]: I1205 12:31:42.093228 8731 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Dec 05 12:31:42.093311 master-0 kubenswrapper[8731]: I1205 12:31:42.093239 8731 policy_none.go:49] "None policy: Start" Dec 05 12:31:42.095193 master-0 kubenswrapper[8731]: I1205 12:31:42.095139 8731 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 05 12:31:42.095253 master-0 kubenswrapper[8731]: I1205 12:31:42.095197 8731 state_mem.go:35] "Initializing new in-memory state store" Dec 05 12:31:42.095488 master-0 kubenswrapper[8731]: I1205 12:31:42.095457 8731 state_mem.go:75] "Updated machine memory state" Dec 05 12:31:42.095488 master-0 kubenswrapper[8731]: I1205 12:31:42.095473 8731 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Dec 05 12:31:42.108741 master-0 kubenswrapper[8731]: I1205 12:31:42.108616 8731 manager.go:334] "Starting Device Plugin manager" Dec 05 
12:31:42.109011 master-0 kubenswrapper[8731]: I1205 12:31:42.108770 8731 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 05 12:31:42.109011 master-0 kubenswrapper[8731]: I1205 12:31:42.108792 8731 server.go:79] "Starting device plugin registration server" Dec 05 12:31:42.109554 master-0 kubenswrapper[8731]: I1205 12:31:42.109524 8731 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 05 12:31:42.109609 master-0 kubenswrapper[8731]: I1205 12:31:42.109549 8731 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 05 12:31:42.109752 master-0 kubenswrapper[8731]: I1205 12:31:42.109701 8731 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 05 12:31:42.109909 master-0 kubenswrapper[8731]: I1205 12:31:42.109874 8731 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 05 12:31:42.109909 master-0 kubenswrapper[8731]: I1205 12:31:42.109891 8731 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 05 12:31:42.211010 master-0 kubenswrapper[8731]: I1205 12:31:42.210946 8731 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:31:42.214295 master-0 kubenswrapper[8731]: I1205 12:31:42.213343 8731 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:31:42.214376 master-0 kubenswrapper[8731]: I1205 12:31:42.214326 8731 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:31:42.214376 master-0 kubenswrapper[8731]: I1205 12:31:42.214342 8731 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:31:42.214524 master-0 kubenswrapper[8731]: I1205 12:31:42.214506 8731 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 05 12:31:42.226583 master-0 kubenswrapper[8731]: I1205 12:31:42.226511 8731 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Dec 05 12:31:42.227090 master-0 kubenswrapper[8731]: I1205 12:31:42.227046 8731 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Dec 05 12:31:42.235634 master-0 kubenswrapper[8731]: I1205 12:31:42.235416 8731 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"] Dec 05 12:31:42.237393 master-0 kubenswrapper[8731]: I1205 12:31:42.237291 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"d75143d9bc4a2dc15781dc51ccff632a","Type":"ContainerStarted","Data":"17618b3a98b21ba173e16cc99dae400fa4afea110eb46e1cd0bececa0e704d0d"} Dec 05 12:31:42.237472 master-0 kubenswrapper[8731]: I1205 12:31:42.237401 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"d75143d9bc4a2dc15781dc51ccff632a","Type":"ContainerStarted","Data":"8afe0da63d99f2297054afe39b61890ca549453e8d197ef5a9c1c3976a1f2afc"} Dec 05 12:31:42.237472 master-0 kubenswrapper[8731]: I1205 12:31:42.237418 8731 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"d75143d9bc4a2dc15781dc51ccff632a","Type":"ContainerDied","Data":"697d3c24c504f4edabe923e2993cba7e7017b70ed34b4cb71d455e86377b9334"} Dec 05 12:31:42.237472 master-0 kubenswrapper[8731]: I1205 12:31:42.237437 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"d75143d9bc4a2dc15781dc51ccff632a","Type":"ContainerStarted","Data":"e5863ee594cde86aeedce8416be9b249f569b2f49267eb70875c7f8a2e451e4e"} Dec 05 12:31:42.237472 master-0 kubenswrapper[8731]: I1205 12:31:42.237451 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18"} Dec 05 12:31:42.237472 master-0 kubenswrapper[8731]: I1205 12:31:42.237465 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"878914476f342bbe09935d11750836541a3cd256e73418d2dbee280993c5f191"} Dec 05 12:31:42.237472 master-0 kubenswrapper[8731]: I1205 12:31:42.237478 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerDied","Data":"123ca114b6002ab3cd24848ba210c8015d871a3bf5c2f6653a7daa022e0dea48"} Dec 05 12:31:42.237640 master-0 kubenswrapper[8731]: I1205 12:31:42.237496 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"34418050489e8b48781fa5128a0548228f5bdb58f7e6a5f88226bbd7dacf7bb5"} Dec 05 12:31:42.237640 master-0 kubenswrapper[8731]: I1205 12:31:42.237526 8731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3564c4f7be418f0fb673bf15e0683bd1a8124446576d69fea4a76db52877c172" Dec 05 12:31:42.237640 master-0 kubenswrapper[8731]: I1205 12:31:42.237562 8731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75e9032b6429595bd1a6f97d2cb17682f851991bfdb1cc650ef529d5407494ac" Dec 05 12:31:42.237640 master-0 kubenswrapper[8731]: I1205 12:31:42.237581 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerStarted","Data":"8c7e83119fdbf7fba596a8756e22362ec175fbd883171a7a50b5c673c4302ba8"} Dec 05 12:31:42.237640 master-0 kubenswrapper[8731]: I1205 12:31:42.237594 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerStarted","Data":"1bd06826dd54922214ff0bdf4dd49e3e4fb5917fe2431fd30da1ce39eb71cae2"} Dec 05 12:31:42.237640 master-0 kubenswrapper[8731]: I1205 12:31:42.237613 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"cc0396a9a2689b3e8c132c12640cbe83","Type":"ContainerStarted","Data":"296f9752095436403474f93df276faa705635dd48e13c86d863312c7d94b3954"} Dec 05 12:31:42.237640 master-0 kubenswrapper[8731]: I1205 12:31:42.237626 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" 
event={"ID":"cc0396a9a2689b3e8c132c12640cbe83","Type":"ContainerStarted","Data":"bc0f8f75cee3cab2f35245110c53e2d7aee426e9d1f8fd832cda99c84f270715"} Dec 05 12:31:42.237640 master-0 kubenswrapper[8731]: I1205 12:31:42.237642 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"cc0396a9a2689b3e8c132c12640cbe83","Type":"ContainerStarted","Data":"e0816ccdefc3d19a555c704cf7914804a097b5a95e2655805ebd92880ab7a03f"} Dec 05 12:31:42.237860 master-0 kubenswrapper[8731]: I1205 12:31:42.237656 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerStarted","Data":"f9824f2538239be2916d2115cdd6e15355f5d12571e5c02316bdba7857f30ff8"} Dec 05 12:31:42.237860 master-0 kubenswrapper[8731]: I1205 12:31:42.237672 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerDied","Data":"1071666e1f17d7cf23c890a67d65947ba5ea19368b6f26e80669ec0f695e375b"} Dec 05 12:31:42.237860 master-0 kubenswrapper[8731]: I1205 12:31:42.237738 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerDied","Data":"419ef08e96de1310d58b89d9dc91be12123ef06b7cf2f7b293589e349077e04c"} Dec 05 12:31:42.237860 master-0 kubenswrapper[8731]: I1205 12:31:42.237751 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerStarted","Data":"5a89fdcb31a57b509eb73373840f305ff5d3039dc4adac822b9b40350179af76"} Dec 05 12:31:42.252375 master-0 kubenswrapper[8731]: W1205 12:31:42.252150 8731 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Dec 05 12:31:42.252375 master-0 kubenswrapper[8731]: E1205 12:31:42.252246 8731 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:42.252375 master-0 kubenswrapper[8731]: E1205 12:31:42.252293 8731 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:31:42.252375 master-0 kubenswrapper[8731]: E1205 12:31:42.252332 8731 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:31:42.252804 master-0 kubenswrapper[8731]: E1205 
12:31:42.252748 8731 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:31:42.256167 master-0 kubenswrapper[8731]: E1205 12:31:42.256137 8731 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.296154 master-0 kubenswrapper[8731]: I1205 12:31:42.296092 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:42.296154 master-0 kubenswrapper[8731]: I1205 12:31:42.296166 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:42.296512 master-0 kubenswrapper[8731]: I1205 12:31:42.296233 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:42.296512 master-0 kubenswrapper[8731]: I1205 12:31:42.296327 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.296620 master-0 kubenswrapper[8731]: I1205 12:31:42.296566 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.296669 master-0 kubenswrapper[8731]: I1205 12:31:42.296644 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:42.296706 master-0 kubenswrapper[8731]: I1205 12:31:42.296683 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:31:42.296738 master-0 
kubenswrapper[8731]: I1205 12:31:42.296715 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:31:42.296793 master-0 kubenswrapper[8731]: I1205 12:31:42.296744 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:31:42.296827 master-0 kubenswrapper[8731]: I1205 12:31:42.296788 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:31:42.296857 master-0 kubenswrapper[8731]: I1205 12:31:42.296842 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.296890 master-0 kubenswrapper[8731]: I1205 12:31:42.296875 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.296921 master-0 kubenswrapper[8731]: I1205 12:31:42.296909 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:42.297001 master-0 kubenswrapper[8731]: I1205 12:31:42.296968 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:31:42.297200 master-0 kubenswrapper[8731]: I1205 12:31:42.297015 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:31:42.297200 master-0 kubenswrapper[8731]: I1205 12:31:42.297073 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: 
\"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.297200 master-0 kubenswrapper[8731]: I1205 12:31:42.297142 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.397894 master-0 kubenswrapper[8731]: I1205 12:31:42.397758 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:31:42.397894 master-0 kubenswrapper[8731]: I1205 12:31:42.397839 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.398133 master-0 kubenswrapper[8731]: I1205 12:31:42.397958 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:31:42.398133 master-0 kubenswrapper[8731]: I1205 12:31:42.398015 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.398133 master-0 kubenswrapper[8731]: I1205 12:31:42.398054 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:31:42.398133 master-0 kubenswrapper[8731]: I1205 12:31:42.398059 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.398133 master-0 kubenswrapper[8731]: I1205 12:31:42.398075 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:31:42.398312 master-0 kubenswrapper[8731]: I1205 12:31:42.398225 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.398312 master-0 kubenswrapper[8731]: I1205 12:31:42.398248 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:31:42.398373 master-0 kubenswrapper[8731]: I1205 12:31:42.398335 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:31:42.398373 master-0 kubenswrapper[8731]: I1205 12:31:42.398344 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.398436 master-0 kubenswrapper[8731]: I1205 12:31:42.398387 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:31:42.398436 master-0 kubenswrapper[8731]: I1205 12:31:42.398271 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs\") pod \"etcd-master-0-master-0\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:31:42.398510 master-0 kubenswrapper[8731]: I1205 12:31:42.398404 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.398510 master-0 kubenswrapper[8731]: I1205 12:31:42.398425 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.398510 master-0 kubenswrapper[8731]: I1205 12:31:42.398468 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.398510 master-0 kubenswrapper[8731]: I1205 12:31:42.398493 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:42.398659 master-0 kubenswrapper[8731]: I1205 12:31:42.398530 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:31:42.398659 master-0 kubenswrapper[8731]: I1205 12:31:42.398557 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:42.398659 master-0 kubenswrapper[8731]: I1205 12:31:42.398584 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.398659 master-0 kubenswrapper[8731]: I1205 12:31:42.398584 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:42.398659 master-0 kubenswrapper[8731]: I1205 12:31:42.398611 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.398659 master-0 kubenswrapper[8731]: I1205 12:31:42.398627 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:42.398659 master-0 kubenswrapper[8731]: I1205 12:31:42.398640 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:42.398659 master-0 kubenswrapper[8731]: I1205 12:31:42.398654 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.398907 master-0 
kubenswrapper[8731]: I1205 12:31:42.398679 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:42.398907 master-0 kubenswrapper[8731]: I1205 12:31:42.398687 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:31:42.398907 master-0 kubenswrapper[8731]: I1205 12:31:42.398721 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:42.398907 master-0 kubenswrapper[8731]: I1205 12:31:42.398720 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:42.398907 master-0 kubenswrapper[8731]: I1205 12:31:42.398728 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:42.398907 master-0 kubenswrapper[8731]: I1205 12:31:42.398774 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:42.398907 master-0 kubenswrapper[8731]: I1205 12:31:42.398747 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:42.398907 master-0 kubenswrapper[8731]: I1205 12:31:42.398817 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:31:42.398907 master-0 kubenswrapper[8731]: I1205 12:31:42.398871 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: 
\"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:31:42.858215 master-0 kubenswrapper[8731]: I1205 12:31:42.858121 8731 apiserver.go:52] "Watching apiserver" Dec 05 12:31:42.867384 master-0 kubenswrapper[8731]: I1205 12:31:42.867309 8731 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 12:31:42.868429 master-0 kubenswrapper[8731]: I1205 12:31:42.868368 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0","openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-multus/multus-5nqhk","openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw","openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc","openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c","openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt","openshift-multus/multus-additional-cni-plugins-prt97","openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs","openshift-etcd/etcd-master-0-master-0","openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5","openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z","openshift-network-node-identity/network-node-identity-xwx26","assisted-installer/assisted-installer-controller-m6pn4","kube-system/bootstrap-kube-controller-manager-master-0","openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz","openshift-ingress-operator/ingress-operator-8649c48786-7xrk6","openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp","openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq","openshift-network-diagnostics/network-check-target-qsggt","openshift-network-operator/iptables-alerter-nwplt","openshift-multus/network-metrics-daemon-99djw","openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh","openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24","openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv","openshift-marketplace/marketplace-operator-f797b99b6-vwhxt","openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv","openshift-network-operator/network-operator-79767b7ff9-h8qkj","openshift-ovn-kubernetes/ovnkube-node-9vqtb"] Dec 05 12:31:42.868708 master-0 kubenswrapper[8731]: I1205 12:31:42.868664 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:31:42.869741 master-0 kubenswrapper[8731]: I1205 12:31:42.869474 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:42.869741 master-0 kubenswrapper[8731]: I1205 12:31:42.869592 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:42.870585 master-0 kubenswrapper[8731]: I1205 12:31:42.870115 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:42.870694 master-0 kubenswrapper[8731]: I1205 12:31:42.870647 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:42.870881 master-0 kubenswrapper[8731]: I1205 12:31:42.870842 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:42.870881 master-0 kubenswrapper[8731]: I1205 12:31:42.870854 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:42.870996 master-0 kubenswrapper[8731]: I1205 12:31:42.870950 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:42.871222 master-0 kubenswrapper[8731]: I1205 12:31:42.871168 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:42.872467 master-0 kubenswrapper[8731]: I1205 12:31:42.872219 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 12:31:42.876121 master-0 kubenswrapper[8731]: I1205 12:31:42.875210 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 12:31:42.876121 master-0 kubenswrapper[8731]: I1205 12:31:42.875267 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 12:31:42.876121 master-0 kubenswrapper[8731]: I1205 12:31:42.875383 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 12:31:42.876121 master-0 kubenswrapper[8731]: I1205 12:31:42.875482 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 12:31:42.876121 master-0 kubenswrapper[8731]: I1205 12:31:42.875250 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 12:31:42.876121 master-0 kubenswrapper[8731]: I1205 12:31:42.875566 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 12:31:42.876121 master-0 kubenswrapper[8731]: I1205 12:31:42.875660 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 12:31:42.876121 master-0 kubenswrapper[8731]: I1205 12:31:42.875711 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 12:31:42.876121 master-0 kubenswrapper[8731]: I1205 12:31:42.875727 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:42.876121 master-0 kubenswrapper[8731]: I1205 12:31:42.876112 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 12:31:42.878691 master-0 kubenswrapper[8731]: I1205 12:31:42.876713 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:42.884149 master-0 kubenswrapper[8731]: I1205 12:31:42.883729 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 12:31:42.886006 master-0 kubenswrapper[8731]: I1205 12:31:42.885973 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 12:31:42.886266 master-0 kubenswrapper[8731]: I1205 12:31:42.886197 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 12:31:42.886350 master-0 kubenswrapper[8731]: I1205 12:31:42.886327 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 12:31:42.886435 master-0 kubenswrapper[8731]: I1205 12:31:42.886342 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 12:31:42.886435 master-0 kubenswrapper[8731]: I1205 12:31:42.886433 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 12:31:42.886530 master-0 kubenswrapper[8731]: I1205 12:31:42.886471 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 12:31:42.886530 master-0 kubenswrapper[8731]: I1205 12:31:42.886521 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 12:31:42.886595 master-0 kubenswrapper[8731]: I1205 12:31:42.886547 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 12:31:42.886595 master-0 kubenswrapper[8731]: I1205 12:31:42.886556 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Dec 05 12:31:42.886694 master-0 kubenswrapper[8731]: I1205 12:31:42.886665 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 12:31:42.886694 master-0 kubenswrapper[8731]: I1205 12:31:42.886678 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 12:31:42.886751 master-0 kubenswrapper[8731]: I1205 12:31:42.886713 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 12:31:42.886751 master-0 kubenswrapper[8731]: I1205 12:31:42.886729 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 12:31:42.886751 master-0 kubenswrapper[8731]: I1205 12:31:42.886746 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 12:31:42.886751 
master-0 kubenswrapper[8731]: I1205 12:31:42.886762 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Dec 05 12:31:42.886880 master-0 kubenswrapper[8731]: I1205 12:31:42.886805 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 12:31:42.886880 master-0 kubenswrapper[8731]: I1205 12:31:42.886851 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 12:31:42.886880 master-0 kubenswrapper[8731]: I1205 12:31:42.886877 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Dec 05 12:31:42.886958 master-0 kubenswrapper[8731]: I1205 12:31:42.886894 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 12:31:42.887065 master-0 kubenswrapper[8731]: I1205 12:31:42.887046 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 12:31:42.887065 master-0 kubenswrapper[8731]: I1205 12:31:42.887057 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 12:31:42.887130 master-0 kubenswrapper[8731]: I1205 12:31:42.887050 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 12:31:42.887220 master-0 kubenswrapper[8731]: I1205 12:31:42.887167 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 12:31:42.887311 master-0 kubenswrapper[8731]: I1205 12:31:42.887290 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 12:31:42.887380 master-0 kubenswrapper[8731]: I1205 12:31:42.887235 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 12:31:42.889024 master-0 kubenswrapper[8731]: I1205 12:31:42.888936 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 05 12:31:42.890314 master-0 kubenswrapper[8731]: I1205 12:31:42.889216 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 12:31:42.890314 master-0 kubenswrapper[8731]: I1205 12:31:42.889233 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 12:31:42.890314 master-0 kubenswrapper[8731]: I1205 12:31:42.889541 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 12:31:42.890626 master-0 kubenswrapper[8731]: I1205 12:31:42.890583 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 12:31:42.890626 master-0 kubenswrapper[8731]: I1205 12:31:42.890589 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 05 12:31:42.891045 master-0 kubenswrapper[8731]: I1205 12:31:42.891018 8731 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 12:31:42.891261 master-0 kubenswrapper[8731]: I1205 12:31:42.891241 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 05 12:31:42.891442 master-0 kubenswrapper[8731]: I1205 12:31:42.891423 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 12:31:42.891571 master-0 kubenswrapper[8731]: I1205 12:31:42.891547 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 12:31:42.892208 master-0 kubenswrapper[8731]: I1205 12:31:42.892166 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 12:31:42.892706 master-0 kubenswrapper[8731]: I1205 12:31:42.892671 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 12:31:42.892853 master-0 kubenswrapper[8731]: I1205 12:31:42.892817 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 12:31:42.893042 master-0 kubenswrapper[8731]: I1205 12:31:42.893009 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 05 12:31:42.895104 master-0 kubenswrapper[8731]: I1205 12:31:42.895065 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 12:31:42.895218 master-0 kubenswrapper[8731]: I1205 12:31:42.895162 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 12:31:42.895317 master-0 kubenswrapper[8731]: I1205 12:31:42.895289 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Dec 05 12:31:42.895510 master-0 kubenswrapper[8731]: I1205 12:31:42.895411 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 12:31:42.895637 master-0 kubenswrapper[8731]: I1205 12:31:42.895553 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 12:31:42.896002 master-0 kubenswrapper[8731]: I1205 12:31:42.895967 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 12:31:42.896365 master-0 kubenswrapper[8731]: I1205 12:31:42.896263 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 12:31:42.896926 master-0 kubenswrapper[8731]: I1205 12:31:42.896611 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Dec 05 12:31:42.897435 master-0 kubenswrapper[8731]: I1205 12:31:42.896956 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Dec 05 12:31:42.897435 master-0 kubenswrapper[8731]: I1205 12:31:42.897104 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 
05 12:31:42.897435 master-0 kubenswrapper[8731]: I1205 12:31:42.897194 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 12:31:42.897435 master-0 kubenswrapper[8731]: I1205 12:31:42.897283 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 12:31:42.898867 master-0 kubenswrapper[8731]: I1205 12:31:42.897992 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Dec 05 12:31:42.898867 master-0 kubenswrapper[8731]: I1205 12:31:42.898082 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Dec 05 12:31:42.898867 master-0 kubenswrapper[8731]: I1205 12:31:42.898415 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 12:31:42.898867 master-0 kubenswrapper[8731]: I1205 12:31:42.898150 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 12:31:42.898867 master-0 kubenswrapper[8731]: I1205 12:31:42.898222 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 12:31:42.898867 master-0 kubenswrapper[8731]: I1205 12:31:42.898698 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 12:31:42.898867 master-0 kubenswrapper[8731]: I1205 12:31:42.898826 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 12:31:42.898867 master-0 kubenswrapper[8731]: I1205 12:31:42.898844 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 12:31:42.903671 master-0 kubenswrapper[8731]: I1205 12:31:42.899285 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Dec 05 12:31:42.903671 master-0 kubenswrapper[8731]: I1205 12:31:42.901106 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 12:31:42.903671 master-0 kubenswrapper[8731]: I1205 12:31:42.901214 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:42.903671 master-0 kubenswrapper[8731]: I1205 12:31:42.901603 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 12:31:42.905867 master-0 kubenswrapper[8731]: I1205 12:31:42.903709 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b8233dad-bd19-4842-a4d5-cfa84f1feb83-webhook-cert\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:31:42.905867 master-0 kubenswrapper[8731]: I1205 12:31:42.903787 8731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807d9093-aa67-4840-b5be-7f3abcc1beed-serving-cert\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:42.908050 master-0 kubenswrapper[8731]: I1205 12:31:42.904394 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/58187662-b502-4d90-95ce-2aa91a81d256-telemetry-config\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:42.913297 master-0 kubenswrapper[8731]: I1205 12:31:42.908121 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5efad170-c154-42ec-a7c0-b36a98d2bfcc-metrics-tls\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:31:42.913416 master-0 kubenswrapper[8731]: I1205 12:31:42.913374 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-trusted-ca\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:42.913416 master-0 kubenswrapper[8731]: I1205 12:31:42.908655 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5efad170-c154-42ec-a7c0-b36a98d2bfcc-metrics-tls\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:31:42.913512 master-0 kubenswrapper[8731]: I1205 12:31:42.908490 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807d9093-aa67-4840-b5be-7f3abcc1beed-serving-cert\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:42.913512 master-0 kubenswrapper[8731]: I1205 12:31:42.908725 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/58187662-b502-4d90-95ce-2aa91a81d256-telemetry-config\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:42.913576 master-0 kubenswrapper[8731]: I1205 12:31:42.913427 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxxw7\" (UniqueName: \"kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-kube-api-access-fxxw7\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:42.913819 master-0 kubenswrapper[8731]: I1205 12:31:42.913757 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 
12:31:42.913859 master-0 kubenswrapper[8731]: I1205 12:31:42.913776 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-ssl-certs\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:42.913949 master-0 kubenswrapper[8731]: I1205 12:31:42.913870 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 12:31:42.913949 master-0 kubenswrapper[8731]: I1205 12:31:42.913880 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-k8s-cni-cncf-io\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.914009 master-0 kubenswrapper[8731]: I1205 12:31:42.913955 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-host-slash\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:42.914045 master-0 kubenswrapper[8731]: I1205 12:31:42.914015 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmq98\" (UniqueName: \"kubernetes.io/projected/4492c55f-701b-4ec8-ada1-0a5dc126d405-kube-api-access-dmq98\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.914078 master-0 kubenswrapper[8731]: I1205 12:31:42.914055 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovn-node-metrics-cert\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.914403 master-0 kubenswrapper[8731]: I1205 12:31:42.914372 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 12:31:42.914481 master-0 kubenswrapper[8731]: I1205 12:31:42.914432 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-trusted-ca\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:42.914545 master-0 kubenswrapper[8731]: I1205 12:31:42.914526 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 12:31:42.914578 master-0 kubenswrapper[8731]: I1205 12:31:42.914499 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps4ws\" (UniqueName: \"kubernetes.io/projected/58187662-b502-4d90-95ce-2aa91a81d256-kube-api-access-ps4ws\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " 
pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:42.914640 master-0 kubenswrapper[8731]: I1205 12:31:42.914617 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 12:31:42.914937 master-0 kubenswrapper[8731]: I1205 12:31:42.914880 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 12:31:42.915414 master-0 kubenswrapper[8731]: I1205 12:31:42.914610 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:42.916038 master-0 kubenswrapper[8731]: I1205 12:31:42.916007 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:42.916088 master-0 kubenswrapper[8731]: I1205 12:31:42.916061 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-iptables-alerter-script\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:42.916118 master-0 kubenswrapper[8731]: I1205 12:31:42.916085 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55qpg\" (UniqueName: \"kubernetes.io/projected/ba095394-1873-4793-969d-3be979fa0771-kube-api-access-55qpg\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:42.916118 master-0 kubenswrapper[8731]: I1205 12:31:42.916113 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38941513-e968-45f1-9cb2-b63d40338f36-trusted-ca\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:42.916268 master-0 kubenswrapper[8731]: I1205 12:31:42.916217 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5hdg\" (UniqueName: \"kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-kube-api-access-t5hdg\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:42.916309 master-0 kubenswrapper[8731]: I1205 12:31:42.916243 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Dec 05 12:31:42.916476 master-0 kubenswrapper[8731]: I1205 12:31:42.916428 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-serving-cert\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:42.916617 master-0 kubenswrapper[8731]: I1205 12:31:42.916507 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/594aaded-5615-4bed-87ee-6173059a73be-kube-api-access\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:31:42.916617 master-0 kubenswrapper[8731]: I1205 12:31:42.916524 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 12:31:42.916617 master-0 kubenswrapper[8731]: I1205 12:31:42.916588 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlnqb\" (UniqueName: \"kubernetes.io/projected/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-kube-api-access-mlnqb\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:42.916782 master-0 kubenswrapper[8731]: I1205 12:31:42.916726 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 12:31:42.916782 master-0 kubenswrapper[8731]: I1205 12:31:42.916765 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1871a9d6-6369-4d08-816f-9c6310b61ddf-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:42.916840 master-0 kubenswrapper[8731]: I1205 12:31:42.916812 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-env-overrides\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:31:42.916840 master-0 kubenswrapper[8731]: I1205 12:31:42.916829 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38941513-e968-45f1-9cb2-b63d40338f36-trusted-ca\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:42.916963 master-0 kubenswrapper[8731]: I1205 12:31:42.916853 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 12:31:42.916963 master-0 kubenswrapper[8731]: I1205 12:31:42.916865 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-bound-sa-token\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:42.916963 master-0 
kubenswrapper[8731]: I1205 12:31:42.916901 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-hostroot\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.917197 master-0 kubenswrapper[8731]: I1205 12:31:42.916962 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-config\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:42.917197 master-0 kubenswrapper[8731]: I1205 12:31:42.917070 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-client\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:42.917197 master-0 kubenswrapper[8731]: I1205 12:31:42.917083 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 12:31:42.917197 master-0 kubenswrapper[8731]: I1205 12:31:42.917113 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-bin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.917197 master-0 kubenswrapper[8731]: I1205 12:31:42.917156 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba095394-1873-4793-969d-3be979fa0771-serving-cert\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:42.917336 master-0 kubenswrapper[8731]: I1205 12:31:42.917209 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2acba71-b9dc-4b85-be35-c995b8be2f19-trusted-ca\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:42.917336 master-0 kubenswrapper[8731]: I1205 12:31:42.917247 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-serving-cert\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:42.917336 master-0 kubenswrapper[8731]: I1205 12:31:42.917278 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 12:31:42.918605 master-0 kubenswrapper[8731]: I1205 12:31:42.917272 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-996h9\" (UniqueName: \"kubernetes.io/projected/5efad170-c154-42ec-a7c0-b36a98d2bfcc-kube-api-access-996h9\") pod 
\"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:31:42.918682 master-0 kubenswrapper[8731]: I1205 12:31:42.917303 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 12:31:42.918810 master-0 kubenswrapper[8731]: I1205 12:31:42.918778 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.918944 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba095394-1873-4793-969d-3be979fa0771-serving-cert\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919040 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.917388 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-config\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919121 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919156 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/807d9093-aa67-4840-b5be-7f3abcc1beed-kube-api-access\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919221 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cni-binary-copy\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919259 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69n5s\" (UniqueName: \"kubernetes.io/projected/fb7003a6-4341-49eb-bec3-76ba8610fa12-kube-api-access-69n5s\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:42.920360 master-0 
kubenswrapper[8731]: I1205 12:31:42.919366 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvbfq\" (UniqueName: \"kubernetes.io/projected/b8233dad-bd19-4842-a4d5-cfa84f1feb83-kube-api-access-mvbfq\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919396 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-script-lib\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919417 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-socket-dir-parent\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919441 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-multus\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919462 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-serving-cert\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919482 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-trusted-ca-bundle\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919502 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62nqj\" (UniqueName: \"kubernetes.io/projected/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-kube-api-access-62nqj\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919508 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cni-binary-copy\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919522 8731 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d53a4886-db25-43a1-825a-66a9a9a58590-serving-cert\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919554 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-log-socket\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919721 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2gd8\" (UniqueName: \"kubernetes.io/projected/1e6babfe-724a-4eab-bb3b-bc318bf57b70-kube-api-access-c2gd8\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919755 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919779 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-netd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919800 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919811 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d53a4886-db25-43a1-825a-66a9a9a58590-serving-cert\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919824 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5p5d\" (UniqueName: \"kubernetes.io/projected/5f0c6889-0739-48a3-99cd-6db9d1f83242-kube-api-access-p5p5d\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919856 8731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/29812c4b-48ac-488c-863c-1d52e39ea2ae-service-ca\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919886 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d53a4886-db25-43a1-825a-66a9a9a58590-config\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919922 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-config\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919927 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-serving-cert\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.919949 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-slash\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.920044 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-os-release\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.920069 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-kubelet\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.920072 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-env-overrides\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.920096 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-z6mb6\" (UniqueName: \"kubernetes.io/projected/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-kube-api-access-z6mb6\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.920131 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph9w6\" (UniqueName: \"kubernetes.io/projected/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-kube-api-access-ph9w6\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.920156 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-service-ca\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.920257 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.920285 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-etc-kubernetes\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.920287 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-config\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:31:42.920360 master-0 kubenswrapper[8731]: I1205 12:31:42.920309 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1871a9d6-6369-4d08-816f-9c6310b61ddf-serving-cert\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.920508 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/29812c4b-48ac-488c-863c-1d52e39ea2ae-service-ca\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.920563 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-service-ca\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.920641 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.920773 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/594aaded-5615-4bed-87ee-6173059a73be-serving-cert\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.920889 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.920936 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1871a9d6-6369-4d08-816f-9c6310b61ddf-serving-cert\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.920946 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d53a4886-db25-43a1-825a-66a9a9a58590-config\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.920939 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.921095 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/594aaded-5615-4bed-87ee-6173059a73be-serving-cert\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.921155 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.921171 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.921146 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-config\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.920993 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.921415 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-netns\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.921433 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.921067 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.921570 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/594aaded-5615-4bed-87ee-6173059a73be-config\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.921601 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.921629 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-operand-assets\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.921668 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.921740 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-config\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.921823 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/594aaded-5615-4bed-87ee-6173059a73be-config\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.921859 8731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3792522-fec6-4022-90ac-0b8467fcd625-config\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.921895 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-operand-assets\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:31:42.922010 master-0 kubenswrapper[8731]: I1205 12:31:42.921991 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z8h9\" (UniqueName: \"kubernetes.io/projected/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-kube-api-access-9z8h9\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:31:42.923152 master-0 kubenswrapper[8731]: I1205 12:31:42.922141 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3792522-fec6-4022-90ac-0b8467fcd625-config\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:42.923152 master-0 kubenswrapper[8731]: I1205 12:31:42.922319 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-client\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:42.923152 master-0 kubenswrapper[8731]: I1205 12:31:42.922352 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-env-overrides\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.923152 master-0 kubenswrapper[8731]: I1205 12:31:42.922622 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-binary-copy\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:42.923152 master-0 kubenswrapper[8731]: I1205 12:31:42.922959 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b8233dad-bd19-4842-a4d5-cfa84f1feb83-webhook-cert\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:31:42.923382 master-0 kubenswrapper[8731]: I1205 12:31:42.923142 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-env-overrides\") pod \"ovnkube-node-9vqtb\" (UID: 
\"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.923382 master-0 kubenswrapper[8731]: I1205 12:31:42.923171 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-binary-copy\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:42.923382 master-0 kubenswrapper[8731]: I1205 12:31:42.923194 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.923390 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.923425 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-netns\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.923495 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.923625 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-cvo-updatepayloads\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.923662 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-var-lib-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.923661 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: 
\"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.923689 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-node-log\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.923731 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.923750 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.923803 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-conf-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.923948 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-multus-certs\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.924030 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-systemd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.924071 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-etc-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.924168 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-kubelet\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: 
I1205 12:31:42.924261 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-config\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.924329 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29812c4b-48ac-488c-863c-1d52e39ea2ae-kube-api-access\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.924379 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-daemon-config\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.924446 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-config\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.924487 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtvzs\" (UniqueName: \"kubernetes.io/projected/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-kube-api-access-dtvzs\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.924549 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vx2z\" (UniqueName: \"kubernetes.io/projected/708bf629-9949-4b79-a88a-c73ba033475b-kube-api-access-6vx2z\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.924583 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-service-ca-bundle\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.924649 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-systemd-units\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.924832 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-service-ca-bundle\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.924888 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-daemon-config\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.925151 master-0 kubenswrapper[8731]: I1205 12:31:42.924932 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.925563 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cnibin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.925612 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.925646 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-env-overrides\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.925700 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.925736 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.925878 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.925929 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-ca\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.925993 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x59kd\" (UniqueName: \"kubernetes.io/projected/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-kube-api-access-x59kd\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926022 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-config\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926025 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-env-overrides\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926123 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nml2g\" (UniqueName: \"kubernetes.io/projected/a2acba71-b9dc-4b85-be35-c995b8be2f19-kube-api-access-nml2g\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926160 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-ovnkube-identity-cm\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926252 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-ca\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926296 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovn-node-metrics-cert\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926345 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-iptables-alerter-script\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926404 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-bound-sa-token\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926417 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-config\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926538 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-whereabouts-configmap\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926640 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-ovnkube-identity-cm\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926680 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovnkube-config\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926744 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/5efad170-c154-42ec-a7c0-b36a98d2bfcc-host-etc-kube\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926768 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-whereabouts-configmap\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 
12:31:42.926795 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flxbg\" (UniqueName: \"kubernetes.io/projected/f3792522-fec6-4022-90ac-0b8467fcd625-kube-api-access-flxbg\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926859 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-os-release\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926915 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926953 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3792522-fec6-4022-90ac-0b8467fcd625-serving-cert\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926984 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-trusted-ca\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.926998 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovnkube-config\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.927016 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-system-cni-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.927125 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-cnibin\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 
12:31:42.927247 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3792522-fec6-4022-90ac-0b8467fcd625-serving-cert\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.927306 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1871a9d6-6369-4d08-816f-9c6310b61ddf-config\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.927514 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807d9093-aa67-4840-b5be-7f3abcc1beed-config\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.927663 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.927712 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwcr9\" (UniqueName: \"kubernetes.io/projected/f119ffe4-16bd-49eb-916d-b18ba0d79b54-kube-api-access-wwcr9\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.927750 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tngh\" (UniqueName: \"kubernetes.io/projected/d53a4886-db25-43a1-825a-66a9a9a58590-kube-api-access-2tngh\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.927781 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtjln\" (UniqueName: \"kubernetes.io/projected/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-kube-api-access-xtjln\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.927836 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1871a9d6-6369-4d08-816f-9c6310b61ddf-config\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:42.928274 master-0 
kubenswrapper[8731]: I1205 12:31:42.927810 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69z2l\" (UniqueName: \"kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l\") pod \"network-check-target-qsggt\" (UID: \"f4a70855-80b5-4d6a-bed1-b42364940de0\") " pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.927963 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-ovn\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.928144 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-system-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:42.928274 master-0 kubenswrapper[8731]: I1205 12:31:42.928226 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26x2z\" (UniqueName: \"kubernetes.io/projected/49760d62-02e5-4882-b47f-663102b04946-kube-api-access-26x2z\") pod \"csi-snapshot-controller-operator-6bc8656fdc-zn7hv\" (UID: \"49760d62-02e5-4882-b47f-663102b04946\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv" Dec 05 12:31:42.930998 master-0 kubenswrapper[8731]: I1205 12:31:42.928397 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 12:31:42.930998 master-0 kubenswrapper[8731]: I1205 12:31:42.929039 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 12:31:42.930998 master-0 kubenswrapper[8731]: I1205 12:31:42.928936 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807d9093-aa67-4840-b5be-7f3abcc1beed-config\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:42.930998 master-0 kubenswrapper[8731]: I1205 12:31:42.929285 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-serving-cert\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:42.930998 master-0 kubenswrapper[8731]: I1205 12:31:42.929314 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-bin\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.930998 master-0 kubenswrapper[8731]: I1205 12:31:42.929499 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-serving-cert\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:42.930998 master-0 kubenswrapper[8731]: I1205 12:31:42.930115 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Dec 05 12:31:42.930998 master-0 kubenswrapper[8731]: I1205 12:31:42.930305 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-script-lib\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.931691 master-0 kubenswrapper[8731]: I1205 12:31:42.931643 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-trusted-ca-bundle\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:42.936718 master-0 kubenswrapper[8731]: I1205 12:31:42.936674 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxxw7\" (UniqueName: \"kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-kube-api-access-fxxw7\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:42.938665 master-0 kubenswrapper[8731]: I1205 12:31:42.938620 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2acba71-b9dc-4b85-be35-c995b8be2f19-trusted-ca\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:42.938744 master-0 kubenswrapper[8731]: I1205 12:31:42.938621 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-trusted-ca\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:42.953109 master-0 kubenswrapper[8731]: I1205 12:31:42.953057 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmq98\" (UniqueName: \"kubernetes.io/projected/4492c55f-701b-4ec8-ada1-0a5dc126d405-kube-api-access-dmq98\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:42.977842 master-0 kubenswrapper[8731]: I1205 12:31:42.977789 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps4ws\" (UniqueName: \"kubernetes.io/projected/58187662-b502-4d90-95ce-2aa91a81d256-kube-api-access-ps4ws\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:42.982833 master-0 kubenswrapper[8731]: I1205 12:31:42.982782 8731 
desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Dec 05 12:31:42.996088 master-0 kubenswrapper[8731]: I1205 12:31:42.996043 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55qpg\" (UniqueName: \"kubernetes.io/projected/ba095394-1873-4793-969d-3be979fa0771-kube-api-access-55qpg\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:31:43.013685 master-0 kubenswrapper[8731]: I1205 12:31:43.013611 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5hdg\" (UniqueName: \"kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-kube-api-access-t5hdg\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:43.030017 master-0 kubenswrapper[8731]: I1205 12:31:43.029951 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/5efad170-c154-42ec-a7c0-b36a98d2bfcc-host-etc-kube\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:31:43.030325 master-0 kubenswrapper[8731]: I1205 12:31:43.030130 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/5efad170-c154-42ec-a7c0-b36a98d2bfcc-host-etc-kube\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:31:43.030325 master-0 kubenswrapper[8731]: I1205 12:31:43.030270 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-os-release\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:43.030392 master-0 kubenswrapper[8731]: I1205 12:31:43.030158 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-os-release\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:43.030392 master-0 kubenswrapper[8731]: I1205 12:31:43.030373 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:43.030538 master-0 kubenswrapper[8731]: E1205 12:31:43.030506 8731 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 05 12:31:43.030610 master-0 kubenswrapper[8731]: E1205 12:31:43.030590 8731 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls podName:38941513-e968-45f1-9cb2-b63d40338f36 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:43.530569304 +0000 UTC m=+1.834553471 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-sxxpq" (UID: "38941513-e968-45f1-9cb2-b63d40338f36") : secret "image-registry-operator-tls" not found Dec 05 12:31:43.031117 master-0 kubenswrapper[8731]: I1205 12:31:43.030924 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-system-cni-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:43.031117 master-0 kubenswrapper[8731]: I1205 12:31:43.031004 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-system-cni-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:43.031117 master-0 kubenswrapper[8731]: I1205 12:31:43.030955 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-cnibin\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:43.031117 master-0 kubenswrapper[8731]: I1205 12:31:43.031062 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:43.031343 master-0 kubenswrapper[8731]: I1205 12:31:43.031128 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-cnibin\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:43.031343 master-0 kubenswrapper[8731]: I1205 12:31:43.031218 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-ovn\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.031343 master-0 kubenswrapper[8731]: I1205 12:31:43.031309 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-system-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.031438 master-0 kubenswrapper[8731]: I1205 12:31:43.031345 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-bin\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.031438 master-0 kubenswrapper[8731]: E1205 12:31:43.031353 8731 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:43.031438 master-0 kubenswrapper[8731]: I1205 12:31:43.031376 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-ssl-certs\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:43.031438 master-0 kubenswrapper[8731]: E1205 12:31:43.031402 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls podName:5f0c6889-0739-48a3-99cd-6db9d1f83242 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:43.531387416 +0000 UTC m=+1.835371583 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls") pod "dns-operator-7c56cf9b74-z9g7c" (UID: "5f0c6889-0739-48a3-99cd-6db9d1f83242") : secret "metrics-tls" not found Dec 05 12:31:43.031438 master-0 kubenswrapper[8731]: I1205 12:31:43.031417 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-ssl-certs\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:43.031438 master-0 kubenswrapper[8731]: I1205 12:31:43.031429 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-k8s-cni-cncf-io\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.031603 master-0 kubenswrapper[8731]: I1205 12:31:43.031452 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-bin\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.031603 master-0 kubenswrapper[8731]: I1205 12:31:43.031456 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-host-slash\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:43.031603 master-0 kubenswrapper[8731]: I1205 12:31:43.031481 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-host-slash\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:43.031603 master-0 
kubenswrapper[8731]: I1205 12:31:43.031459 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-system-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.031737 master-0 kubenswrapper[8731]: I1205 12:31:43.031608 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:43.031737 master-0 kubenswrapper[8731]: I1205 12:31:43.031655 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:43.031800 master-0 kubenswrapper[8731]: I1205 12:31:43.031769 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-hostroot\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.031800 master-0 kubenswrapper[8731]: E1205 12:31:43.031788 8731 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:43.031902 master-0 kubenswrapper[8731]: E1205 12:31:43.031820 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:43.531810867 +0000 UTC m=+1.835795034 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:43.031902 master-0 kubenswrapper[8731]: I1205 12:31:43.031820 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-bin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.031902 master-0 kubenswrapper[8731]: E1205 12:31:43.031873 8731 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:43.031902 master-0 kubenswrapper[8731]: I1205 12:31:43.031882 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.031902 master-0 kubenswrapper[8731]: E1205 12:31:43.031897 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls podName:a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:43.531891609 +0000 UTC m=+1.835875776 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls") pod "ingress-operator-8649c48786-7xrk6" (UID: "a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7") : secret "metrics-tls" not found Dec 05 12:31:43.031902 master-0 kubenswrapper[8731]: I1205 12:31:43.031893 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-hostroot\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.032062 master-0 kubenswrapper[8731]: I1205 12:31:43.031911 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:43.032062 master-0 kubenswrapper[8731]: I1205 12:31:43.031938 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-k8s-cni-cncf-io\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.032062 master-0 kubenswrapper[8731]: I1205 12:31:43.031963 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-socket-dir-parent\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.032062 
master-0 kubenswrapper[8731]: I1205 12:31:43.031968 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-bin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.032062 master-0 kubenswrapper[8731]: I1205 12:31:43.031970 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.032062 master-0 kubenswrapper[8731]: I1205 12:31:43.031981 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-multus\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.032062 master-0 kubenswrapper[8731]: I1205 12:31:43.032019 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-multus\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.032062 master-0 kubenswrapper[8731]: I1205 12:31:43.032055 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-socket-dir-parent\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.032294 master-0 kubenswrapper[8731]: I1205 12:31:43.032091 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-log-socket\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.032294 master-0 kubenswrapper[8731]: I1205 12:31:43.032119 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-netd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.032294 master-0 kubenswrapper[8731]: I1205 12:31:43.032139 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:43.032294 master-0 kubenswrapper[8731]: I1205 12:31:43.032210 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-netd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 
12:31:43.032407 master-0 kubenswrapper[8731]: I1205 12:31:43.032294 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-slash\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.032407 master-0 kubenswrapper[8731]: I1205 12:31:43.032321 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-os-release\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.032407 master-0 kubenswrapper[8731]: I1205 12:31:43.032338 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-kubelet\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.032407 master-0 kubenswrapper[8731]: I1205 12:31:43.032385 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-slash\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.032407 master-0 kubenswrapper[8731]: I1205 12:31:43.032393 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-ovn\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.032537 master-0 kubenswrapper[8731]: I1205 12:31:43.032429 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-log-socket\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.032567 master-0 kubenswrapper[8731]: I1205 12:31:43.032529 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:43.032633 master-0 kubenswrapper[8731]: I1205 12:31:43.032570 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-etc-kubernetes\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.032668 master-0 kubenswrapper[8731]: I1205 12:31:43.032645 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-netns\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.032668 master-0 kubenswrapper[8731]: I1205 12:31:43.032652 8731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-kubelet\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.032668 master-0 kubenswrapper[8731]: E1205 12:31:43.032607 8731 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 05 12:31:43.032747 master-0 kubenswrapper[8731]: E1205 12:31:43.032690 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:43.53268121 +0000 UTC m=+1.836665377 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "node-tuning-operator-tls" not found Dec 05 12:31:43.032747 master-0 kubenswrapper[8731]: I1205 12:31:43.032710 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-etc-kubernetes\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.032747 master-0 kubenswrapper[8731]: I1205 12:31:43.032719 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:43.032747 master-0 kubenswrapper[8731]: I1205 12:31:43.032745 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-netns\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.032857 master-0 kubenswrapper[8731]: I1205 12:31:43.032765 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-cvo-updatepayloads\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:43.032857 master-0 kubenswrapper[8731]: I1205 12:31:43.032774 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-os-release\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.032857 master-0 kubenswrapper[8731]: E1205 12:31:43.032630 8731 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 05 12:31:43.032857 master-0 kubenswrapper[8731]: I1205 
12:31:43.032779 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-netns\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.032857 master-0 kubenswrapper[8731]: I1205 12:31:43.032808 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-var-lib-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.032857 master-0 kubenswrapper[8731]: E1205 12:31:43.032814 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert podName:29812c4b-48ac-488c-863c-1d52e39ea2ae nodeName:}" failed. No retries permitted until 2025-12-05 12:31:43.532805923 +0000 UTC m=+1.836790090 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2chqh" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae") : secret "cluster-version-operator-serving-cert" not found Dec 05 12:31:43.032857 master-0 kubenswrapper[8731]: I1205 12:31:43.032784 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-var-lib-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.032857 master-0 kubenswrapper[8731]: E1205 12:31:43.032637 8731 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:43.032857 master-0 kubenswrapper[8731]: I1205 12:31:43.032862 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-node-log\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.033250 master-0 kubenswrapper[8731]: E1205 12:31:43.032870 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls podName:58187662-b502-4d90-95ce-2aa91a81d256 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:43.532863395 +0000 UTC m=+1.836847562 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-lgc7z" (UID: "58187662-b502-4d90-95ce-2aa91a81d256") : secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:43.033250 master-0 kubenswrapper[8731]: I1205 12:31:43.032872 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:43.033250 master-0 kubenswrapper[8731]: I1205 12:31:43.032887 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-node-log\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.033250 master-0 kubenswrapper[8731]: I1205 12:31:43.032892 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-netns\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.033250 master-0 kubenswrapper[8731]: I1205 12:31:43.032909 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.033250 master-0 kubenswrapper[8731]: I1205 12:31:43.032936 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.033250 master-0 kubenswrapper[8731]: I1205 12:31:43.033026 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-cvo-updatepayloads\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:43.033250 master-0 kubenswrapper[8731]: I1205 12:31:43.033048 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:43.033250 master-0 kubenswrapper[8731]: I1205 12:31:43.033091 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-conf-dir\") pod 
\"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.033250 master-0 kubenswrapper[8731]: I1205 12:31:43.033113 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-multus-certs\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.033250 master-0 kubenswrapper[8731]: I1205 12:31:43.033130 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-systemd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.033250 master-0 kubenswrapper[8731]: I1205 12:31:43.033147 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-etc-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.033250 master-0 kubenswrapper[8731]: I1205 12:31:43.033165 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-kubelet\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.033250 master-0 kubenswrapper[8731]: I1205 12:31:43.033227 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-conf-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.033250 master-0 kubenswrapper[8731]: I1205 12:31:43.033253 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-multus-certs\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.033625 master-0 kubenswrapper[8731]: I1205 12:31:43.033284 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-systemd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.033625 master-0 kubenswrapper[8731]: E1205 12:31:43.033305 8731 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 05 12:31:43.033625 master-0 kubenswrapper[8731]: E1205 12:31:43.033389 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert podName:0dda6d9b-cb3a-413a-85af-ef08f15ea42e nodeName:}" failed. No retries permitted until 2025-12-05 12:31:43.533361627 +0000 UTC m=+1.837345834 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-9vfxw" (UID: "0dda6d9b-cb3a-413a-85af-ef08f15ea42e") : secret "package-server-manager-serving-cert" not found Dec 05 12:31:43.033625 master-0 kubenswrapper[8731]: I1205 12:31:43.033433 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-systemd-units\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.033625 master-0 kubenswrapper[8731]: I1205 12:31:43.033472 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:43.033625 master-0 kubenswrapper[8731]: I1205 12:31:43.033507 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cnibin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.033625 master-0 kubenswrapper[8731]: I1205 12:31:43.033317 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-etc-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.033625 master-0 kubenswrapper[8731]: I1205 12:31:43.033551 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:43.033625 master-0 kubenswrapper[8731]: I1205 12:31:43.033577 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cnibin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.033625 master-0 kubenswrapper[8731]: I1205 12:31:43.033592 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:43.033625 master-0 kubenswrapper[8731]: I1205 12:31:43.033606 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-systemd-units\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.033897 master-0 
kubenswrapper[8731]: E1205 12:31:43.033665 8731 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 05 12:31:43.033897 master-0 kubenswrapper[8731]: E1205 12:31:43.033692 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs podName:cfc37275-4e59-4f73-8b08-c8ca8ec28bbb nodeName:}" failed. No retries permitted until 2025-12-05 12:31:43.533683605 +0000 UTC m=+1.837667772 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs") pod "multus-admission-controller-7dfc5b745f-xlrzq" (UID: "cfc37275-4e59-4f73-8b08-c8ca8ec28bbb") : secret "multus-admission-controller-secret" not found Dec 05 12:31:43.033897 master-0 kubenswrapper[8731]: I1205 12:31:43.033713 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.033897 master-0 kubenswrapper[8731]: E1205 12:31:43.033720 8731 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 05 12:31:43.033897 master-0 kubenswrapper[8731]: I1205 12:31:43.033735 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.033897 master-0 kubenswrapper[8731]: E1205 12:31:43.033788 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs podName:fb7003a6-4341-49eb-bec3-76ba8610fa12 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:43.533768078 +0000 UTC m=+1.837752325 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs") pod "network-metrics-daemon-99djw" (UID: "fb7003a6-4341-49eb-bec3-76ba8610fa12") : secret "metrics-daemon-secret" not found Dec 05 12:31:43.033897 master-0 kubenswrapper[8731]: I1205 12:31:43.033796 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.033897 master-0 kubenswrapper[8731]: I1205 12:31:43.033824 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:43.033897 master-0 kubenswrapper[8731]: I1205 12:31:43.033845 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-kubelet\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.033897 master-0 kubenswrapper[8731]: E1205 12:31:43.033873 8731 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 05 12:31:43.034219 master-0 kubenswrapper[8731]: E1205 12:31:43.033931 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics podName:1e6babfe-724a-4eab-bb3b-bc318bf57b70 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:43.533915891 +0000 UTC m=+1.837900188 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-vwhxt" (UID: "1e6babfe-724a-4eab-bb3b-bc318bf57b70") : secret "marketplace-operator-metrics" not found Dec 05 12:31:43.037711 master-0 kubenswrapper[8731]: I1205 12:31:43.037669 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1871a9d6-6369-4d08-816f-9c6310b61ddf-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:31:43.052846 master-0 kubenswrapper[8731]: I1205 12:31:43.052802 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/594aaded-5615-4bed-87ee-6173059a73be-kube-api-access\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:31:43.072435 master-0 kubenswrapper[8731]: I1205 12:31:43.072389 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlnqb\" (UniqueName: \"kubernetes.io/projected/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-kube-api-access-mlnqb\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:31:43.093069 master-0 kubenswrapper[8731]: I1205 12:31:43.093022 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-996h9\" (UniqueName: \"kubernetes.io/projected/5efad170-c154-42ec-a7c0-b36a98d2bfcc-kube-api-access-996h9\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:31:43.113393 master-0 kubenswrapper[8731]: I1205 12:31:43.113268 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-bound-sa-token\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:43.154470 master-0 kubenswrapper[8731]: I1205 12:31:43.154408 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/807d9093-aa67-4840-b5be-7f3abcc1beed-kube-api-access\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:31:43.175902 master-0 kubenswrapper[8731]: I1205 12:31:43.174341 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69n5s\" (UniqueName: \"kubernetes.io/projected/fb7003a6-4341-49eb-bec3-76ba8610fa12-kube-api-access-69n5s\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:43.175902 master-0 kubenswrapper[8731]: I1205 12:31:43.174671 8731 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 05 12:31:43.192735 master-0 kubenswrapper[8731]: I1205 12:31:43.192688 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvbfq\" (UniqueName: \"kubernetes.io/projected/b8233dad-bd19-4842-a4d5-cfa84f1feb83-kube-api-access-mvbfq\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:31:43.216441 master-0 kubenswrapper[8731]: I1205 12:31:43.216299 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5p5d\" (UniqueName: \"kubernetes.io/projected/5f0c6889-0739-48a3-99cd-6db9d1f83242-kube-api-access-p5p5d\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:43.242052 master-0 kubenswrapper[8731]: I1205 12:31:43.242006 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2gd8\" (UniqueName: \"kubernetes.io/projected/1e6babfe-724a-4eab-bb3b-bc318bf57b70-kube-api-access-c2gd8\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:43.254395 master-0 kubenswrapper[8731]: I1205 12:31:43.254329 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62nqj\" (UniqueName: \"kubernetes.io/projected/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-kube-api-access-62nqj\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:43.272857 master-0 kubenswrapper[8731]: I1205 12:31:43.272818 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph9w6\" (UniqueName: \"kubernetes.io/projected/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-kube-api-access-ph9w6\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:31:43.292525 master-0 kubenswrapper[8731]: I1205 12:31:43.292492 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6mb6\" (UniqueName: \"kubernetes.io/projected/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-kube-api-access-z6mb6\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:43.314347 master-0 kubenswrapper[8731]: I1205 12:31:43.314295 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z8h9\" (UniqueName: \"kubernetes.io/projected/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-kube-api-access-9z8h9\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:31:43.333047 master-0 kubenswrapper[8731]: I1205 12:31:43.332997 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29812c4b-48ac-488c-863c-1d52e39ea2ae-kube-api-access\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " 
pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:43.355285 master-0 kubenswrapper[8731]: I1205 12:31:43.355241 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vx2z\" (UniqueName: \"kubernetes.io/projected/708bf629-9949-4b79-a88a-c73ba033475b-kube-api-access-6vx2z\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:31:43.373674 master-0 kubenswrapper[8731]: I1205 12:31:43.373551 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtvzs\" (UniqueName: \"kubernetes.io/projected/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-kube-api-access-dtvzs\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:31:43.392948 master-0 kubenswrapper[8731]: I1205 12:31:43.392841 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nml2g\" (UniqueName: \"kubernetes.io/projected/a2acba71-b9dc-4b85-be35-c995b8be2f19-kube-api-access-nml2g\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:43.414765 master-0 kubenswrapper[8731]: I1205 12:31:43.414710 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x59kd\" (UniqueName: \"kubernetes.io/projected/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-kube-api-access-x59kd\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:31:43.432981 master-0 kubenswrapper[8731]: I1205 12:31:43.432933 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-bound-sa-token\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:43.453368 master-0 kubenswrapper[8731]: I1205 12:31:43.453311 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flxbg\" (UniqueName: \"kubernetes.io/projected/f3792522-fec6-4022-90ac-0b8467fcd625-kube-api-access-flxbg\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:31:43.472901 master-0 kubenswrapper[8731]: I1205 12:31:43.472862 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwcr9\" (UniqueName: \"kubernetes.io/projected/f119ffe4-16bd-49eb-916d-b18ba0d79b54-kube-api-access-wwcr9\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:31:43.493549 master-0 kubenswrapper[8731]: I1205 12:31:43.493503 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26x2z\" (UniqueName: \"kubernetes.io/projected/49760d62-02e5-4882-b47f-663102b04946-kube-api-access-26x2z\") pod \"csi-snapshot-controller-operator-6bc8656fdc-zn7hv\" (UID: \"49760d62-02e5-4882-b47f-663102b04946\") " 
pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv" Dec 05 12:31:43.514080 master-0 kubenswrapper[8731]: I1205 12:31:43.514022 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtjln\" (UniqueName: \"kubernetes.io/projected/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-kube-api-access-xtjln\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:31:43.533200 master-0 kubenswrapper[8731]: I1205 12:31:43.533146 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69z2l\" (UniqueName: \"kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l\") pod \"network-check-target-qsggt\" (UID: \"f4a70855-80b5-4d6a-bed1-b42364940de0\") " pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:43.539523 master-0 kubenswrapper[8731]: I1205 12:31:43.539482 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:43.539600 master-0 kubenswrapper[8731]: I1205 12:31:43.539526 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:43.539600 master-0 kubenswrapper[8731]: I1205 12:31:43.539573 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:43.539600 master-0 kubenswrapper[8731]: I1205 12:31:43.539594 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:43.539683 master-0 kubenswrapper[8731]: I1205 12:31:43.539620 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:43.539683 master-0 kubenswrapper[8731]: I1205 12:31:43.539655 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:43.539751 master-0 kubenswrapper[8731]: I1205 12:31:43.539698 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:43.539751 master-0 kubenswrapper[8731]: I1205 12:31:43.539732 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:43.539814 master-0 kubenswrapper[8731]: I1205 12:31:43.539761 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:43.539854 master-0 kubenswrapper[8731]: I1205 12:31:43.539787 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:43.539854 master-0 kubenswrapper[8731]: I1205 12:31:43.539845 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:43.540017 master-0 kubenswrapper[8731]: E1205 12:31:43.539979 8731 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 05 12:31:43.540056 master-0 kubenswrapper[8731]: E1205 12:31:43.540043 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs podName:fb7003a6-4341-49eb-bec3-76ba8610fa12 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:44.540025528 +0000 UTC m=+2.844009695 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs") pod "network-metrics-daemon-99djw" (UID: "fb7003a6-4341-49eb-bec3-76ba8610fa12") : secret "metrics-daemon-secret" not found Dec 05 12:31:43.540694 master-0 kubenswrapper[8731]: E1205 12:31:43.540661 8731 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 05 12:31:43.540749 master-0 kubenswrapper[8731]: E1205 12:31:43.540705 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls podName:38941513-e968-45f1-9cb2-b63d40338f36 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:44.540694975 +0000 UTC m=+2.844679142 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-sxxpq" (UID: "38941513-e968-45f1-9cb2-b63d40338f36") : secret "image-registry-operator-tls" not found Dec 05 12:31:43.540871 master-0 kubenswrapper[8731]: E1205 12:31:43.540779 8731 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:43.540871 master-0 kubenswrapper[8731]: E1205 12:31:43.540807 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls podName:5f0c6889-0739-48a3-99cd-6db9d1f83242 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:44.540799688 +0000 UTC m=+2.844783865 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls") pod "dns-operator-7c56cf9b74-z9g7c" (UID: "5f0c6889-0739-48a3-99cd-6db9d1f83242") : secret "metrics-tls" not found Dec 05 12:31:43.540871 master-0 kubenswrapper[8731]: E1205 12:31:43.540852 8731 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:43.540871 master-0 kubenswrapper[8731]: E1205 12:31:43.540876 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:44.540868639 +0000 UTC m=+2.844852806 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:43.540986 master-0 kubenswrapper[8731]: E1205 12:31:43.540921 8731 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:43.540986 master-0 kubenswrapper[8731]: E1205 12:31:43.540947 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls podName:a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:44.540938802 +0000 UTC m=+2.844922979 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls") pod "ingress-operator-8649c48786-7xrk6" (UID: "a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7") : secret "metrics-tls" not found Dec 05 12:31:43.540986 master-0 kubenswrapper[8731]: E1205 12:31:43.540985 8731 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 05 12:31:43.541066 master-0 kubenswrapper[8731]: E1205 12:31:43.541011 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:44.541001854 +0000 UTC m=+2.844986021 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "node-tuning-operator-tls" not found Dec 05 12:31:43.541066 master-0 kubenswrapper[8731]: E1205 12:31:43.541055 8731 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:43.541125 master-0 kubenswrapper[8731]: E1205 12:31:43.541080 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls podName:58187662-b502-4d90-95ce-2aa91a81d256 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:44.541072636 +0000 UTC m=+2.845056793 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-lgc7z" (UID: "58187662-b502-4d90-95ce-2aa91a81d256") : secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:43.541173 master-0 kubenswrapper[8731]: E1205 12:31:43.541129 8731 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 05 12:31:43.541173 master-0 kubenswrapper[8731]: E1205 12:31:43.541155 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert podName:29812c4b-48ac-488c-863c-1d52e39ea2ae nodeName:}" failed. No retries permitted until 2025-12-05 12:31:44.541147638 +0000 UTC m=+2.845131805 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2chqh" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae") : secret "cluster-version-operator-serving-cert" not found Dec 05 12:31:43.541272 master-0 kubenswrapper[8731]: E1205 12:31:43.541219 8731 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 05 12:31:43.541272 master-0 kubenswrapper[8731]: E1205 12:31:43.541250 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert podName:0dda6d9b-cb3a-413a-85af-ef08f15ea42e nodeName:}" failed. No retries permitted until 2025-12-05 12:31:44.54124169 +0000 UTC m=+2.845225857 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-9vfxw" (UID: "0dda6d9b-cb3a-413a-85af-ef08f15ea42e") : secret "package-server-manager-serving-cert" not found Dec 05 12:31:43.541396 master-0 kubenswrapper[8731]: E1205 12:31:43.541291 8731 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 05 12:31:43.541396 master-0 kubenswrapper[8731]: E1205 12:31:43.541328 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs podName:cfc37275-4e59-4f73-8b08-c8ca8ec28bbb nodeName:}" failed. No retries permitted until 2025-12-05 12:31:44.541320722 +0000 UTC m=+2.845304889 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs") pod "multus-admission-controller-7dfc5b745f-xlrzq" (UID: "cfc37275-4e59-4f73-8b08-c8ca8ec28bbb") : secret "multus-admission-controller-secret" not found Dec 05 12:31:43.541396 master-0 kubenswrapper[8731]: E1205 12:31:43.541370 8731 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 05 12:31:43.541396 master-0 kubenswrapper[8731]: E1205 12:31:43.541393 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics podName:1e6babfe-724a-4eab-bb3b-bc318bf57b70 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:44.541385864 +0000 UTC m=+2.845370021 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-vwhxt" (UID: "1e6babfe-724a-4eab-bb3b-bc318bf57b70") : secret "marketplace-operator-metrics" not found Dec 05 12:31:43.553523 master-0 kubenswrapper[8731]: I1205 12:31:43.553480 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tngh\" (UniqueName: \"kubernetes.io/projected/d53a4886-db25-43a1-825a-66a9a9a58590-kube-api-access-2tngh\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:31:43.568305 master-0 kubenswrapper[8731]: W1205 12:31:43.568260 8731 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Dec 05 12:31:43.569307 master-0 kubenswrapper[8731]: E1205 12:31:43.569252 8731 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:31:43.589913 master-0 kubenswrapper[8731]: E1205 12:31:43.589847 8731 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:31:43.607803 master-0 kubenswrapper[8731]: E1205 12:31:43.607752 8731 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:43.628390 master-0 kubenswrapper[8731]: E1205 12:31:43.628226 8731 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:31:43.651420 master-0 kubenswrapper[8731]: E1205 12:31:43.651363 8731 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:43.846931 master-0 kubenswrapper[8731]: E1205 12:31:43.846833 8731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e85850a4ae1a1e3ec2c590a4936d640882b6550124da22031c85b526afbf52df" Dec 05 12:31:43.847237 master-0 kubenswrapper[8731]: E1205 12:31:43.847130 8731 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 05 12:31:43.847237 master-0 kubenswrapper[8731]: container 
&Container{Name:authentication-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e85850a4ae1a1e3ec2c590a4936d640882b6550124da22031c85b526afbf52df,Command:[/bin/bash -ec],Args:[if [ -s /var/run/configmaps/trusted-ca-bundle/ca-bundle.crt ]; then Dec 05 12:31:43.847237 master-0 kubenswrapper[8731]: echo "Copying system trust bundle" Dec 05 12:31:43.847237 master-0 kubenswrapper[8731]: cp -f /var/run/configmaps/trusted-ca-bundle/ca-bundle.crt /etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem Dec 05 12:31:43.847237 master-0 kubenswrapper[8731]: fi Dec 05 12:31:43.847237 master-0 kubenswrapper[8731]: exec authentication-operator operator --config=/var/run/configmaps/config/operator-config.yaml --v=2 --terminate-on-files=/var/run/configmaps/trusted-ca-bundle/ca-bundle.crt --terminate-on-files=/tmp/terminate Dec 05 12:31:43.847237 master-0 kubenswrapper[8731]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:IMAGE_OAUTH_SERVER,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8860e00f858d1bca98344f21b5a5c4acc43c9c6eca8216582514021f0ab3cf7b,ValueFrom:nil,},EnvVar{Name:IMAGE_OAUTH_APISERVER,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91af633e585621630c40d14f188e37d36b44678d0a59e582d850bf8d593d3a0c,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.29,ValueFrom:nil,},EnvVar{Name:OPERAND_OAUTH_SERVER_IMAGE_VERSION,Value:4.18.29_openshift,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{209715200 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:trusted-ca-bundle,ReadOnly:true,MountPath:/var/run/configmaps/trusted-ca-bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:service-ca-bundle,ReadOnly:true,MountPath:/var/run/configmaps/service-ca-bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-55qpg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod authentication-operator-6c968fdfdf-xxmfp_openshift-authentication-operator(ba095394-1873-4793-969d-3be979fa0771): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Dec 05 12:31:43.847237 master-0 kubenswrapper[8731]: > logger="UnhandledError" Dec 05 12:31:43.848415 master-0 kubenswrapper[8731]: E1205 12:31:43.848341 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" podUID="ba095394-1873-4793-969d-3be979fa0771" Dec 05 12:31:44.126585 master-0 kubenswrapper[8731]: I1205 12:31:44.126515 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:31:44.499253 master-0 kubenswrapper[8731]: E1205 12:31:44.499142 8731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a824e468cf8dd61d347e35b2ee5bc2f815666957647098e21a1bb56ff613e5b9" Dec 05 12:31:44.499514 master-0 kubenswrapper[8731]: E1205 12:31:44.499380 8731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-controller-manager-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a824e468cf8dd61d347e35b2ee5bc2f815666957647098e21a1bb56ff613e5b9,Command:[cluster-kube-controller-manager-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a824e468cf8dd61d347e35b2ee5bc2f815666957647098e21a1bb56ff613e5b9,ValueFrom:nil,},EnvVar{Name:CLUSTER_POLICY_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d64c13fe7663a0b4ae61d103b1b7598adcf317a01826f296bcb66b1a2de83c96,ValueFrom:nil,},EnvVar{Name:TOOLS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fa40d32981d88f32a9cdedc4cdd4a08c43e0d17bd4cc3fc3a87e9d1c7e1259d0,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.29,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:1.31.13,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-controller-manager-operator-848f645654-g6nj5_openshift-kube-controller-manager-operator(594aaded-5615-4bed-87ee-6173059a73be): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 12:31:44.500616 master-0 kubenswrapper[8731]: E1205 12:31:44.500569 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" podUID="594aaded-5615-4bed-87ee-6173059a73be" Dec 05 12:31:44.552717 master-0 kubenswrapper[8731]: I1205 12:31:44.552648 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:44.552717 master-0 kubenswrapper[8731]: I1205 12:31:44.552706 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:44.552717 master-0 kubenswrapper[8731]: I1205 12:31:44.552736 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:44.553122 master-0 kubenswrapper[8731]: E1205 12:31:44.552832 8731 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:44.553122 master-0 kubenswrapper[8731]: I1205 12:31:44.552856 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:44.553122 master-0 kubenswrapper[8731]: E1205 12:31:44.552897 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls podName:5f0c6889-0739-48a3-99cd-6db9d1f83242 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:46.552872986 +0000 UTC m=+4.856857153 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls") pod "dns-operator-7c56cf9b74-z9g7c" (UID: "5f0c6889-0739-48a3-99cd-6db9d1f83242") : secret "metrics-tls" not found Dec 05 12:31:44.553122 master-0 kubenswrapper[8731]: E1205 12:31:44.552918 8731 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:44.553122 master-0 kubenswrapper[8731]: I1205 12:31:44.552923 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:44.553122 master-0 kubenswrapper[8731]: E1205 12:31:44.552834 8731 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 05 12:31:44.553122 master-0 kubenswrapper[8731]: E1205 12:31:44.552950 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:46.552939298 +0000 UTC m=+4.856923465 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:44.553122 master-0 kubenswrapper[8731]: E1205 12:31:44.552974 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls podName:38941513-e968-45f1-9cb2-b63d40338f36 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:46.552967088 +0000 UTC m=+4.856951255 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-sxxpq" (UID: "38941513-e968-45f1-9cb2-b63d40338f36") : secret "image-registry-operator-tls" not found Dec 05 12:31:44.553532 master-0 kubenswrapper[8731]: E1205 12:31:44.553246 8731 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:44.553532 master-0 kubenswrapper[8731]: I1205 12:31:44.553427 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:44.553532 master-0 kubenswrapper[8731]: E1205 12:31:44.553502 8731 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 05 12:31:44.553697 master-0 kubenswrapper[8731]: E1205 12:31:44.553505 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls podName:a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:46.553483563 +0000 UTC m=+4.857467730 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls") pod "ingress-operator-8649c48786-7xrk6" (UID: "a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7") : secret "metrics-tls" not found Dec 05 12:31:44.553697 master-0 kubenswrapper[8731]: E1205 12:31:44.553570 8731 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:44.553697 master-0 kubenswrapper[8731]: E1205 12:31:44.553629 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:46.553583105 +0000 UTC m=+4.857567392 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "node-tuning-operator-tls" not found Dec 05 12:31:44.553869 master-0 kubenswrapper[8731]: I1205 12:31:44.553713 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:44.553869 master-0 kubenswrapper[8731]: E1205 12:31:44.553856 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls podName:58187662-b502-4d90-95ce-2aa91a81d256 nodeName:}" failed. 
No retries permitted until 2025-12-05 12:31:46.553817951 +0000 UTC m=+4.857802118 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-lgc7z" (UID: "58187662-b502-4d90-95ce-2aa91a81d256") : secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:44.553987 master-0 kubenswrapper[8731]: E1205 12:31:44.553880 8731 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 05 12:31:44.553987 master-0 kubenswrapper[8731]: I1205 12:31:44.553907 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:44.553987 master-0 kubenswrapper[8731]: I1205 12:31:44.553955 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:44.553987 master-0 kubenswrapper[8731]: E1205 12:31:44.553974 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert podName:29812c4b-48ac-488c-863c-1d52e39ea2ae nodeName:}" failed. No retries permitted until 2025-12-05 12:31:46.553950175 +0000 UTC m=+4.857934342 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2chqh" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae") : secret "cluster-version-operator-serving-cert" not found Dec 05 12:31:44.554225 master-0 kubenswrapper[8731]: I1205 12:31:44.554000 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:44.554225 master-0 kubenswrapper[8731]: E1205 12:31:44.554038 8731 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 05 12:31:44.554225 master-0 kubenswrapper[8731]: I1205 12:31:44.554052 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:44.554225 master-0 kubenswrapper[8731]: E1205 12:31:44.554068 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics podName:1e6babfe-724a-4eab-bb3b-bc318bf57b70 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:46.554060657 +0000 UTC m=+4.858044824 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-vwhxt" (UID: "1e6babfe-724a-4eab-bb3b-bc318bf57b70") : secret "marketplace-operator-metrics" not found Dec 05 12:31:44.554225 master-0 kubenswrapper[8731]: E1205 12:31:44.554126 8731 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 05 12:31:44.554225 master-0 kubenswrapper[8731]: E1205 12:31:44.554157 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs podName:cfc37275-4e59-4f73-8b08-c8ca8ec28bbb nodeName:}" failed. No retries permitted until 2025-12-05 12:31:46.55414805 +0000 UTC m=+4.858132217 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs") pod "multus-admission-controller-7dfc5b745f-xlrzq" (UID: "cfc37275-4e59-4f73-8b08-c8ca8ec28bbb") : secret "multus-admission-controller-secret" not found Dec 05 12:31:44.554225 master-0 kubenswrapper[8731]: E1205 12:31:44.554156 8731 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 05 12:31:44.554225 master-0 kubenswrapper[8731]: E1205 12:31:44.554217 8731 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 05 12:31:44.555145 master-0 kubenswrapper[8731]: E1205 12:31:44.554257 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs podName:fb7003a6-4341-49eb-bec3-76ba8610fa12 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:46.554250102 +0000 UTC m=+4.858234269 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs") pod "network-metrics-daemon-99djw" (UID: "fb7003a6-4341-49eb-bec3-76ba8610fa12") : secret "metrics-daemon-secret" not found Dec 05 12:31:44.555145 master-0 kubenswrapper[8731]: E1205 12:31:44.554293 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert podName:0dda6d9b-cb3a-413a-85af-ef08f15ea42e nodeName:}" failed. No retries permitted until 2025-12-05 12:31:46.554285903 +0000 UTC m=+4.858270070 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-9vfxw" (UID: "0dda6d9b-cb3a-413a-85af-ef08f15ea42e") : secret "package-server-manager-serving-cert" not found Dec 05 12:31:44.959064 master-0 kubenswrapper[8731]: I1205 12:31:44.958974 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:44.982954 master-0 kubenswrapper[8731]: I1205 12:31:44.982854 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:45.014609 master-0 kubenswrapper[8731]: I1205 12:31:45.014559 8731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:31:45.014609 master-0 kubenswrapper[8731]: I1205 12:31:45.014597 8731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:31:45.215021 master-0 kubenswrapper[8731]: E1205 12:31:45.214948 8731 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8139ed65c0a0a4b0f253b715c11cc52be027efe8a4774da9ccce35c78ef439da" Dec 05 12:31:45.215510 master-0 kubenswrapper[8731]: E1205 12:31:45.215197 8731 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:service-ca-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8139ed65c0a0a4b0f253b715c11cc52be027efe8a4774da9ccce35c78ef439da,Command:[service-ca-operator operator],Args:[--config=/var/run/configmaps/config/operator-config.yaml -v=2],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8139ed65c0a0a4b0f253b715c11cc52be027efe8a4774da9ccce35c78ef439da,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.29,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{83886080 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-flxbg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod service-ca-operator-77758bc754-hfqsp_openshift-service-ca-operator(f3792522-fec6-4022-90ac-0b8467fcd625): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Dec 05 12:31:45.217597 master-0 kubenswrapper[8731]: E1205 12:31:45.217513 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" podUID="f3792522-fec6-4022-90ac-0b8467fcd625" Dec 05 12:31:45.242572 master-0 kubenswrapper[8731]: I1205 12:31:45.242517 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:45.255204 master-0 kubenswrapper[8731]: I1205 12:31:45.255088 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:46.020815 master-0 kubenswrapper[8731]: I1205 12:31:46.019801 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv" event={"ID":"49760d62-02e5-4882-b47f-663102b04946","Type":"ContainerStarted","Data":"987d6983e55310f76f89331773ed3d708557e669dfdedbfdd605e1afe8d494c4"} Dec 05 12:31:46.022360 master-0 kubenswrapper[8731]: I1205 12:31:46.022294 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" event={"ID":"1871a9d6-6369-4d08-816f-9c6310b61ddf","Type":"ContainerStarted","Data":"4f8a59bfccc80caaa9ccb9172563888264ac2bfba8642d650c783edb02a956b7"} Dec 05 12:31:46.027040 master-0 kubenswrapper[8731]: I1205 12:31:46.026995 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" event={"ID":"d53a4886-db25-43a1-825a-66a9a9a58590","Type":"ContainerStarted","Data":"e559a82c0b834d05036d8b7d7e391db63a90fb95bbf21aef7a0e62a675b47072"} Dec 05 12:31:46.028814 master-0 kubenswrapper[8731]: I1205 12:31:46.028725 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" event={"ID":"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9","Type":"ContainerStarted","Data":"eae74267bbff7388ad43e0bcb0a8a1a5c6694e5d3fab6387145bf64deb29417d"} Dec 05 12:31:46.031194 master-0 kubenswrapper[8731]: I1205 12:31:46.031130 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" event={"ID":"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e","Type":"ContainerStarted","Data":"401643c70c405d6156a16a3ab17611e0b06471ba9931da499a2092a2a6caa1f3"} Dec 05 12:31:46.034302 master-0 kubenswrapper[8731]: I1205 12:31:46.034248 8731 generic.go:334] "Generic (PLEG): container finished" podID="ce3d73c1-f4bd-4c91-936a-086dfa5e3460" containerID="373b9eebb249846584e2d3e04b61f1d2ede61eec7ddbb37f633ff477767fcf89" exitCode=0 Dec 05 12:31:46.034380 master-0 kubenswrapper[8731]: I1205 12:31:46.034351 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" event={"ID":"ce3d73c1-f4bd-4c91-936a-086dfa5e3460","Type":"ContainerDied","Data":"373b9eebb249846584e2d3e04b61f1d2ede61eec7ddbb37f633ff477767fcf89"} Dec 05 12:31:46.039796 master-0 kubenswrapper[8731]: I1205 12:31:46.038505 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" event={"ID":"f119ffe4-16bd-49eb-916d-b18ba0d79b54","Type":"ContainerStarted","Data":"47752e8beaf9f853c41667ca645eb6d00a5917c9b6cb4206f48e1b5596bdcc79"} Dec 05 12:31:46.428477 master-0 kubenswrapper[8731]: I1205 12:31:46.428348 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv"] Dec 05 12:31:46.429959 master-0 kubenswrapper[8731]: E1205 12:31:46.428524 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b58aa15d-cdea-4a90-ba40-706d6a85735e" containerName="prober" Dec 05 12:31:46.429959 master-0 kubenswrapper[8731]: I1205 12:31:46.428537 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58aa15d-cdea-4a90-ba40-706d6a85735e" containerName="prober" Dec 05 12:31:46.429959 master-0 kubenswrapper[8731]: E1205 12:31:46.428547 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7807b90-1059-4c0d-9224-a0d57a572bfc" containerName="assisted-installer-controller" Dec 05 12:31:46.429959 master-0 kubenswrapper[8731]: I1205 12:31:46.428552 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7807b90-1059-4c0d-9224-a0d57a572bfc" containerName="assisted-installer-controller" Dec 05 12:31:46.429959 master-0 kubenswrapper[8731]: I1205 12:31:46.428637 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7807b90-1059-4c0d-9224-a0d57a572bfc" containerName="assisted-installer-controller" Dec 05 12:31:46.429959 master-0 kubenswrapper[8731]: I1205 12:31:46.428652 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="b58aa15d-cdea-4a90-ba40-706d6a85735e" containerName="prober" Dec 05 12:31:46.429959 master-0 kubenswrapper[8731]: I1205 12:31:46.428958 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" Dec 05 12:31:46.450498 master-0 kubenswrapper[8731]: I1205 12:31:46.450446 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv"] Dec 05 12:31:46.477083 master-0 kubenswrapper[8731]: I1205 12:31:46.477019 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfl8f\" (UniqueName: \"kubernetes.io/projected/b9623eb8-55d2-4c5c-aa8d-74b6a27274d8-kube-api-access-hfl8f\") pod \"csi-snapshot-controller-6b958b6f94-7r5wv\" (UID: \"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" Dec 05 12:31:46.485069 master-0 kubenswrapper[8731]: I1205 12:31:46.485017 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:46.578394 master-0 kubenswrapper[8731]: I1205 12:31:46.578337 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:46.578394 master-0 kubenswrapper[8731]: I1205 12:31:46.578386 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:46.578663 master-0 kubenswrapper[8731]: I1205 12:31:46.578421 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:46.578663 master-0 kubenswrapper[8731]: I1205 12:31:46.578595 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:46.578663 master-0 kubenswrapper[8731]: I1205 12:31:46.578624 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:46.578663 master-0 kubenswrapper[8731]: E1205 12:31:46.578633 8731 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 05 
12:31:46.578774 master-0 kubenswrapper[8731]: E1205 12:31:46.578748 8731 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 05 12:31:46.578845 master-0 kubenswrapper[8731]: E1205 12:31:46.578824 8731 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 05 12:31:46.578882 master-0 kubenswrapper[8731]: E1205 12:31:46.578872 8731 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:46.578927 master-0 kubenswrapper[8731]: E1205 12:31:46.578911 8731 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 05 12:31:46.578927 master-0 kubenswrapper[8731]: E1205 12:31:46.578751 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:50.578720627 +0000 UTC m=+8.882704794 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:46.578996 master-0 kubenswrapper[8731]: E1205 12:31:46.578952 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert podName:0dda6d9b-cb3a-413a-85af-ef08f15ea42e nodeName:}" failed. No retries permitted until 2025-12-05 12:31:50.578935413 +0000 UTC m=+8.882919580 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-9vfxw" (UID: "0dda6d9b-cb3a-413a-85af-ef08f15ea42e") : secret "package-server-manager-serving-cert" not found Dec 05 12:31:46.578996 master-0 kubenswrapper[8731]: E1205 12:31:46.578963 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:50.578958143 +0000 UTC m=+8.882942300 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "node-tuning-operator-tls" not found Dec 05 12:31:46.578996 master-0 kubenswrapper[8731]: E1205 12:31:46.578974 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls podName:58187662-b502-4d90-95ce-2aa91a81d256 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:50.578968954 +0000 UTC m=+8.882953121 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-lgc7z" (UID: "58187662-b502-4d90-95ce-2aa91a81d256") : secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:46.578996 master-0 kubenswrapper[8731]: E1205 12:31:46.578986 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert podName:29812c4b-48ac-488c-863c-1d52e39ea2ae nodeName:}" failed. No retries permitted until 2025-12-05 12:31:50.578979004 +0000 UTC m=+8.882963171 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2chqh" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae") : secret "cluster-version-operator-serving-cert" not found Dec 05 12:31:46.579112 master-0 kubenswrapper[8731]: E1205 12:31:46.579020 8731 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:46.579112 master-0 kubenswrapper[8731]: E1205 12:31:46.579037 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls podName:a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:50.579032265 +0000 UTC m=+8.883016432 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls") pod "ingress-operator-8649c48786-7xrk6" (UID: "a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7") : secret "metrics-tls" not found Dec 05 12:31:46.579112 master-0 kubenswrapper[8731]: I1205 12:31:46.578650 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:46.579112 master-0 kubenswrapper[8731]: I1205 12:31:46.579063 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:46.579112 master-0 kubenswrapper[8731]: I1205 12:31:46.579081 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:46.579112 master-0 kubenswrapper[8731]: I1205 12:31:46.579101 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: 
\"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:46.579295 master-0 kubenswrapper[8731]: I1205 12:31:46.579125 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:46.579295 master-0 kubenswrapper[8731]: I1205 12:31:46.579147 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfl8f\" (UniqueName: \"kubernetes.io/projected/b9623eb8-55d2-4c5c-aa8d-74b6a27274d8-kube-api-access-hfl8f\") pod \"csi-snapshot-controller-6b958b6f94-7r5wv\" (UID: \"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" Dec 05 12:31:46.579295 master-0 kubenswrapper[8731]: I1205 12:31:46.579166 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:46.579295 master-0 kubenswrapper[8731]: E1205 12:31:46.579267 8731 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 05 12:31:46.579412 master-0 kubenswrapper[8731]: E1205 12:31:46.579305 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs podName:cfc37275-4e59-4f73-8b08-c8ca8ec28bbb nodeName:}" failed. No retries permitted until 2025-12-05 12:31:50.579292253 +0000 UTC m=+8.883276420 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs") pod "multus-admission-controller-7dfc5b745f-xlrzq" (UID: "cfc37275-4e59-4f73-8b08-c8ca8ec28bbb") : secret "multus-admission-controller-secret" not found Dec 05 12:31:46.579412 master-0 kubenswrapper[8731]: E1205 12:31:46.579352 8731 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 05 12:31:46.579412 master-0 kubenswrapper[8731]: E1205 12:31:46.579373 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics podName:1e6babfe-724a-4eab-bb3b-bc318bf57b70 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:50.579366405 +0000 UTC m=+8.883350572 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-vwhxt" (UID: "1e6babfe-724a-4eab-bb3b-bc318bf57b70") : secret "marketplace-operator-metrics" not found Dec 05 12:31:46.579502 master-0 kubenswrapper[8731]: E1205 12:31:46.579485 8731 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:46.579532 master-0 kubenswrapper[8731]: E1205 12:31:46.579510 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls podName:5f0c6889-0739-48a3-99cd-6db9d1f83242 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:50.579504138 +0000 UTC m=+8.883488305 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls") pod "dns-operator-7c56cf9b74-z9g7c" (UID: "5f0c6889-0739-48a3-99cd-6db9d1f83242") : secret "metrics-tls" not found Dec 05 12:31:46.579566 master-0 kubenswrapper[8731]: E1205 12:31:46.579543 8731 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 05 12:31:46.579566 master-0 kubenswrapper[8731]: E1205 12:31:46.579564 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs podName:fb7003a6-4341-49eb-bec3-76ba8610fa12 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:50.57955827 +0000 UTC m=+8.883542437 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs") pod "network-metrics-daemon-99djw" (UID: "fb7003a6-4341-49eb-bec3-76ba8610fa12") : secret "metrics-daemon-secret" not found Dec 05 12:31:46.579647 master-0 kubenswrapper[8731]: E1205 12:31:46.579622 8731 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 05 12:31:46.579683 master-0 kubenswrapper[8731]: E1205 12:31:46.579664 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls podName:38941513-e968-45f1-9cb2-b63d40338f36 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:50.579654202 +0000 UTC m=+8.883638559 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-sxxpq" (UID: "38941513-e968-45f1-9cb2-b63d40338f36") : secret "image-registry-operator-tls" not found Dec 05 12:31:46.620210 master-0 kubenswrapper[8731]: I1205 12:31:46.619917 8731 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 05 12:31:46.633171 master-0 kubenswrapper[8731]: I1205 12:31:46.631895 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfl8f\" (UniqueName: \"kubernetes.io/projected/b9623eb8-55d2-4c5c-aa8d-74b6a27274d8-kube-api-access-hfl8f\") pod \"csi-snapshot-controller-6b958b6f94-7r5wv\" (UID: \"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" Dec 05 12:31:46.666162 master-0 kubenswrapper[8731]: I1205 12:31:46.666088 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6"] Dec 05 12:31:46.666926 master-0 kubenswrapper[8731]: I1205 12:31:46.666900 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6" Dec 05 12:31:46.669968 master-0 kubenswrapper[8731]: I1205 12:31:46.669897 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 12:31:46.670206 master-0 kubenswrapper[8731]: I1205 12:31:46.670106 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 12:31:46.686741 master-0 kubenswrapper[8731]: I1205 12:31:46.686300 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6"] Dec 05 12:31:46.748504 master-0 kubenswrapper[8731]: I1205 12:31:46.748450 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" Dec 05 12:31:46.789206 master-0 kubenswrapper[8731]: I1205 12:31:46.785982 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48ns8\" (UniqueName: \"kubernetes.io/projected/480c1f6e-0e13-49f9-bc4e-07350842f16c-kube-api-access-48ns8\") pod \"migrator-74b7b57c65-fp4s6\" (UID: \"480c1f6e-0e13-49f9-bc4e-07350842f16c\") " pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6" Dec 05 12:31:46.814205 master-0 kubenswrapper[8731]: I1205 12:31:46.809801 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:46.830670 master-0 kubenswrapper[8731]: I1205 12:31:46.830604 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:46.887612 master-0 kubenswrapper[8731]: I1205 12:31:46.887133 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48ns8\" (UniqueName: \"kubernetes.io/projected/480c1f6e-0e13-49f9-bc4e-07350842f16c-kube-api-access-48ns8\") pod \"migrator-74b7b57c65-fp4s6\" (UID: \"480c1f6e-0e13-49f9-bc4e-07350842f16c\") " pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6" Dec 05 12:31:46.928506 master-0 kubenswrapper[8731]: I1205 12:31:46.928003 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48ns8\" (UniqueName: \"kubernetes.io/projected/480c1f6e-0e13-49f9-bc4e-07350842f16c-kube-api-access-48ns8\") pod \"migrator-74b7b57c65-fp4s6\" (UID: \"480c1f6e-0e13-49f9-bc4e-07350842f16c\") " 
pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6" Dec 05 12:31:46.989064 master-0 kubenswrapper[8731]: I1205 12:31:46.988986 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6" Dec 05 12:31:47.037028 master-0 kubenswrapper[8731]: I1205 12:31:47.034570 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv"] Dec 05 12:31:47.045060 master-0 kubenswrapper[8731]: W1205 12:31:47.044988 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9623eb8_55d2_4c5c_aa8d_74b6a27274d8.slice/crio-9cfdae6ccb167d4a6f250b34ce3b8d4ec56326be1aca0a0b497bcb1caa6ac3cf WatchSource:0}: Error finding container 9cfdae6ccb167d4a6f250b34ce3b8d4ec56326be1aca0a0b497bcb1caa6ac3cf: Status 404 returned error can't find the container with id 9cfdae6ccb167d4a6f250b34ce3b8d4ec56326be1aca0a0b497bcb1caa6ac3cf Dec 05 12:31:47.046397 master-0 kubenswrapper[8731]: I1205 12:31:47.046124 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:31:47.191280 master-0 kubenswrapper[8731]: I1205 12:31:47.191200 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6"] Dec 05 12:31:47.200691 master-0 kubenswrapper[8731]: W1205 12:31:47.200542 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod480c1f6e_0e13_49f9_bc4e_07350842f16c.slice/crio-54d1c55b3ab43714c6f9d30fae64742364176327ec7be4503594ab7c679b2007 WatchSource:0}: Error finding container 54d1c55b3ab43714c6f9d30fae64742364176327ec7be4503594ab7c679b2007: Status 404 returned error can't find the container with id 54d1c55b3ab43714c6f9d30fae64742364176327ec7be4503594ab7c679b2007 Dec 05 12:31:47.572424 master-0 kubenswrapper[8731]: I1205 12:31:47.571692 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc"] Dec 05 12:31:47.572424 master-0 kubenswrapper[8731]: I1205 12:31:47.572292 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:47.574958 master-0 kubenswrapper[8731]: I1205 12:31:47.574889 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 12:31:47.575086 master-0 kubenswrapper[8731]: I1205 12:31:47.575018 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 12:31:47.575266 master-0 kubenswrapper[8731]: I1205 12:31:47.574902 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 12:31:47.575501 master-0 kubenswrapper[8731]: I1205 12:31:47.575444 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 12:31:47.577972 master-0 kubenswrapper[8731]: I1205 12:31:47.577632 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 12:31:47.579158 master-0 kubenswrapper[8731]: I1205 12:31:47.579063 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 12:31:47.596462 master-0 kubenswrapper[8731]: I1205 12:31:47.594468 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc"] Dec 05 12:31:47.601011 master-0 kubenswrapper[8731]: I1205 12:31:47.600609 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-client-ca\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:47.601136 master-0 kubenswrapper[8731]: I1205 12:31:47.601051 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f94de527-4a20-4d5b-8688-a2c46566c3c1-serving-cert\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:47.601136 master-0 kubenswrapper[8731]: I1205 12:31:47.601130 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-config\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:47.601242 master-0 kubenswrapper[8731]: I1205 12:31:47.601151 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szgcz\" (UniqueName: \"kubernetes.io/projected/f94de527-4a20-4d5b-8688-a2c46566c3c1-kube-api-access-szgcz\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:47.601242 master-0 kubenswrapper[8731]: I1205 12:31:47.601200 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-proxy-ca-bundles\") pod 
\"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:47.702895 master-0 kubenswrapper[8731]: I1205 12:31:47.702827 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-config\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:47.702895 master-0 kubenswrapper[8731]: I1205 12:31:47.702886 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szgcz\" (UniqueName: \"kubernetes.io/projected/f94de527-4a20-4d5b-8688-a2c46566c3c1-kube-api-access-szgcz\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:47.703295 master-0 kubenswrapper[8731]: I1205 12:31:47.703058 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-proxy-ca-bundles\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:47.703295 master-0 kubenswrapper[8731]: E1205 12:31:47.703197 8731 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Dec 05 12:31:47.703374 master-0 kubenswrapper[8731]: I1205 12:31:47.703290 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-client-ca\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:47.703374 master-0 kubenswrapper[8731]: E1205 12:31:47.703317 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-proxy-ca-bundles podName:f94de527-4a20-4d5b-8688-a2c46566c3c1 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:48.203294141 +0000 UTC m=+6.507278308 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-proxy-ca-bundles") pod "controller-manager-77f4fc6d5d-9lbdc" (UID: "f94de527-4a20-4d5b-8688-a2c46566c3c1") : configmap "openshift-global-ca" not found Dec 05 12:31:47.703466 master-0 kubenswrapper[8731]: E1205 12:31:47.703337 8731 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Dec 05 12:31:47.703466 master-0 kubenswrapper[8731]: E1205 12:31:47.703412 8731 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 05 12:31:47.703466 master-0 kubenswrapper[8731]: E1205 12:31:47.703463 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-config podName:f94de527-4a20-4d5b-8688-a2c46566c3c1 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:48.203433354 +0000 UTC m=+6.507417521 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-config") pod "controller-manager-77f4fc6d5d-9lbdc" (UID: "f94de527-4a20-4d5b-8688-a2c46566c3c1") : configmap "config" not found Dec 05 12:31:47.703570 master-0 kubenswrapper[8731]: E1205 12:31:47.703488 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-client-ca podName:f94de527-4a20-4d5b-8688-a2c46566c3c1 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:48.203479396 +0000 UTC m=+6.507463563 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-client-ca") pod "controller-manager-77f4fc6d5d-9lbdc" (UID: "f94de527-4a20-4d5b-8688-a2c46566c3c1") : configmap "client-ca" not found Dec 05 12:31:47.703570 master-0 kubenswrapper[8731]: I1205 12:31:47.703543 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f94de527-4a20-4d5b-8688-a2c46566c3c1-serving-cert\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:47.703778 master-0 kubenswrapper[8731]: E1205 12:31:47.703752 8731 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:31:47.703822 master-0 kubenswrapper[8731]: E1205 12:31:47.703791 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f94de527-4a20-4d5b-8688-a2c46566c3c1-serving-cert podName:f94de527-4a20-4d5b-8688-a2c46566c3c1 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:48.203784884 +0000 UTC m=+6.507769051 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f94de527-4a20-4d5b-8688-a2c46566c3c1-serving-cert") pod "controller-manager-77f4fc6d5d-9lbdc" (UID: "f94de527-4a20-4d5b-8688-a2c46566c3c1") : secret "serving-cert" not found Dec 05 12:31:47.724544 master-0 kubenswrapper[8731]: I1205 12:31:47.724435 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szgcz\" (UniqueName: \"kubernetes.io/projected/f94de527-4a20-4d5b-8688-a2c46566c3c1-kube-api-access-szgcz\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:47.964955 master-0 kubenswrapper[8731]: I1205 12:31:47.963144 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:47.968424 master-0 kubenswrapper[8731]: I1205 12:31:47.968406 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:48.064988 master-0 kubenswrapper[8731]: I1205 12:31:48.064415 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6" event={"ID":"480c1f6e-0e13-49f9-bc4e-07350842f16c","Type":"ContainerStarted","Data":"54d1c55b3ab43714c6f9d30fae64742364176327ec7be4503594ab7c679b2007"} Dec 05 12:31:48.066998 master-0 kubenswrapper[8731]: I1205 12:31:48.066951 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" event={"ID":"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8","Type":"ContainerStarted","Data":"9cfdae6ccb167d4a6f250b34ce3b8d4ec56326be1aca0a0b497bcb1caa6ac3cf"} Dec 05 12:31:48.074976 master-0 kubenswrapper[8731]: I1205 12:31:48.072763 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:48.210313 master-0 kubenswrapper[8731]: I1205 12:31:48.209573 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-config\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:48.210313 master-0 kubenswrapper[8731]: E1205 12:31:48.209765 8731 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Dec 05 12:31:48.210313 master-0 kubenswrapper[8731]: I1205 12:31:48.209798 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-proxy-ca-bundles\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:48.210313 master-0 kubenswrapper[8731]: E1205 12:31:48.209898 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-config podName:f94de527-4a20-4d5b-8688-a2c46566c3c1 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:49.20987507 +0000 UTC m=+7.513859237 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-config") pod "controller-manager-77f4fc6d5d-9lbdc" (UID: "f94de527-4a20-4d5b-8688-a2c46566c3c1") : configmap "config" not found Dec 05 12:31:48.210313 master-0 kubenswrapper[8731]: I1205 12:31:48.209981 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-client-ca\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:48.210313 master-0 kubenswrapper[8731]: E1205 12:31:48.209980 8731 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Dec 05 12:31:48.210313 master-0 kubenswrapper[8731]: I1205 12:31:48.210054 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f94de527-4a20-4d5b-8688-a2c46566c3c1-serving-cert\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:48.210313 master-0 kubenswrapper[8731]: E1205 12:31:48.210110 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-proxy-ca-bundles podName:f94de527-4a20-4d5b-8688-a2c46566c3c1 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:49.210076715 +0000 UTC m=+7.514061042 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-proxy-ca-bundles") pod "controller-manager-77f4fc6d5d-9lbdc" (UID: "f94de527-4a20-4d5b-8688-a2c46566c3c1") : configmap "openshift-global-ca" not found Dec 05 12:31:48.210313 master-0 kubenswrapper[8731]: E1205 12:31:48.210208 8731 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:31:48.210313 master-0 kubenswrapper[8731]: E1205 12:31:48.210216 8731 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 05 12:31:48.210313 master-0 kubenswrapper[8731]: E1205 12:31:48.210285 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f94de527-4a20-4d5b-8688-a2c46566c3c1-serving-cert podName:f94de527-4a20-4d5b-8688-a2c46566c3c1 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:49.21026598 +0000 UTC m=+7.514250147 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f94de527-4a20-4d5b-8688-a2c46566c3c1-serving-cert") pod "controller-manager-77f4fc6d5d-9lbdc" (UID: "f94de527-4a20-4d5b-8688-a2c46566c3c1") : secret "serving-cert" not found Dec 05 12:31:48.210313 master-0 kubenswrapper[8731]: E1205 12:31:48.210333 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-client-ca podName:f94de527-4a20-4d5b-8688-a2c46566c3c1 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:49.210326431 +0000 UTC m=+7.514310598 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-client-ca") pod "controller-manager-77f4fc6d5d-9lbdc" (UID: "f94de527-4a20-4d5b-8688-a2c46566c3c1") : configmap "client-ca" not found Dec 05 12:31:49.223949 master-0 kubenswrapper[8731]: I1205 12:31:49.223904 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-config\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:49.224536 master-0 kubenswrapper[8731]: I1205 12:31:49.224517 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-proxy-ca-bundles\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:49.224669 master-0 kubenswrapper[8731]: I1205 12:31:49.224655 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-client-ca\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:49.224739 master-0 kubenswrapper[8731]: E1205 12:31:49.224067 8731 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Dec 05 12:31:49.224841 master-0 kubenswrapper[8731]: E1205 12:31:49.224831 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-config podName:f94de527-4a20-4d5b-8688-a2c46566c3c1 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:51.224809955 +0000 UTC m=+9.528794122 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-config") pod "controller-manager-77f4fc6d5d-9lbdc" (UID: "f94de527-4a20-4d5b-8688-a2c46566c3c1") : configmap "config" not found Dec 05 12:31:49.224923 master-0 kubenswrapper[8731]: E1205 12:31:49.224599 8731 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Dec 05 12:31:49.224989 master-0 kubenswrapper[8731]: E1205 12:31:49.224969 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-proxy-ca-bundles podName:f94de527-4a20-4d5b-8688-a2c46566c3c1 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:51.224945278 +0000 UTC m=+9.528929485 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-proxy-ca-bundles") pod "controller-manager-77f4fc6d5d-9lbdc" (UID: "f94de527-4a20-4d5b-8688-a2c46566c3c1") : configmap "openshift-global-ca" not found Dec 05 12:31:49.225092 master-0 kubenswrapper[8731]: I1205 12:31:49.225076 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f94de527-4a20-4d5b-8688-a2c46566c3c1-serving-cert\") pod \"controller-manager-77f4fc6d5d-9lbdc\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:49.225251 master-0 kubenswrapper[8731]: E1205 12:31:49.225213 8731 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:31:49.225295 master-0 kubenswrapper[8731]: E1205 12:31:49.225282 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f94de527-4a20-4d5b-8688-a2c46566c3c1-serving-cert podName:f94de527-4a20-4d5b-8688-a2c46566c3c1 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:51.225268636 +0000 UTC m=+9.529252823 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f94de527-4a20-4d5b-8688-a2c46566c3c1-serving-cert") pod "controller-manager-77f4fc6d5d-9lbdc" (UID: "f94de527-4a20-4d5b-8688-a2c46566c3c1") : secret "serving-cert" not found Dec 05 12:31:49.225333 master-0 kubenswrapper[8731]: E1205 12:31:49.225291 8731 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 05 12:31:49.225470 master-0 kubenswrapper[8731]: E1205 12:31:49.225444 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-client-ca podName:f94de527-4a20-4d5b-8688-a2c46566c3c1 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:51.2254042 +0000 UTC m=+9.529388397 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-client-ca") pod "controller-manager-77f4fc6d5d-9lbdc" (UID: "f94de527-4a20-4d5b-8688-a2c46566c3c1") : configmap "client-ca" not found Dec 05 12:31:49.437955 master-0 kubenswrapper[8731]: I1205 12:31:49.437876 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:49.438314 master-0 kubenswrapper[8731]: I1205 12:31:49.438079 8731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:31:49.438314 master-0 kubenswrapper[8731]: I1205 12:31:49.438092 8731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:31:49.466026 master-0 kubenswrapper[8731]: I1205 12:31:49.465967 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:49.780324 master-0 kubenswrapper[8731]: I1205 12:31:49.779538 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc"] Dec 05 12:31:49.780324 master-0 kubenswrapper[8731]: E1205 12:31:49.780192 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" podUID="f94de527-4a20-4d5b-8688-a2c46566c3c1" Dec 05 12:31:49.790776 master-0 kubenswrapper[8731]: I1205 12:31:49.788111 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q"] Dec 05 12:31:49.790776 master-0 kubenswrapper[8731]: I1205 12:31:49.788690 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:49.790987 master-0 kubenswrapper[8731]: I1205 12:31:49.790845 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 12:31:49.790987 master-0 kubenswrapper[8731]: I1205 12:31:49.790971 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 12:31:49.795317 master-0 kubenswrapper[8731]: I1205 12:31:49.791151 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 12:31:49.795317 master-0 kubenswrapper[8731]: I1205 12:31:49.791690 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 12:31:49.795317 master-0 kubenswrapper[8731]: I1205 12:31:49.791914 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 12:31:49.801612 master-0 kubenswrapper[8731]: I1205 12:31:49.801564 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q"] Dec 05 12:31:49.834130 master-0 kubenswrapper[8731]: I1205 12:31:49.834049 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-config\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:49.834432 master-0 kubenswrapper[8731]: I1205 12:31:49.834340 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:49.834432 master-0 kubenswrapper[8731]: I1205 12:31:49.834403 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:49.834522 master-0 kubenswrapper[8731]: I1205 12:31:49.834435 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7hvk\" (UniqueName: \"kubernetes.io/projected/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-kube-api-access-w7hvk\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:49.935867 master-0 kubenswrapper[8731]: I1205 12:31:49.935774 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-config\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " 
pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:49.935867 master-0 kubenswrapper[8731]: I1205 12:31:49.935878 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:49.936225 master-0 kubenswrapper[8731]: I1205 12:31:49.936045 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:49.936390 master-0 kubenswrapper[8731]: E1205 12:31:49.936309 8731 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:31:49.936474 master-0 kubenswrapper[8731]: E1205 12:31:49.936440 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert podName:d6ab6d58-0346-4c64-908a-ef06ebf2e1c5 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:50.436416599 +0000 UTC m=+8.740400766 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert") pod "route-controller-manager-6fcbfdfc87-5fw4q" (UID: "d6ab6d58-0346-4c64-908a-ef06ebf2e1c5") : secret "serving-cert" not found Dec 05 12:31:49.936640 master-0 kubenswrapper[8731]: E1205 12:31:49.936597 8731 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 05 12:31:49.936698 master-0 kubenswrapper[8731]: E1205 12:31:49.936683 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca podName:d6ab6d58-0346-4c64-908a-ef06ebf2e1c5 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:50.436657296 +0000 UTC m=+8.740641463 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca") pod "route-controller-manager-6fcbfdfc87-5fw4q" (UID: "d6ab6d58-0346-4c64-908a-ef06ebf2e1c5") : configmap "client-ca" not found Dec 05 12:31:49.936741 master-0 kubenswrapper[8731]: I1205 12:31:49.936594 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7hvk\" (UniqueName: \"kubernetes.io/projected/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-kube-api-access-w7hvk\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:49.939442 master-0 kubenswrapper[8731]: I1205 12:31:49.937717 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-config\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:49.956510 master-0 kubenswrapper[8731]: I1205 12:31:49.956458 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7hvk\" (UniqueName: \"kubernetes.io/projected/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-kube-api-access-w7hvk\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:50.076889 master-0 kubenswrapper[8731]: I1205 12:31:50.076736 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:50.076889 master-0 kubenswrapper[8731]: I1205 12:31:50.076720 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nwplt" event={"ID":"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6","Type":"ContainerStarted","Data":"00da80bef6b48eb1abc64fad064f00d41d97860ac5c2f760a1238efd21d8e70d"} Dec 05 12:31:50.077130 master-0 kubenswrapper[8731]: I1205 12:31:50.076899 8731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:31:50.086651 master-0 kubenswrapper[8731]: I1205 12:31:50.086602 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:50.139225 master-0 kubenswrapper[8731]: I1205 12:31:50.139140 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szgcz\" (UniqueName: \"kubernetes.io/projected/f94de527-4a20-4d5b-8688-a2c46566c3c1-kube-api-access-szgcz\") pod \"f94de527-4a20-4d5b-8688-a2c46566c3c1\" (UID: \"f94de527-4a20-4d5b-8688-a2c46566c3c1\") " Dec 05 12:31:50.144939 master-0 kubenswrapper[8731]: I1205 12:31:50.144857 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f94de527-4a20-4d5b-8688-a2c46566c3c1-kube-api-access-szgcz" (OuterVolumeSpecName: "kube-api-access-szgcz") pod "f94de527-4a20-4d5b-8688-a2c46566c3c1" (UID: "f94de527-4a20-4d5b-8688-a2c46566c3c1"). InnerVolumeSpecName "kube-api-access-szgcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:31:50.242815 master-0 kubenswrapper[8731]: I1205 12:31:50.242739 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szgcz\" (UniqueName: \"kubernetes.io/projected/f94de527-4a20-4d5b-8688-a2c46566c3c1-kube-api-access-szgcz\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:50.445510 master-0 kubenswrapper[8731]: E1205 12:31:50.445435 8731 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:31:50.445883 master-0 kubenswrapper[8731]: E1205 12:31:50.445548 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert podName:d6ab6d58-0346-4c64-908a-ef06ebf2e1c5 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:51.445522195 +0000 UTC m=+9.749506362 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert") pod "route-controller-manager-6fcbfdfc87-5fw4q" (UID: "d6ab6d58-0346-4c64-908a-ef06ebf2e1c5") : secret "serving-cert" not found Dec 05 12:31:50.446128 master-0 kubenswrapper[8731]: I1205 12:31:50.445272 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:50.446268 master-0 kubenswrapper[8731]: I1205 12:31:50.446143 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:50.446268 master-0 kubenswrapper[8731]: E1205 12:31:50.446243 8731 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 05 12:31:50.446376 master-0 kubenswrapper[8731]: E1205 12:31:50.446289 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca podName:d6ab6d58-0346-4c64-908a-ef06ebf2e1c5 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:51.446276274 +0000 UTC m=+9.750260441 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca") pod "route-controller-manager-6fcbfdfc87-5fw4q" (UID: "d6ab6d58-0346-4c64-908a-ef06ebf2e1c5") : configmap "client-ca" not found Dec 05 12:31:50.515548 master-0 kubenswrapper[8731]: I1205 12:31:50.515444 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:50.522710 master-0 kubenswrapper[8731]: I1205 12:31:50.522672 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:50.648806 master-0 kubenswrapper[8731]: I1205 12:31:50.648719 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:50.648806 master-0 kubenswrapper[8731]: I1205 12:31:50.648793 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:50.649326 master-0 kubenswrapper[8731]: I1205 12:31:50.648842 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:50.649326 master-0 kubenswrapper[8731]: I1205 12:31:50.648919 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:50.649326 master-0 kubenswrapper[8731]: I1205 12:31:50.648946 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:50.649326 master-0 kubenswrapper[8731]: I1205 12:31:50.648980 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:50.649326 master-0 kubenswrapper[8731]: E1205 12:31:50.648999 8731 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: 
secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:50.649326 master-0 kubenswrapper[8731]: I1205 12:31:50.649011 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:50.649326 master-0 kubenswrapper[8731]: E1205 12:31:50.649106 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls podName:58187662-b502-4d90-95ce-2aa91a81d256 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:58.649076452 +0000 UTC m=+16.953060659 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-lgc7z" (UID: "58187662-b502-4d90-95ce-2aa91a81d256") : secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:50.649326 master-0 kubenswrapper[8731]: I1205 12:31:50.649145 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:50.649326 master-0 kubenswrapper[8731]: E1205 12:31:50.649190 8731 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 05 12:31:50.649326 master-0 kubenswrapper[8731]: E1205 12:31:50.649233 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert podName:29812c4b-48ac-488c-863c-1d52e39ea2ae nodeName:}" failed. No retries permitted until 2025-12-05 12:31:58.649221296 +0000 UTC m=+16.953205463 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2chqh" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae") : secret "cluster-version-operator-serving-cert" not found Dec 05 12:31:50.649326 master-0 kubenswrapper[8731]: E1205 12:31:50.649299 8731 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 05 12:31:50.649326 master-0 kubenswrapper[8731]: E1205 12:31:50.649348 8731 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: E1205 12:31:50.649377 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics podName:1e6babfe-724a-4eab-bb3b-bc318bf57b70 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:58.649368209 +0000 UTC m=+16.953352376 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-vwhxt" (UID: "1e6babfe-724a-4eab-bb3b-bc318bf57b70") : secret "marketplace-operator-metrics" not found Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: E1205 12:31:50.649416 8731 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: E1205 12:31:50.649444 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs podName:fb7003a6-4341-49eb-bec3-76ba8610fa12 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:58.64940151 +0000 UTC m=+16.953385817 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs") pod "network-metrics-daemon-99djw" (UID: "fb7003a6-4341-49eb-bec3-76ba8610fa12") : secret "metrics-daemon-secret" not found Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: E1205 12:31:50.649450 8731 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: I1205 12:31:50.649488 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: E1205 12:31:50.649330 8731 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: E1205 12:31:50.649504 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs podName:cfc37275-4e59-4f73-8b08-c8ca8ec28bbb nodeName:}" failed. No retries permitted until 2025-12-05 12:31:58.649487704 +0000 UTC m=+16.953471871 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs") pod "multus-admission-controller-7dfc5b745f-xlrzq" (UID: "cfc37275-4e59-4f73-8b08-c8ca8ec28bbb") : secret "multus-admission-controller-secret" not found Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: E1205 12:31:50.649547 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls podName:38941513-e968-45f1-9cb2-b63d40338f36 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:58.649535925 +0000 UTC m=+16.953520092 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-sxxpq" (UID: "38941513-e968-45f1-9cb2-b63d40338f36") : secret "image-registry-operator-tls" not found Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: E1205 12:31:50.649323 8731 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: I1205 12:31:50.649566 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: E1205 12:31:50.649620 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert podName:0dda6d9b-cb3a-413a-85af-ef08f15ea42e nodeName:}" failed. No retries permitted until 2025-12-05 12:31:58.649612777 +0000 UTC m=+16.953596934 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-9vfxw" (UID: "0dda6d9b-cb3a-413a-85af-ef08f15ea42e") : secret "package-server-manager-serving-cert" not found Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: E1205 12:31:50.649660 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls podName:5f0c6889-0739-48a3-99cd-6db9d1f83242 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:58.649628457 +0000 UTC m=+16.953612624 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls") pod "dns-operator-7c56cf9b74-z9g7c" (UID: "5f0c6889-0739-48a3-99cd-6db9d1f83242") : secret "metrics-tls" not found Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: E1205 12:31:50.649666 8731 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: E1205 12:31:50.649687 8731 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: I1205 12:31:50.649711 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: E1205 12:31:50.649770 8731 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: E1205 12:31:50.649713 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls podName:a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:58.649705569 +0000 UTC m=+16.953689726 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls") pod "ingress-operator-8649c48786-7xrk6" (UID: "a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7") : secret "metrics-tls" not found Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: E1205 12:31:50.649858 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:58.649800001 +0000 UTC m=+16.953784328 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:50.649854 master-0 kubenswrapper[8731]: E1205 12:31:50.649893 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:58.649879364 +0000 UTC m=+16.953863741 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "node-tuning-operator-tls" not found Dec 05 12:31:51.084138 master-0 kubenswrapper[8731]: I1205 12:31:51.084042 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" event={"ID":"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8","Type":"ContainerStarted","Data":"5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57"} Dec 05 12:31:51.087338 master-0 kubenswrapper[8731]: I1205 12:31:51.087278 8731 generic.go:334] "Generic (PLEG): container finished" podID="ce3d73c1-f4bd-4c91-936a-086dfa5e3460" containerID="020f4fb4f4314f00ea400478b93e32903a1a30b5d332647ebe9614d7f944a537" exitCode=0 Dec 05 12:31:51.087426 master-0 kubenswrapper[8731]: I1205 12:31:51.087384 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" event={"ID":"ce3d73c1-f4bd-4c91-936a-086dfa5e3460","Type":"ContainerDied","Data":"020f4fb4f4314f00ea400478b93e32903a1a30b5d332647ebe9614d7f944a537"} Dec 05 12:31:51.089819 master-0 kubenswrapper[8731]: I1205 12:31:51.089726 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6" event={"ID":"480c1f6e-0e13-49f9-bc4e-07350842f16c","Type":"ContainerStarted","Data":"e30c55a9fab66df10956ae03c408ba3a127fd7d10e3d72d2eb92d23500a928bc"} Dec 05 12:31:51.089819 master-0 kubenswrapper[8731]: I1205 12:31:51.089747 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc" Dec 05 12:31:51.101299 master-0 kubenswrapper[8731]: I1205 12:31:51.101233 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:31:51.106131 master-0 kubenswrapper[8731]: I1205 12:31:51.105992 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" podStartSLOduration=1.571949351 podStartE2EDuration="5.105947675s" podCreationTimestamp="2025-12-05 12:31:46 +0000 UTC" firstStartedPulling="2025-12-05 12:31:47.049961778 +0000 UTC m=+5.353945945" lastFinishedPulling="2025-12-05 12:31:50.583960072 +0000 UTC m=+8.887944269" observedRunningTime="2025-12-05 12:31:51.10345327 +0000 UTC m=+9.407437457" watchObservedRunningTime="2025-12-05 12:31:51.105947675 +0000 UTC m=+9.409931882" Dec 05 12:31:51.127864 master-0 kubenswrapper[8731]: I1205 12:31:51.127782 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:51.128328 master-0 kubenswrapper[8731]: I1205 12:31:51.128094 8731 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:31:51.187239 master-0 kubenswrapper[8731]: I1205 12:31:51.187152 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc"] Dec 05 12:31:51.192742 master-0 kubenswrapper[8731]: I1205 12:31:51.192672 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77f4fc6d5d-9lbdc"] Dec 05 12:31:51.195808 master-0 kubenswrapper[8731]: I1205 12:31:51.195663 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:31:51.201856 master-0 kubenswrapper[8731]: I1205 12:31:51.201016 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw"] Dec 05 12:31:51.201856 master-0 kubenswrapper[8731]: I1205 12:31:51.201766 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:51.204620 master-0 kubenswrapper[8731]: I1205 12:31:51.204237 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 12:31:51.204620 master-0 kubenswrapper[8731]: I1205 12:31:51.204451 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 12:31:51.206754 master-0 kubenswrapper[8731]: I1205 12:31:51.205487 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 12:31:51.206754 master-0 kubenswrapper[8731]: I1205 12:31:51.205769 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 12:31:51.206754 master-0 kubenswrapper[8731]: I1205 12:31:51.206120 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 12:31:51.215994 master-0 kubenswrapper[8731]: I1205 12:31:51.215944 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 12:31:51.216333 master-0 kubenswrapper[8731]: I1205 12:31:51.216310 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw"] Dec 05 12:31:51.257743 master-0 kubenswrapper[8731]: I1205 12:31:51.257654 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f9s4\" (UniqueName: \"kubernetes.io/projected/96b47a46-dd09-41f4-83e3-f2548e64915b-kube-api-access-6f9s4\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:51.258806 master-0 kubenswrapper[8731]: I1205 12:31:51.257954 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:51.258806 master-0 kubenswrapper[8731]: I1205 12:31:51.258228 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-proxy-ca-bundles\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:51.258806 master-0 kubenswrapper[8731]: I1205 12:31:51.258786 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:51.258973 master-0 kubenswrapper[8731]: I1205 12:31:51.258824 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-config\") pod 
\"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:51.258973 master-0 kubenswrapper[8731]: I1205 12:31:51.258956 8731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:51.259046 master-0 kubenswrapper[8731]: I1205 12:31:51.258977 8731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f94de527-4a20-4d5b-8688-a2c46566c3c1-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:51.259046 master-0 kubenswrapper[8731]: I1205 12:31:51.258992 8731 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:51.259046 master-0 kubenswrapper[8731]: I1205 12:31:51.259002 8731 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f94de527-4a20-4d5b-8688-a2c46566c3c1-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 05 12:31:51.363139 master-0 kubenswrapper[8731]: I1205 12:31:51.363060 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:51.363139 master-0 kubenswrapper[8731]: I1205 12:31:51.363133 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-config\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:51.363461 master-0 kubenswrapper[8731]: I1205 12:31:51.363364 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f9s4\" (UniqueName: \"kubernetes.io/projected/96b47a46-dd09-41f4-83e3-f2548e64915b-kube-api-access-6f9s4\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:51.363461 master-0 kubenswrapper[8731]: E1205 12:31:51.363405 8731 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:31:51.363578 master-0 kubenswrapper[8731]: E1205 12:31:51.363549 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert podName:96b47a46-dd09-41f4-83e3-f2548e64915b nodeName:}" failed. No retries permitted until 2025-12-05 12:31:51.863512832 +0000 UTC m=+10.167497139 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert") pod "controller-manager-6c78fb97bf-4vsqw" (UID: "96b47a46-dd09-41f4-83e3-f2548e64915b") : secret "serving-cert" not found Dec 05 12:31:51.363864 master-0 kubenswrapper[8731]: I1205 12:31:51.363826 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:51.364127 master-0 kubenswrapper[8731]: E1205 12:31:51.364075 8731 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 05 12:31:51.364197 master-0 kubenswrapper[8731]: I1205 12:31:51.364123 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-proxy-ca-bundles\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:51.364238 master-0 kubenswrapper[8731]: E1205 12:31:51.364221 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca podName:96b47a46-dd09-41f4-83e3-f2548e64915b nodeName:}" failed. No retries permitted until 2025-12-05 12:31:51.864190079 +0000 UTC m=+10.168174246 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca") pod "controller-manager-6c78fb97bf-4vsqw" (UID: "96b47a46-dd09-41f4-83e3-f2548e64915b") : configmap "client-ca" not found Dec 05 12:31:51.364976 master-0 kubenswrapper[8731]: I1205 12:31:51.364939 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-config\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:51.365943 master-0 kubenswrapper[8731]: I1205 12:31:51.365898 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-proxy-ca-bundles\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:51.389663 master-0 kubenswrapper[8731]: I1205 12:31:51.389594 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f9s4\" (UniqueName: \"kubernetes.io/projected/96b47a46-dd09-41f4-83e3-f2548e64915b-kube-api-access-6f9s4\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:51.465306 master-0 kubenswrapper[8731]: I1205 12:31:51.465141 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert\") pod 
\"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:51.465306 master-0 kubenswrapper[8731]: I1205 12:31:51.465226 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:51.465600 master-0 kubenswrapper[8731]: E1205 12:31:51.465378 8731 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 05 12:31:51.465600 master-0 kubenswrapper[8731]: E1205 12:31:51.465447 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca podName:d6ab6d58-0346-4c64-908a-ef06ebf2e1c5 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:53.465427419 +0000 UTC m=+11.769411586 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca") pod "route-controller-manager-6fcbfdfc87-5fw4q" (UID: "d6ab6d58-0346-4c64-908a-ef06ebf2e1c5") : configmap "client-ca" not found Dec 05 12:31:51.465600 master-0 kubenswrapper[8731]: E1205 12:31:51.465451 8731 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:31:51.465695 master-0 kubenswrapper[8731]: E1205 12:31:51.465635 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert podName:d6ab6d58-0346-4c64-908a-ef06ebf2e1c5 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:53.465598243 +0000 UTC m=+11.769582440 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert") pod "route-controller-manager-6fcbfdfc87-5fw4q" (UID: "d6ab6d58-0346-4c64-908a-ef06ebf2e1c5") : secret "serving-cert" not found Dec 05 12:31:51.869866 master-0 kubenswrapper[8731]: I1205 12:31:51.869739 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:51.870091 master-0 kubenswrapper[8731]: E1205 12:31:51.869965 8731 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 05 12:31:51.870127 master-0 kubenswrapper[8731]: E1205 12:31:51.870114 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca podName:96b47a46-dd09-41f4-83e3-f2548e64915b nodeName:}" failed. No retries permitted until 2025-12-05 12:31:52.87007828 +0000 UTC m=+11.174062487 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca") pod "controller-manager-6c78fb97bf-4vsqw" (UID: "96b47a46-dd09-41f4-83e3-f2548e64915b") : configmap "client-ca" not found Dec 05 12:31:51.870328 master-0 kubenswrapper[8731]: I1205 12:31:51.870270 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:51.870692 master-0 kubenswrapper[8731]: E1205 12:31:51.870660 8731 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:31:51.870901 master-0 kubenswrapper[8731]: E1205 12:31:51.870888 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert podName:96b47a46-dd09-41f4-83e3-f2548e64915b nodeName:}" failed. No retries permitted until 2025-12-05 12:31:52.87085527 +0000 UTC m=+11.174839627 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert") pod "controller-manager-6c78fb97bf-4vsqw" (UID: "96b47a46-dd09-41f4-83e3-f2548e64915b") : secret "serving-cert" not found Dec 05 12:31:51.942668 master-0 kubenswrapper[8731]: I1205 12:31:51.942574 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f94de527-4a20-4d5b-8688-a2c46566c3c1" path="/var/lib/kubelet/pods/f94de527-4a20-4d5b-8688-a2c46566c3c1/volumes" Dec 05 12:31:52.097332 master-0 kubenswrapper[8731]: I1205 12:31:52.097242 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6" event={"ID":"480c1f6e-0e13-49f9-bc4e-07350842f16c","Type":"ContainerStarted","Data":"c3d80b69dcfc87067aaae63f00809fa404e99554c3b19017580f5646450199ef"} Dec 05 12:31:52.113141 master-0 kubenswrapper[8731]: I1205 12:31:52.113017 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6" podStartSLOduration=2.739451292 podStartE2EDuration="6.112982411s" podCreationTimestamp="2025-12-05 12:31:46 +0000 UTC" firstStartedPulling="2025-12-05 12:31:47.203524031 +0000 UTC m=+5.507508198" lastFinishedPulling="2025-12-05 12:31:50.57705515 +0000 UTC m=+8.881039317" observedRunningTime="2025-12-05 12:31:52.11065499 +0000 UTC m=+10.414639197" watchObservedRunningTime="2025-12-05 12:31:52.112982411 +0000 UTC m=+10.416966618" Dec 05 12:31:52.885071 master-0 kubenswrapper[8731]: I1205 12:31:52.884980 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:52.885941 master-0 kubenswrapper[8731]: E1205 12:31:52.885214 8731 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 05 12:31:52.885941 master-0 kubenswrapper[8731]: E1205 12:31:52.885325 8731 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca podName:96b47a46-dd09-41f4-83e3-f2548e64915b nodeName:}" failed. No retries permitted until 2025-12-05 12:31:54.885294441 +0000 UTC m=+13.189278608 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca") pod "controller-manager-6c78fb97bf-4vsqw" (UID: "96b47a46-dd09-41f4-83e3-f2548e64915b") : configmap "client-ca" not found Dec 05 12:31:52.885941 master-0 kubenswrapper[8731]: I1205 12:31:52.885316 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:52.885941 master-0 kubenswrapper[8731]: E1205 12:31:52.885561 8731 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:31:52.885941 master-0 kubenswrapper[8731]: E1205 12:31:52.885737 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert podName:96b47a46-dd09-41f4-83e3-f2548e64915b nodeName:}" failed. No retries permitted until 2025-12-05 12:31:54.885717632 +0000 UTC m=+13.189701799 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert") pod "controller-manager-6c78fb97bf-4vsqw" (UID: "96b47a46-dd09-41f4-83e3-f2548e64915b") : secret "serving-cert" not found Dec 05 12:31:53.501336 master-0 kubenswrapper[8731]: I1205 12:31:53.501221 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:53.501336 master-0 kubenswrapper[8731]: I1205 12:31:53.501341 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:53.501748 master-0 kubenswrapper[8731]: E1205 12:31:53.501675 8731 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:31:53.501799 master-0 kubenswrapper[8731]: E1205 12:31:53.501746 8731 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 05 12:31:53.501904 master-0 kubenswrapper[8731]: E1205 12:31:53.501873 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert podName:d6ab6d58-0346-4c64-908a-ef06ebf2e1c5 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:57.501826498 +0000 UTC m=+15.805810705 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert") pod "route-controller-manager-6fcbfdfc87-5fw4q" (UID: "d6ab6d58-0346-4c64-908a-ef06ebf2e1c5") : secret "serving-cert" not found Dec 05 12:31:53.501972 master-0 kubenswrapper[8731]: E1205 12:31:53.501920 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca podName:d6ab6d58-0346-4c64-908a-ef06ebf2e1c5 nodeName:}" failed. No retries permitted until 2025-12-05 12:31:57.50190402 +0000 UTC m=+15.805888227 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca") pod "route-controller-manager-6fcbfdfc87-5fw4q" (UID: "d6ab6d58-0346-4c64-908a-ef06ebf2e1c5") : configmap "client-ca" not found Dec 05 12:31:54.107360 master-0 kubenswrapper[8731]: I1205 12:31:54.106680 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" event={"ID":"ce3d73c1-f4bd-4c91-936a-086dfa5e3460","Type":"ContainerStarted","Data":"e661aee8169481bd45ddc453eab7e9b725569fcef2029fd7e4e16d66fbcedf39"} Dec 05 12:31:54.921143 master-0 kubenswrapper[8731]: I1205 12:31:54.921076 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:54.921419 master-0 kubenswrapper[8731]: I1205 12:31:54.921204 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:54.921686 master-0 kubenswrapper[8731]: E1205 12:31:54.921523 8731 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 05 12:31:54.921766 master-0 kubenswrapper[8731]: E1205 12:31:54.921587 8731 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:31:54.921848 master-0 kubenswrapper[8731]: E1205 12:31:54.921792 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca podName:96b47a46-dd09-41f4-83e3-f2548e64915b nodeName:}" failed. No retries permitted until 2025-12-05 12:31:58.92171311 +0000 UTC m=+17.225697277 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca") pod "controller-manager-6c78fb97bf-4vsqw" (UID: "96b47a46-dd09-41f4-83e3-f2548e64915b") : configmap "client-ca" not found Dec 05 12:31:54.921848 master-0 kubenswrapper[8731]: E1205 12:31:54.921811 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert podName:96b47a46-dd09-41f4-83e3-f2548e64915b nodeName:}" failed. No retries permitted until 2025-12-05 12:31:58.921802162 +0000 UTC m=+17.225786329 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert") pod "controller-manager-6c78fb97bf-4vsqw" (UID: "96b47a46-dd09-41f4-83e3-f2548e64915b") : secret "serving-cert" not found Dec 05 12:31:56.122015 master-0 kubenswrapper[8731]: I1205 12:31:56.121854 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" event={"ID":"ba095394-1873-4793-969d-3be979fa0771","Type":"ContainerStarted","Data":"a4430062c5adda1c62354e9a698c163c97a33327be32fd67d0fc627123050dbf"} Dec 05 12:31:57.557056 master-0 kubenswrapper[8731]: I1205 12:31:57.556948 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:57.557800 master-0 kubenswrapper[8731]: E1205 12:31:57.557290 8731 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:31:57.557800 master-0 kubenswrapper[8731]: I1205 12:31:57.557392 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:31:57.557800 master-0 kubenswrapper[8731]: E1205 12:31:57.557471 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert podName:d6ab6d58-0346-4c64-908a-ef06ebf2e1c5 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:05.557426794 +0000 UTC m=+23.861411111 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert") pod "route-controller-manager-6fcbfdfc87-5fw4q" (UID: "d6ab6d58-0346-4c64-908a-ef06ebf2e1c5") : secret "serving-cert" not found Dec 05 12:31:57.557800 master-0 kubenswrapper[8731]: E1205 12:31:57.557624 8731 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 05 12:31:57.557800 master-0 kubenswrapper[8731]: E1205 12:31:57.557724 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca podName:d6ab6d58-0346-4c64-908a-ef06ebf2e1c5 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:05.557694041 +0000 UTC m=+23.861678248 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca") pod "route-controller-manager-6fcbfdfc87-5fw4q" (UID: "d6ab6d58-0346-4c64-908a-ef06ebf2e1c5") : configmap "client-ca" not found Dec 05 12:31:58.671047 master-0 kubenswrapper[8731]: I1205 12:31:58.670949 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:31:58.671047 master-0 kubenswrapper[8731]: I1205 12:31:58.671023 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:31:58.671749 master-0 kubenswrapper[8731]: I1205 12:31:58.671211 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:31:58.671749 master-0 kubenswrapper[8731]: I1205 12:31:58.671237 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:31:58.671749 master-0 kubenswrapper[8731]: I1205 12:31:58.671293 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:31:58.671749 master-0 kubenswrapper[8731]: E1205 12:31:58.671331 8731 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 05 12:31:58.671749 master-0 kubenswrapper[8731]: E1205 12:31:58.671459 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics podName:1e6babfe-724a-4eab-bb3b-bc318bf57b70 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:14.6714254 +0000 UTC m=+32.975409567 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-vwhxt" (UID: "1e6babfe-724a-4eab-bb3b-bc318bf57b70") : secret "marketplace-operator-metrics" not found Dec 05 12:31:58.671749 master-0 kubenswrapper[8731]: E1205 12:31:58.671477 8731 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 05 12:31:58.671749 master-0 kubenswrapper[8731]: E1205 12:31:58.671592 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs podName:fb7003a6-4341-49eb-bec3-76ba8610fa12 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:14.671561614 +0000 UTC m=+32.975545821 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs") pod "network-metrics-daemon-99djw" (UID: "fb7003a6-4341-49eb-bec3-76ba8610fa12") : secret "metrics-daemon-secret" not found Dec 05 12:31:58.671749 master-0 kubenswrapper[8731]: E1205 12:31:58.671608 8731 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 05 12:31:58.671749 master-0 kubenswrapper[8731]: E1205 12:31:58.671651 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs podName:cfc37275-4e59-4f73-8b08-c8ca8ec28bbb nodeName:}" failed. No retries permitted until 2025-12-05 12:32:14.671640826 +0000 UTC m=+32.975624993 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs") pod "multus-admission-controller-7dfc5b745f-xlrzq" (UID: "cfc37275-4e59-4f73-8b08-c8ca8ec28bbb") : secret "multus-admission-controller-secret" not found Dec 05 12:31:58.671749 master-0 kubenswrapper[8731]: E1205 12:31:58.671707 8731 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 05 12:31:58.672231 master-0 kubenswrapper[8731]: I1205 12:31:58.671768 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:31:58.672231 master-0 kubenswrapper[8731]: E1205 12:31:58.671780 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls podName:38941513-e968-45f1-9cb2-b63d40338f36 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:14.671764459 +0000 UTC m=+32.975748656 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-sxxpq" (UID: "38941513-e968-45f1-9cb2-b63d40338f36") : secret "image-registry-operator-tls" not found Dec 05 12:31:58.672231 master-0 kubenswrapper[8731]: E1205 12:31:58.671866 8731 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:58.672231 master-0 kubenswrapper[8731]: I1205 12:31:58.671899 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:58.672231 master-0 kubenswrapper[8731]: E1205 12:31:58.671907 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls podName:5f0c6889-0739-48a3-99cd-6db9d1f83242 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:14.671895122 +0000 UTC m=+32.975879329 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls") pod "dns-operator-7c56cf9b74-z9g7c" (UID: "5f0c6889-0739-48a3-99cd-6db9d1f83242") : secret "metrics-tls" not found Dec 05 12:31:58.672231 master-0 kubenswrapper[8731]: E1205 12:31:58.671972 8731 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:58.672231 master-0 kubenswrapper[8731]: E1205 12:31:58.672011 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:14.671999645 +0000 UTC m=+32.975983852 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "performance-addon-operator-webhook-cert" not found Dec 05 12:31:58.672231 master-0 kubenswrapper[8731]: I1205 12:31:58.672059 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:31:58.672547 master-0 kubenswrapper[8731]: E1205 12:31:58.672286 8731 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:31:58.672547 master-0 kubenswrapper[8731]: I1205 12:31:58.672295 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:31:58.672547 master-0 kubenswrapper[8731]: E1205 12:31:58.672370 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls podName:a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:14.672343295 +0000 UTC m=+32.976327472 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls") pod "ingress-operator-8649c48786-7xrk6" (UID: "a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7") : secret "metrics-tls" not found Dec 05 12:31:58.672547 master-0 kubenswrapper[8731]: E1205 12:31:58.672396 8731 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 05 12:31:58.672547 master-0 kubenswrapper[8731]: I1205 12:31:58.672417 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:31:58.672547 master-0 kubenswrapper[8731]: E1205 12:31:58.672433 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:14.672422167 +0000 UTC m=+32.976406594 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "node-tuning-operator-tls" not found Dec 05 12:31:58.672547 master-0 kubenswrapper[8731]: E1205 12:31:58.672503 8731 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:58.672547 master-0 kubenswrapper[8731]: I1205 12:31:58.672515 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:31:58.672547 master-0 kubenswrapper[8731]: E1205 12:31:58.672534 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls podName:58187662-b502-4d90-95ce-2aa91a81d256 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:14.672524479 +0000 UTC m=+32.976508876 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-lgc7z" (UID: "58187662-b502-4d90-95ce-2aa91a81d256") : secret "cluster-monitoring-operator-tls" not found Dec 05 12:31:58.672868 master-0 kubenswrapper[8731]: E1205 12:31:58.672633 8731 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 05 12:31:58.672868 master-0 kubenswrapper[8731]: E1205 12:31:58.672649 8731 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 05 12:31:58.672868 master-0 kubenswrapper[8731]: E1205 12:31:58.672702 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert podName:29812c4b-48ac-488c-863c-1d52e39ea2ae nodeName:}" failed. No retries permitted until 2025-12-05 12:32:14.672688274 +0000 UTC m=+32.976672441 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2chqh" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae") : secret "cluster-version-operator-serving-cert" not found Dec 05 12:31:58.672868 master-0 kubenswrapper[8731]: E1205 12:31:58.672718 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert podName:0dda6d9b-cb3a-413a-85af-ef08f15ea42e nodeName:}" failed. No retries permitted until 2025-12-05 12:32:14.672710054 +0000 UTC m=+32.976694481 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-9vfxw" (UID: "0dda6d9b-cb3a-413a-85af-ef08f15ea42e") : secret "package-server-manager-serving-cert" not found Dec 05 12:31:58.976262 master-0 kubenswrapper[8731]: I1205 12:31:58.976131 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:58.976602 master-0 kubenswrapper[8731]: I1205 12:31:58.976297 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:31:58.976602 master-0 kubenswrapper[8731]: E1205 12:31:58.976329 8731 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 05 12:31:58.976602 master-0 kubenswrapper[8731]: E1205 12:31:58.976465 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca podName:96b47a46-dd09-41f4-83e3-f2548e64915b nodeName:}" failed. No retries permitted until 2025-12-05 12:32:06.976434584 +0000 UTC m=+25.280418761 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca") pod "controller-manager-6c78fb97bf-4vsqw" (UID: "96b47a46-dd09-41f4-83e3-f2548e64915b") : configmap "client-ca" not found Dec 05 12:31:58.976602 master-0 kubenswrapper[8731]: E1205 12:31:58.976570 8731 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:31:58.976749 master-0 kubenswrapper[8731]: E1205 12:31:58.976691 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert podName:96b47a46-dd09-41f4-83e3-f2548e64915b nodeName:}" failed. No retries permitted until 2025-12-05 12:32:06.97666826 +0000 UTC m=+25.280652417 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert") pod "controller-manager-6c78fb97bf-4vsqw" (UID: "96b47a46-dd09-41f4-83e3-f2548e64915b") : secret "serving-cert" not found Dec 05 12:32:05.560217 master-0 kubenswrapper[8731]: I1205 12:32:05.559626 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:32:05.560217 master-0 kubenswrapper[8731]: I1205 12:32:05.560227 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca\") pod \"route-controller-manager-6fcbfdfc87-5fw4q\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:32:05.561586 master-0 kubenswrapper[8731]: E1205 12:32:05.559892 8731 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:32:05.561586 master-0 kubenswrapper[8731]: E1205 12:32:05.560416 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert podName:d6ab6d58-0346-4c64-908a-ef06ebf2e1c5 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:21.560378324 +0000 UTC m=+39.864362531 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert") pod "route-controller-manager-6fcbfdfc87-5fw4q" (UID: "d6ab6d58-0346-4c64-908a-ef06ebf2e1c5") : secret "serving-cert" not found Dec 05 12:32:05.561586 master-0 kubenswrapper[8731]: E1205 12:32:05.560442 8731 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 05 12:32:05.561586 master-0 kubenswrapper[8731]: E1205 12:32:05.560502 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca podName:d6ab6d58-0346-4c64-908a-ef06ebf2e1c5 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:21.560482906 +0000 UTC m=+39.864467113 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca") pod "route-controller-manager-6fcbfdfc87-5fw4q" (UID: "d6ab6d58-0346-4c64-908a-ef06ebf2e1c5") : configmap "client-ca" not found Dec 05 12:32:06.980072 master-0 kubenswrapper[8731]: I1205 12:32:06.979873 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:32:06.980072 master-0 kubenswrapper[8731]: E1205 12:32:06.980086 8731 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 05 12:32:06.980072 master-0 kubenswrapper[8731]: I1205 12:32:06.980135 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert\") pod \"controller-manager-6c78fb97bf-4vsqw\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:32:06.981146 master-0 kubenswrapper[8731]: E1205 12:32:06.980283 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca podName:96b47a46-dd09-41f4-83e3-f2548e64915b nodeName:}" failed. No retries permitted until 2025-12-05 12:32:22.980238565 +0000 UTC m=+41.284222772 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca") pod "controller-manager-6c78fb97bf-4vsqw" (UID: "96b47a46-dd09-41f4-83e3-f2548e64915b") : configmap "client-ca" not found Dec 05 12:32:06.981146 master-0 kubenswrapper[8731]: E1205 12:32:06.980400 8731 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:32:06.981146 master-0 kubenswrapper[8731]: E1205 12:32:06.980496 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert podName:96b47a46-dd09-41f4-83e3-f2548e64915b nodeName:}" failed. No retries permitted until 2025-12-05 12:32:22.980473471 +0000 UTC m=+41.284457678 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert") pod "controller-manager-6c78fb97bf-4vsqw" (UID: "96b47a46-dd09-41f4-83e3-f2548e64915b") : secret "serving-cert" not found Dec 05 12:32:09.915749 master-0 kubenswrapper[8731]: I1205 12:32:09.914301 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-8645f66975-h6htr"] Dec 05 12:32:09.915749 master-0 kubenswrapper[8731]: I1205 12:32:09.915146 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:09.922388 master-0 kubenswrapper[8731]: I1205 12:32:09.922313 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0" Dec 05 12:32:09.922581 master-0 kubenswrapper[8731]: I1205 12:32:09.922462 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 12:32:09.922581 master-0 kubenswrapper[8731]: I1205 12:32:09.922523 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 12:32:09.922756 master-0 kubenswrapper[8731]: I1205 12:32:09.922718 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 12:32:09.923971 master-0 kubenswrapper[8731]: I1205 12:32:09.923938 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 12:32:09.934312 master-0 kubenswrapper[8731]: I1205 12:32:09.931965 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 12:32:09.934312 master-0 kubenswrapper[8731]: I1205 12:32:09.931807 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0" Dec 05 12:32:09.934312 master-0 kubenswrapper[8731]: I1205 12:32:09.932163 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 12:32:09.935743 master-0 kubenswrapper[8731]: I1205 12:32:09.934702 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 12:32:09.948214 master-0 kubenswrapper[8731]: I1205 12:32:09.943303 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 12:32:09.959079 master-0 kubenswrapper[8731]: I1205 12:32:09.956113 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-8645f66975-h6htr"] Dec 05 12:32:10.005523 master-0 kubenswrapper[8731]: I1205 12:32:10.004748 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2"] Dec 05 12:32:10.005774 master-0 kubenswrapper[8731]: I1205 12:32:10.005650 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:10.008348 master-0 kubenswrapper[8731]: W1205 12:32:10.008295 8731 reflector.go:561] object-"openshift-catalogd"/"catalogserver-cert": failed to list *v1.Secret: secrets "catalogserver-cert" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-catalogd": no relationship found between node 'master-0' and this object Dec 05 12:32:10.008483 master-0 kubenswrapper[8731]: E1205 12:32:10.008356 8731 reflector.go:158] "Unhandled Error" err="object-\"openshift-catalogd\"/\"catalogserver-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"catalogserver-cert\" is forbidden: User \"system:node:master-0\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-catalogd\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Dec 05 12:32:10.008483 master-0 kubenswrapper[8731]: W1205 12:32:10.008390 8731 reflector.go:561] object-"openshift-catalogd"/"catalogd-trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "catalogd-trusted-ca-bundle" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-catalogd": no relationship found between node 'master-0' and this object Dec 05 12:32:10.008483 master-0 kubenswrapper[8731]: E1205 12:32:10.008456 8731 reflector.go:158] "Unhandled Error" err="object-\"openshift-catalogd\"/\"catalogd-trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"catalogd-trusted-ca-bundle\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-catalogd\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Dec 05 12:32:10.008623 master-0 kubenswrapper[8731]: W1205 12:32:10.008523 8731 reflector.go:561] object-"openshift-catalogd"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-catalogd": no relationship found between node 'master-0' and this object Dec 05 12:32:10.008672 master-0 kubenswrapper[8731]: E1205 12:32:10.008610 8731 reflector.go:158] "Unhandled Error" err="object-\"openshift-catalogd\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-catalogd\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Dec 05 12:32:10.009451 master-0 kubenswrapper[8731]: I1205 12:32:10.009415 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Dec 05 12:32:10.028448 master-0 kubenswrapper[8731]: I1205 12:32:10.028221 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2"] Dec 05 12:32:10.045322 master-0 kubenswrapper[8731]: I1205 12:32:10.045246 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-etcd-serving-ca\") pod \"apiserver-8645f66975-h6htr\" (UID: 
\"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.045322 master-0 kubenswrapper[8731]: I1205 12:32:10.045320 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw9g5\" (UniqueName: \"kubernetes.io/projected/67d4056d-507d-44a9-b238-300913e1b957-kube-api-access-sw9g5\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.045620 master-0 kubenswrapper[8731]: I1205 12:32:10.045561 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-audit\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.045994 master-0 kubenswrapper[8731]: I1205 12:32:10.045944 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-encryption-config\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.046117 master-0 kubenswrapper[8731]: I1205 12:32:10.046087 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-trusted-ca-bundle\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.046168 master-0 kubenswrapper[8731]: I1205 12:32:10.046123 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-image-import-ca\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.046168 master-0 kubenswrapper[8731]: I1205 12:32:10.046153 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/67d4056d-507d-44a9-b238-300913e1b957-node-pullsecrets\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.046283 master-0 kubenswrapper[8731]: I1205 12:32:10.046217 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-config\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.046325 master-0 kubenswrapper[8731]: I1205 12:32:10.046293 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-serving-cert\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.046363 master-0 kubenswrapper[8731]: I1205 
12:32:10.046334 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-etcd-client\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.049844 master-0 kubenswrapper[8731]: I1205 12:32:10.046401 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67d4056d-507d-44a9-b238-300913e1b957-audit-dir\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.119217 master-0 kubenswrapper[8731]: I1205 12:32:10.118915 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw"] Dec 05 12:32:10.119495 master-0 kubenswrapper[8731]: E1205 12:32:10.119285 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" podUID="96b47a46-dd09-41f4-83e3-f2548e64915b" Dec 05 12:32:10.147215 master-0 kubenswrapper[8731]: I1205 12:32:10.146264 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q"] Dec 05 12:32:10.147215 master-0 kubenswrapper[8731]: E1205 12:32:10.146635 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" podUID="d6ab6d58-0346-4c64-908a-ef06ebf2e1c5" Dec 05 12:32:10.147215 master-0 kubenswrapper[8731]: I1205 12:32:10.146916 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-trusted-ca-bundle\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.147215 master-0 kubenswrapper[8731]: I1205 12:32:10.146940 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-image-import-ca\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.147215 master-0 kubenswrapper[8731]: I1205 12:32:10.146967 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/67d4056d-507d-44a9-b238-300913e1b957-node-pullsecrets\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.147215 master-0 kubenswrapper[8731]: I1205 12:32:10.146997 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: 
\"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:10.147215 master-0 kubenswrapper[8731]: I1205 12:32:10.147029 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-config\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.147215 master-0 kubenswrapper[8731]: I1205 12:32:10.147048 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/3b741029-0eb5-409b-b7f1-95e8385dc400-etc-docker\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:10.147215 master-0 kubenswrapper[8731]: I1205 12:32:10.147069 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g7mj\" (UniqueName: \"kubernetes.io/projected/3b741029-0eb5-409b-b7f1-95e8385dc400-kube-api-access-5g7mj\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:10.147215 master-0 kubenswrapper[8731]: I1205 12:32:10.147126 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-serving-cert\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.147215 master-0 kubenswrapper[8731]: I1205 12:32:10.147147 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3b741029-0eb5-409b-b7f1-95e8385dc400-cache\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:10.147215 master-0 kubenswrapper[8731]: I1205 12:32:10.147175 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-etcd-client\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.147215 master-0 kubenswrapper[8731]: I1205 12:32:10.147218 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/3b741029-0eb5-409b-b7f1-95e8385dc400-etc-containers\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:10.147947 master-0 kubenswrapper[8731]: I1205 12:32:10.147386 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67d4056d-507d-44a9-b238-300913e1b957-audit-dir\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.147947 
master-0 kubenswrapper[8731]: I1205 12:32:10.147433 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67d4056d-507d-44a9-b238-300913e1b957-audit-dir\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.151208 master-0 kubenswrapper[8731]: I1205 12:32:10.148289 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-config\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.151208 master-0 kubenswrapper[8731]: I1205 12:32:10.148955 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/67d4056d-507d-44a9-b238-300913e1b957-node-pullsecrets\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.151208 master-0 kubenswrapper[8731]: I1205 12:32:10.149107 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-etcd-serving-ca\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.151208 master-0 kubenswrapper[8731]: E1205 12:32:10.149121 8731 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 05 12:32:10.151208 master-0 kubenswrapper[8731]: I1205 12:32:10.149326 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw9g5\" (UniqueName: \"kubernetes.io/projected/67d4056d-507d-44a9-b238-300913e1b957-kube-api-access-sw9g5\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.151208 master-0 kubenswrapper[8731]: E1205 12:32:10.149393 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-serving-cert podName:67d4056d-507d-44a9-b238-300913e1b957 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:10.649357792 +0000 UTC m=+28.953342159 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-serving-cert") pod "apiserver-8645f66975-h6htr" (UID: "67d4056d-507d-44a9-b238-300913e1b957") : secret "serving-cert" not found Dec 05 12:32:10.151208 master-0 kubenswrapper[8731]: I1205 12:32:10.149903 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-audit\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.151208 master-0 kubenswrapper[8731]: I1205 12:32:10.149946 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/3b741029-0eb5-409b-b7f1-95e8385dc400-ca-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:10.151208 master-0 kubenswrapper[8731]: I1205 12:32:10.150003 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-encryption-config\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.157261 master-0 kubenswrapper[8731]: I1205 12:32:10.153007 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-image-import-ca\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.157261 master-0 kubenswrapper[8731]: E1205 12:32:10.153117 8731 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Dec 05 12:32:10.157261 master-0 kubenswrapper[8731]: E1205 12:32:10.153338 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-audit podName:67d4056d-507d-44a9-b238-300913e1b957 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:10.653315315 +0000 UTC m=+28.957299482 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-audit") pod "apiserver-8645f66975-h6htr" (UID: "67d4056d-507d-44a9-b238-300913e1b957") : configmap "audit-0" not found Dec 05 12:32:10.157261 master-0 kubenswrapper[8731]: I1205 12:32:10.153858 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-trusted-ca-bundle\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.161246 master-0 kubenswrapper[8731]: I1205 12:32:10.157911 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-etcd-serving-ca\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.161246 master-0 kubenswrapper[8731]: I1205 12:32:10.158382 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-etcd-client\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.170206 master-0 kubenswrapper[8731]: I1205 12:32:10.166813 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-encryption-config\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.191796 master-0 kubenswrapper[8731]: I1205 12:32:10.191217 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw9g5\" (UniqueName: \"kubernetes.io/projected/67d4056d-507d-44a9-b238-300913e1b957-kube-api-access-sw9g5\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.200215 master-0 kubenswrapper[8731]: I1205 12:32:10.199723 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" event={"ID":"f3792522-fec6-4022-90ac-0b8467fcd625","Type":"ContainerStarted","Data":"eb12d89ac382a5bb5bdc3b8dbfd70aaf80443c6890bbd6d374803fc81c9ff457"} Dec 05 12:32:10.211073 master-0 kubenswrapper[8731]: I1205 12:32:10.208926 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:32:10.211073 master-0 kubenswrapper[8731]: I1205 12:32:10.209432 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" event={"ID":"594aaded-5615-4bed-87ee-6173059a73be","Type":"ContainerStarted","Data":"b351d2f70dc6ca77a15619a3104c4ce47b9bc5e14772befd2755648b695c45dd"} Dec 05 12:32:10.211073 master-0 kubenswrapper[8731]: I1205 12:32:10.210121 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:32:10.232506 master-0 kubenswrapper[8731]: I1205 12:32:10.231313 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:32:10.240134 master-0 kubenswrapper[8731]: I1205 12:32:10.240083 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:32:10.253222 master-0 kubenswrapper[8731]: I1205 12:32:10.250505 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-config\") pod \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " Dec 05 12:32:10.253222 master-0 kubenswrapper[8731]: I1205 12:32:10.250570 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-config\") pod \"96b47a46-dd09-41f4-83e3-f2548e64915b\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " Dec 05 12:32:10.253222 master-0 kubenswrapper[8731]: I1205 12:32:10.250607 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f9s4\" (UniqueName: \"kubernetes.io/projected/96b47a46-dd09-41f4-83e3-f2548e64915b-kube-api-access-6f9s4\") pod \"96b47a46-dd09-41f4-83e3-f2548e64915b\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " Dec 05 12:32:10.253222 master-0 kubenswrapper[8731]: I1205 12:32:10.250655 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-proxy-ca-bundles\") pod \"96b47a46-dd09-41f4-83e3-f2548e64915b\" (UID: \"96b47a46-dd09-41f4-83e3-f2548e64915b\") " Dec 05 12:32:10.253222 master-0 kubenswrapper[8731]: I1205 12:32:10.250687 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7hvk\" (UniqueName: \"kubernetes.io/projected/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-kube-api-access-w7hvk\") pod \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\" (UID: \"d6ab6d58-0346-4c64-908a-ef06ebf2e1c5\") " Dec 05 12:32:10.253222 master-0 kubenswrapper[8731]: I1205 12:32:10.250892 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/3b741029-0eb5-409b-b7f1-95e8385dc400-ca-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:10.253222 master-0 kubenswrapper[8731]: I1205 12:32:10.251014 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:10.253222 master-0 kubenswrapper[8731]: I1205 12:32:10.251056 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/3b741029-0eb5-409b-b7f1-95e8385dc400-etc-docker\") pod 
\"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:10.253222 master-0 kubenswrapper[8731]: I1205 12:32:10.251080 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g7mj\" (UniqueName: \"kubernetes.io/projected/3b741029-0eb5-409b-b7f1-95e8385dc400-kube-api-access-5g7mj\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:10.253222 master-0 kubenswrapper[8731]: I1205 12:32:10.251166 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3b741029-0eb5-409b-b7f1-95e8385dc400-cache\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:10.253222 master-0 kubenswrapper[8731]: I1205 12:32:10.251223 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/3b741029-0eb5-409b-b7f1-95e8385dc400-etc-containers\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:10.253222 master-0 kubenswrapper[8731]: I1205 12:32:10.251760 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-config" (OuterVolumeSpecName: "config") pod "d6ab6d58-0346-4c64-908a-ef06ebf2e1c5" (UID: "d6ab6d58-0346-4c64-908a-ef06ebf2e1c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:32:10.253222 master-0 kubenswrapper[8731]: I1205 12:32:10.252057 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-config" (OuterVolumeSpecName: "config") pod "96b47a46-dd09-41f4-83e3-f2548e64915b" (UID: "96b47a46-dd09-41f4-83e3-f2548e64915b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:32:10.253222 master-0 kubenswrapper[8731]: I1205 12:32:10.253002 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/3b741029-0eb5-409b-b7f1-95e8385dc400-etc-containers\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:10.253222 master-0 kubenswrapper[8731]: I1205 12:32:10.253250 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/3b741029-0eb5-409b-b7f1-95e8385dc400-etc-docker\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:10.254100 master-0 kubenswrapper[8731]: I1205 12:32:10.253673 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3b741029-0eb5-409b-b7f1-95e8385dc400-cache\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:10.254100 master-0 kubenswrapper[8731]: I1205 12:32:10.253921 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "96b47a46-dd09-41f4-83e3-f2548e64915b" (UID: "96b47a46-dd09-41f4-83e3-f2548e64915b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:32:10.258501 master-0 kubenswrapper[8731]: I1205 12:32:10.256975 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-kube-api-access-w7hvk" (OuterVolumeSpecName: "kube-api-access-w7hvk") pod "d6ab6d58-0346-4c64-908a-ef06ebf2e1c5" (UID: "d6ab6d58-0346-4c64-908a-ef06ebf2e1c5"). InnerVolumeSpecName "kube-api-access-w7hvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:32:10.258501 master-0 kubenswrapper[8731]: I1205 12:32:10.257320 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b47a46-dd09-41f4-83e3-f2548e64915b-kube-api-access-6f9s4" (OuterVolumeSpecName: "kube-api-access-6f9s4") pod "96b47a46-dd09-41f4-83e3-f2548e64915b" (UID: "96b47a46-dd09-41f4-83e3-f2548e64915b"). InnerVolumeSpecName "kube-api-access-6f9s4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:32:10.353266 master-0 kubenswrapper[8731]: I1205 12:32:10.353068 8731 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:10.353266 master-0 kubenswrapper[8731]: I1205 12:32:10.353115 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7hvk\" (UniqueName: \"kubernetes.io/projected/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-kube-api-access-w7hvk\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:10.353266 master-0 kubenswrapper[8731]: I1205 12:32:10.353131 8731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:10.353266 master-0 kubenswrapper[8731]: I1205 12:32:10.353141 8731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:10.353266 master-0 kubenswrapper[8731]: I1205 12:32:10.353151 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f9s4\" (UniqueName: \"kubernetes.io/projected/96b47a46-dd09-41f4-83e3-f2548e64915b-kube-api-access-6f9s4\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:10.656953 master-0 kubenswrapper[8731]: I1205 12:32:10.656782 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-serving-cert\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.657217 master-0 kubenswrapper[8731]: E1205 12:32:10.657059 8731 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 05 12:32:10.657217 master-0 kubenswrapper[8731]: E1205 12:32:10.657203 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-serving-cert podName:67d4056d-507d-44a9-b238-300913e1b957 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:11.657160532 +0000 UTC m=+29.961144699 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-serving-cert") pod "apiserver-8645f66975-h6htr" (UID: "67d4056d-507d-44a9-b238-300913e1b957") : secret "serving-cert" not found Dec 05 12:32:10.657292 master-0 kubenswrapper[8731]: I1205 12:32:10.657247 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-audit\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:10.657495 master-0 kubenswrapper[8731]: E1205 12:32:10.657432 8731 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Dec 05 12:32:10.657608 master-0 kubenswrapper[8731]: E1205 12:32:10.657574 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-audit podName:67d4056d-507d-44a9-b238-300913e1b957 nodeName:}" failed. 
No retries permitted until 2025-12-05 12:32:11.657542493 +0000 UTC m=+29.961526800 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-audit") pod "apiserver-8645f66975-h6htr" (UID: "67d4056d-507d-44a9-b238-300913e1b957") : configmap "audit-0" not found Dec 05 12:32:10.885273 master-0 kubenswrapper[8731]: I1205 12:32:10.884738 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Dec 05 12:32:10.893503 master-0 kubenswrapper[8731]: E1205 12:32:10.893445 8731 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Dec 05 12:32:10.893801 master-0 kubenswrapper[8731]: E1205 12:32:10.893564 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs podName:3b741029-0eb5-409b-b7f1-95e8385dc400 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:11.393534332 +0000 UTC m=+29.697518499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs") pod "catalogd-controller-manager-7cc89f4c4c-n28z2" (UID: "3b741029-0eb5-409b-b7f1-95e8385dc400") : secret "catalogserver-cert" not found Dec 05 12:32:11.051050 master-0 kubenswrapper[8731]: I1205 12:32:11.042398 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Dec 05 12:32:11.118960 master-0 kubenswrapper[8731]: I1205 12:32:11.118901 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Dec 05 12:32:11.124844 master-0 kubenswrapper[8731]: I1205 12:32:11.124805 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g7mj\" (UniqueName: \"kubernetes.io/projected/3b741029-0eb5-409b-b7f1-95e8385dc400-kube-api-access-5g7mj\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:11.128812 master-0 kubenswrapper[8731]: I1205 12:32:11.128742 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/3b741029-0eb5-409b-b7f1-95e8385dc400-ca-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:11.202863 master-0 kubenswrapper[8731]: I1205 12:32:11.202788 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 05 12:32:11.203815 master-0 kubenswrapper[8731]: I1205 12:32:11.203557 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Dec 05 12:32:11.208015 master-0 kubenswrapper[8731]: I1205 12:32:11.206339 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Dec 05 12:32:11.215091 master-0 kubenswrapper[8731]: I1205 12:32:11.214864 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 05 12:32:11.215498 master-0 kubenswrapper[8731]: I1205 12:32:11.215441 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw" Dec 05 12:32:11.215573 master-0 kubenswrapper[8731]: I1205 12:32:11.215525 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q" Dec 05 12:32:11.266365 master-0 kubenswrapper[8731]: I1205 12:32:11.264167 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26b7da93-bb3a-48c9-a2dc-d91c73db5578-kube-api-access\") pod \"installer-1-master-0\" (UID: \"26b7da93-bb3a-48c9-a2dc-d91c73db5578\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 05 12:32:11.266365 master-0 kubenswrapper[8731]: I1205 12:32:11.264278 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/26b7da93-bb3a-48c9-a2dc-d91c73db5578-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"26b7da93-bb3a-48c9-a2dc-d91c73db5578\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 05 12:32:11.266365 master-0 kubenswrapper[8731]: I1205 12:32:11.264361 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/26b7da93-bb3a-48c9-a2dc-d91c73db5578-var-lock\") pod \"installer-1-master-0\" (UID: \"26b7da93-bb3a-48c9-a2dc-d91c73db5578\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 05 12:32:11.267787 master-0 kubenswrapper[8731]: I1205 12:32:11.266790 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-58c47b4bcf-j2srw"] Dec 05 12:32:11.267787 master-0 kubenswrapper[8731]: I1205 12:32:11.267379 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw"] Dec 05 12:32:11.267787 master-0 kubenswrapper[8731]: I1205 12:32:11.267499 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:11.269506 master-0 kubenswrapper[8731]: I1205 12:32:11.269487 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 12:32:11.270519 master-0 kubenswrapper[8731]: I1205 12:32:11.270502 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 12:32:11.270697 master-0 kubenswrapper[8731]: I1205 12:32:11.270659 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 12:32:11.272158 master-0 kubenswrapper[8731]: I1205 12:32:11.272113 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c78fb97bf-4vsqw"] Dec 05 12:32:11.274566 master-0 kubenswrapper[8731]: I1205 12:32:11.274540 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 12:32:11.276819 master-0 kubenswrapper[8731]: I1205 12:32:11.276769 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 12:32:11.277019 master-0 kubenswrapper[8731]: I1205 12:32:11.276983 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58c47b4bcf-j2srw"] Dec 05 12:32:11.279037 master-0 kubenswrapper[8731]: I1205 12:32:11.278997 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 12:32:11.315929 master-0 kubenswrapper[8731]: I1205 12:32:11.315729 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q"] Dec 05 12:32:11.318334 master-0 kubenswrapper[8731]: I1205 12:32:11.318270 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fcbfdfc87-5fw4q"] Dec 05 12:32:11.366249 master-0 kubenswrapper[8731]: I1205 12:32:11.366154 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-config\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:11.366485 master-0 kubenswrapper[8731]: I1205 12:32:11.366297 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-proxy-ca-bundles\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:11.366485 master-0 kubenswrapper[8731]: I1205 12:32:11.366372 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26b7da93-bb3a-48c9-a2dc-d91c73db5578-kube-api-access\") pod \"installer-1-master-0\" (UID: \"26b7da93-bb3a-48c9-a2dc-d91c73db5578\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 05 12:32:11.366485 master-0 kubenswrapper[8731]: I1205 12:32:11.366414 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/26b7da93-bb3a-48c9-a2dc-d91c73db5578-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"26b7da93-bb3a-48c9-a2dc-d91c73db5578\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 05 12:32:11.366485 master-0 kubenswrapper[8731]: I1205 12:32:11.366460 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-client-ca\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:11.366667 master-0 kubenswrapper[8731]: I1205 12:32:11.366533 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:11.366667 master-0 kubenswrapper[8731]: I1205 12:32:11.366566 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85qjz\" (UniqueName: \"kubernetes.io/projected/aa67674b-53bd-45d9-a217-915ed52ff870-kube-api-access-85qjz\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:11.366667 master-0 kubenswrapper[8731]: I1205 12:32:11.366634 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/26b7da93-bb3a-48c9-a2dc-d91c73db5578-var-lock\") pod \"installer-1-master-0\" (UID: \"26b7da93-bb3a-48c9-a2dc-d91c73db5578\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 05 12:32:11.366793 master-0 kubenswrapper[8731]: I1205 12:32:11.366740 8731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96b47a46-dd09-41f4-83e3-f2548e64915b-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:11.366793 master-0 kubenswrapper[8731]: I1205 12:32:11.366760 8731 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96b47a46-dd09-41f4-83e3-f2548e64915b-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:11.366793 master-0 kubenswrapper[8731]: I1205 12:32:11.366774 8731 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:11.366915 master-0 kubenswrapper[8731]: I1205 12:32:11.366788 8731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:11.366915 master-0 kubenswrapper[8731]: I1205 12:32:11.366870 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/26b7da93-bb3a-48c9-a2dc-d91c73db5578-var-lock\") pod \"installer-1-master-0\" (UID: \"26b7da93-bb3a-48c9-a2dc-d91c73db5578\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 05 12:32:11.367124 master-0 kubenswrapper[8731]: I1205 12:32:11.367054 8731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/26b7da93-bb3a-48c9-a2dc-d91c73db5578-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"26b7da93-bb3a-48c9-a2dc-d91c73db5578\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 05 12:32:11.387109 master-0 kubenswrapper[8731]: I1205 12:32:11.387021 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26b7da93-bb3a-48c9-a2dc-d91c73db5578-kube-api-access\") pod \"installer-1-master-0\" (UID: \"26b7da93-bb3a-48c9-a2dc-d91c73db5578\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 05 12:32:11.467988 master-0 kubenswrapper[8731]: I1205 12:32:11.467889 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:11.468289 master-0 kubenswrapper[8731]: I1205 12:32:11.468052 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85qjz\" (UniqueName: \"kubernetes.io/projected/aa67674b-53bd-45d9-a217-915ed52ff870-kube-api-access-85qjz\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:11.468289 master-0 kubenswrapper[8731]: E1205 12:32:11.468106 8731 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Dec 05 12:32:11.468289 master-0 kubenswrapper[8731]: E1205 12:32:11.468215 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs podName:3b741029-0eb5-409b-b7f1-95e8385dc400 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:12.468172829 +0000 UTC m=+30.772156996 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs") pod "catalogd-controller-manager-7cc89f4c4c-n28z2" (UID: "3b741029-0eb5-409b-b7f1-95e8385dc400") : secret "catalogserver-cert" not found Dec 05 12:32:11.468289 master-0 kubenswrapper[8731]: I1205 12:32:11.468114 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:11.468589 master-0 kubenswrapper[8731]: E1205 12:32:11.468531 8731 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:32:11.468717 master-0 kubenswrapper[8731]: E1205 12:32:11.468676 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert podName:aa67674b-53bd-45d9-a217-915ed52ff870 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:11.968635981 +0000 UTC m=+30.272620278 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert") pod "controller-manager-58c47b4bcf-j2srw" (UID: "aa67674b-53bd-45d9-a217-915ed52ff870") : secret "serving-cert" not found Dec 05 12:32:11.468717 master-0 kubenswrapper[8731]: I1205 12:32:11.468664 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-config\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:11.468915 master-0 kubenswrapper[8731]: I1205 12:32:11.468861 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-proxy-ca-bundles\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:11.469019 master-0 kubenswrapper[8731]: I1205 12:32:11.468986 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-client-ca\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:11.470330 master-0 kubenswrapper[8731]: I1205 12:32:11.470283 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-config\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:11.470481 master-0 kubenswrapper[8731]: I1205 12:32:11.470427 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-client-ca\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:11.470763 master-0 kubenswrapper[8731]: I1205 12:32:11.470667 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-proxy-ca-bundles\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:11.496409 master-0 kubenswrapper[8731]: I1205 12:32:11.496327 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85qjz\" (UniqueName: \"kubernetes.io/projected/aa67674b-53bd-45d9-a217-915ed52ff870-kube-api-access-85qjz\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:11.537856 master-0 kubenswrapper[8731]: I1205 12:32:11.537682 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Dec 05 12:32:11.671153 master-0 kubenswrapper[8731]: I1205 12:32:11.671024 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-audit\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:11.671455 master-0 kubenswrapper[8731]: E1205 12:32:11.671242 8731 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Dec 05 12:32:11.671455 master-0 kubenswrapper[8731]: E1205 12:32:11.671353 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-audit podName:67d4056d-507d-44a9-b238-300913e1b957 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:13.671326616 +0000 UTC m=+31.975310783 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-audit") pod "apiserver-8645f66975-h6htr" (UID: "67d4056d-507d-44a9-b238-300913e1b957") : configmap "audit-0" not found Dec 05 12:32:11.671589 master-0 kubenswrapper[8731]: I1205 12:32:11.671550 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-serving-cert\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:11.671732 master-0 kubenswrapper[8731]: E1205 12:32:11.671706 8731 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 05 12:32:11.671803 master-0 kubenswrapper[8731]: E1205 12:32:11.671788 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-serving-cert podName:67d4056d-507d-44a9-b238-300913e1b957 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:13.671778727 +0000 UTC m=+31.975762894 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-serving-cert") pod "apiserver-8645f66975-h6htr" (UID: "67d4056d-507d-44a9-b238-300913e1b957") : secret "serving-cert" not found Dec 05 12:32:11.736265 master-0 kubenswrapper[8731]: I1205 12:32:11.736178 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 05 12:32:11.952471 master-0 kubenswrapper[8731]: I1205 12:32:11.952063 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b47a46-dd09-41f4-83e3-f2548e64915b" path="/var/lib/kubelet/pods/96b47a46-dd09-41f4-83e3-f2548e64915b/volumes" Dec 05 12:32:11.952841 master-0 kubenswrapper[8731]: I1205 12:32:11.952813 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6ab6d58-0346-4c64-908a-ef06ebf2e1c5" path="/var/lib/kubelet/pods/d6ab6d58-0346-4c64-908a-ef06ebf2e1c5/volumes" Dec 05 12:32:11.974854 master-0 kubenswrapper[8731]: I1205 12:32:11.974770 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:11.975151 master-0 kubenswrapper[8731]: E1205 12:32:11.974993 8731 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:32:11.975151 master-0 kubenswrapper[8731]: E1205 12:32:11.975091 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert podName:aa67674b-53bd-45d9-a217-915ed52ff870 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:12.975068325 +0000 UTC m=+31.279052492 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert") pod "controller-manager-58c47b4bcf-j2srw" (UID: "aa67674b-53bd-45d9-a217-915ed52ff870") : secret "serving-cert" not found Dec 05 12:32:12.221601 master-0 kubenswrapper[8731]: I1205 12:32:12.221518 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"26b7da93-bb3a-48c9-a2dc-d91c73db5578","Type":"ContainerStarted","Data":"a96e1efb63148942b97853dca9df4ffab2547d1232ce0445c574be5258ea9b25"} Dec 05 12:32:12.221601 master-0 kubenswrapper[8731]: I1205 12:32:12.221580 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"26b7da93-bb3a-48c9-a2dc-d91c73db5578","Type":"ContainerStarted","Data":"30ed2a2a29bc0515d570bdea00d443e316a56c4e00d683ca90b32cc9841c20a6"} Dec 05 12:32:12.239492 master-0 kubenswrapper[8731]: I1205 12:32:12.239392 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-1-master-0" podStartSLOduration=1.239369628 podStartE2EDuration="1.239369628s" podCreationTimestamp="2025-12-05 12:32:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:32:12.23905746 +0000 UTC m=+30.543041627" watchObservedRunningTime="2025-12-05 12:32:12.239369628 +0000 UTC m=+30.543353795" Dec 05 12:32:12.480323 master-0 kubenswrapper[8731]: I1205 12:32:12.480127 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:12.480550 master-0 kubenswrapper[8731]: E1205 12:32:12.480398 8731 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Dec 05 12:32:12.480550 master-0 kubenswrapper[8731]: E1205 12:32:12.480514 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs podName:3b741029-0eb5-409b-b7f1-95e8385dc400 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:14.480486813 +0000 UTC m=+32.784470970 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs") pod "catalogd-controller-manager-7cc89f4c4c-n28z2" (UID: "3b741029-0eb5-409b-b7f1-95e8385dc400") : secret "catalogserver-cert" not found Dec 05 12:32:12.514668 master-0 kubenswrapper[8731]: I1205 12:32:12.514577 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k"] Dec 05 12:32:12.515327 master-0 kubenswrapper[8731]: I1205 12:32:12.515262 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:12.519214 master-0 kubenswrapper[8731]: I1205 12:32:12.519142 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Dec 05 12:32:12.519484 master-0 kubenswrapper[8731]: I1205 12:32:12.519456 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Dec 05 12:32:12.519676 master-0 kubenswrapper[8731]: I1205 12:32:12.519649 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Dec 05 12:32:12.531068 master-0 kubenswrapper[8731]: I1205 12:32:12.530998 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k"] Dec 05 12:32:12.582049 master-0 kubenswrapper[8731]: I1205 12:32:12.581964 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/153fec1f-a10b-4c6c-a997-60fa80c13a86-etc-containers\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:12.582330 master-0 kubenswrapper[8731]: I1205 12:32:12.582095 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/153fec1f-a10b-4c6c-a997-60fa80c13a86-cache\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:12.582330 master-0 kubenswrapper[8731]: I1205 12:32:12.582152 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-ca-certs\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:12.582404 master-0 kubenswrapper[8731]: I1205 12:32:12.582338 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr2r9\" (UniqueName: \"kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-kube-api-access-dr2r9\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:12.582453 master-0 kubenswrapper[8731]: I1205 12:32:12.582433 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/153fec1f-a10b-4c6c-a997-60fa80c13a86-etc-docker\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:12.683853 master-0 kubenswrapper[8731]: I1205 12:32:12.683382 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-ca-certs\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:12.684168 master-0 kubenswrapper[8731]: I1205 12:32:12.683896 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr2r9\" (UniqueName: \"kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-kube-api-access-dr2r9\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:12.684168 master-0 kubenswrapper[8731]: I1205 12:32:12.683973 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/153fec1f-a10b-4c6c-a997-60fa80c13a86-etc-docker\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:12.684168 master-0 kubenswrapper[8731]: E1205 12:32:12.683670 8731 projected.go:288] Couldn't get configMap openshift-operator-controller/operator-controller-trusted-ca-bundle: configmap "operator-controller-trusted-ca-bundle" not found Dec 05 12:32:12.684168 master-0 kubenswrapper[8731]: I1205 12:32:12.684017 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/153fec1f-a10b-4c6c-a997-60fa80c13a86-etc-containers\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:12.684168 master-0 kubenswrapper[8731]: E1205 12:32:12.684047 8731 projected.go:194] Error preparing data for projected volume ca-certs for pod openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k: configmap "operator-controller-trusted-ca-bundle" not found Dec 05 12:32:12.684346 master-0 kubenswrapper[8731]: I1205 12:32:12.684228 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/153fec1f-a10b-4c6c-a997-60fa80c13a86-etc-containers\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:12.684346 master-0 kubenswrapper[8731]: I1205 12:32:12.684251 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/153fec1f-a10b-4c6c-a997-60fa80c13a86-etc-docker\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:12.684346 master-0 kubenswrapper[8731]: E1205 12:32:12.684292 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-ca-certs podName:153fec1f-a10b-4c6c-a997-60fa80c13a86 nodeName:}" failed. 
No retries permitted until 2025-12-05 12:32:13.184243346 +0000 UTC m=+31.488227513 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ca-certs" (UniqueName: "kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-ca-certs") pod "operator-controller-controller-manager-7cbd59c7f8-d9g7k" (UID: "153fec1f-a10b-4c6c-a997-60fa80c13a86") : configmap "operator-controller-trusted-ca-bundle" not found Dec 05 12:32:12.684445 master-0 kubenswrapper[8731]: I1205 12:32:12.684416 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/153fec1f-a10b-4c6c-a997-60fa80c13a86-cache\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:12.684885 master-0 kubenswrapper[8731]: I1205 12:32:12.684851 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/153fec1f-a10b-4c6c-a997-60fa80c13a86-cache\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:12.706715 master-0 kubenswrapper[8731]: I1205 12:32:12.706646 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr2r9\" (UniqueName: \"kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-kube-api-access-dr2r9\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:12.762766 master-0 kubenswrapper[8731]: I1205 12:32:12.762618 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-8645f66975-h6htr"] Dec 05 12:32:12.762988 master-0 kubenswrapper[8731]: E1205 12:32:12.762931 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-8645f66975-h6htr" podUID="67d4056d-507d-44a9-b238-300913e1b957" Dec 05 12:32:12.988511 master-0 kubenswrapper[8731]: I1205 12:32:12.988416 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:12.988824 master-0 kubenswrapper[8731]: E1205 12:32:12.988718 8731 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:32:12.988937 master-0 kubenswrapper[8731]: E1205 12:32:12.988842 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert podName:aa67674b-53bd-45d9-a217-915ed52ff870 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:14.988813247 +0000 UTC m=+33.292797414 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert") pod "controller-manager-58c47b4bcf-j2srw" (UID: "aa67674b-53bd-45d9-a217-915ed52ff870") : secret "serving-cert" not found Dec 05 12:32:13.192577 master-0 kubenswrapper[8731]: I1205 12:32:13.192489 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-ca-certs\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:13.192900 master-0 kubenswrapper[8731]: E1205 12:32:13.192850 8731 projected.go:288] Couldn't get configMap openshift-operator-controller/operator-controller-trusted-ca-bundle: configmap "operator-controller-trusted-ca-bundle" not found Dec 05 12:32:13.192999 master-0 kubenswrapper[8731]: E1205 12:32:13.192911 8731 projected.go:194] Error preparing data for projected volume ca-certs for pod openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k: configmap "operator-controller-trusted-ca-bundle" not found Dec 05 12:32:13.193062 master-0 kubenswrapper[8731]: E1205 12:32:13.193008 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-ca-certs podName:153fec1f-a10b-4c6c-a997-60fa80c13a86 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:14.19297436 +0000 UTC m=+32.496958707 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "ca-certs" (UniqueName: "kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-ca-certs") pod "operator-controller-controller-manager-7cbd59c7f8-d9g7k" (UID: "153fec1f-a10b-4c6c-a997-60fa80c13a86") : configmap "operator-controller-trusted-ca-bundle" not found Dec 05 12:32:13.229298 master-0 kubenswrapper[8731]: I1205 12:32:13.229226 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:13.244498 master-0 kubenswrapper[8731]: I1205 12:32:13.244427 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:13.293554 master-0 kubenswrapper[8731]: I1205 12:32:13.293481 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-image-import-ca\") pod \"67d4056d-507d-44a9-b238-300913e1b957\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " Dec 05 12:32:13.293554 master-0 kubenswrapper[8731]: I1205 12:32:13.293552 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-encryption-config\") pod \"67d4056d-507d-44a9-b238-300913e1b957\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " Dec 05 12:32:13.293906 master-0 kubenswrapper[8731]: I1205 12:32:13.293580 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/67d4056d-507d-44a9-b238-300913e1b957-node-pullsecrets\") pod \"67d4056d-507d-44a9-b238-300913e1b957\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " Dec 05 12:32:13.293906 master-0 kubenswrapper[8731]: I1205 12:32:13.293603 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-etcd-serving-ca\") pod \"67d4056d-507d-44a9-b238-300913e1b957\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " Dec 05 12:32:13.293906 master-0 kubenswrapper[8731]: I1205 12:32:13.293627 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw9g5\" (UniqueName: \"kubernetes.io/projected/67d4056d-507d-44a9-b238-300913e1b957-kube-api-access-sw9g5\") pod \"67d4056d-507d-44a9-b238-300913e1b957\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " Dec 05 12:32:13.293906 master-0 kubenswrapper[8731]: I1205 12:32:13.293649 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-trusted-ca-bundle\") pod \"67d4056d-507d-44a9-b238-300913e1b957\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " Dec 05 12:32:13.293906 master-0 kubenswrapper[8731]: I1205 12:32:13.293685 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67d4056d-507d-44a9-b238-300913e1b957-audit-dir\") pod \"67d4056d-507d-44a9-b238-300913e1b957\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " Dec 05 12:32:13.293906 master-0 kubenswrapper[8731]: I1205 12:32:13.293726 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-config\") pod \"67d4056d-507d-44a9-b238-300913e1b957\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " Dec 05 12:32:13.293906 master-0 kubenswrapper[8731]: I1205 12:32:13.293751 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-etcd-client\") pod \"67d4056d-507d-44a9-b238-300913e1b957\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " Dec 05 12:32:13.293906 master-0 kubenswrapper[8731]: I1205 12:32:13.293815 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/67d4056d-507d-44a9-b238-300913e1b957-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "67d4056d-507d-44a9-b238-300913e1b957" (UID: "67d4056d-507d-44a9-b238-300913e1b957"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:32:13.294217 master-0 kubenswrapper[8731]: I1205 12:32:13.294145 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67d4056d-507d-44a9-b238-300913e1b957-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "67d4056d-507d-44a9-b238-300913e1b957" (UID: "67d4056d-507d-44a9-b238-300913e1b957"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:32:13.294645 master-0 kubenswrapper[8731]: I1205 12:32:13.294572 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "67d4056d-507d-44a9-b238-300913e1b957" (UID: "67d4056d-507d-44a9-b238-300913e1b957"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:32:13.294927 master-0 kubenswrapper[8731]: I1205 12:32:13.294652 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "67d4056d-507d-44a9-b238-300913e1b957" (UID: "67d4056d-507d-44a9-b238-300913e1b957"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:32:13.294927 master-0 kubenswrapper[8731]: I1205 12:32:13.294749 8731 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-image-import-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:13.294927 master-0 kubenswrapper[8731]: I1205 12:32:13.294768 8731 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/67d4056d-507d-44a9-b238-300913e1b957-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:13.294927 master-0 kubenswrapper[8731]: I1205 12:32:13.294780 8731 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:13.294927 master-0 kubenswrapper[8731]: I1205 12:32:13.294792 8731 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/67d4056d-507d-44a9-b238-300913e1b957-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:13.295474 master-0 kubenswrapper[8731]: I1205 12:32:13.295440 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "67d4056d-507d-44a9-b238-300913e1b957" (UID: "67d4056d-507d-44a9-b238-300913e1b957"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:32:13.295546 master-0 kubenswrapper[8731]: I1205 12:32:13.295498 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-config" (OuterVolumeSpecName: "config") pod "67d4056d-507d-44a9-b238-300913e1b957" (UID: "67d4056d-507d-44a9-b238-300913e1b957"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:32:13.298744 master-0 kubenswrapper[8731]: I1205 12:32:13.298683 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d4056d-507d-44a9-b238-300913e1b957-kube-api-access-sw9g5" (OuterVolumeSpecName: "kube-api-access-sw9g5") pod "67d4056d-507d-44a9-b238-300913e1b957" (UID: "67d4056d-507d-44a9-b238-300913e1b957"). InnerVolumeSpecName "kube-api-access-sw9g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:32:13.298835 master-0 kubenswrapper[8731]: I1205 12:32:13.298750 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "67d4056d-507d-44a9-b238-300913e1b957" (UID: "67d4056d-507d-44a9-b238-300913e1b957"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:32:13.299877 master-0 kubenswrapper[8731]: I1205 12:32:13.299832 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "67d4056d-507d-44a9-b238-300913e1b957" (UID: "67d4056d-507d-44a9-b238-300913e1b957"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:32:13.384704 master-0 kubenswrapper[8731]: I1205 12:32:13.384635 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc"] Dec 05 12:32:13.386898 master-0 kubenswrapper[8731]: I1205 12:32:13.385275 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:13.388021 master-0 kubenswrapper[8731]: I1205 12:32:13.387959 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 12:32:13.388165 master-0 kubenswrapper[8731]: I1205 12:32:13.388133 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 12:32:13.388282 master-0 kubenswrapper[8731]: I1205 12:32:13.388141 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 12:32:13.389644 master-0 kubenswrapper[8731]: I1205 12:32:13.389585 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 12:32:13.389788 master-0 kubenswrapper[8731]: I1205 12:32:13.389642 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 12:32:13.398394 master-0 kubenswrapper[8731]: I1205 12:32:13.395790 8731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:13.398394 master-0 kubenswrapper[8731]: I1205 12:32:13.395834 8731 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-etcd-client\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:13.398394 master-0 kubenswrapper[8731]: I1205 12:32:13.395850 8731 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-encryption-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:13.398394 master-0 kubenswrapper[8731]: I1205 12:32:13.395864 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw9g5\" (UniqueName: \"kubernetes.io/projected/67d4056d-507d-44a9-b238-300913e1b957-kube-api-access-sw9g5\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:13.398394 master-0 kubenswrapper[8731]: I1205 12:32:13.395878 8731 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:13.399254 master-0 kubenswrapper[8731]: I1205 12:32:13.398875 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc"] Dec 05 12:32:13.496978 master-0 kubenswrapper[8731]: I1205 12:32:13.496905 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn7rl\" (UniqueName: \"kubernetes.io/projected/79ccdd97-da60-4ac7-a640-c640c45648f7-kube-api-access-jn7rl\") pod \"route-controller-manager-5c76cbf655-294dc\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:13.497243 master-0 kubenswrapper[8731]: I1205 12:32:13.497086 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert\") pod \"route-controller-manager-5c76cbf655-294dc\" 
(UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:13.497243 master-0 kubenswrapper[8731]: I1205 12:32:13.497147 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79ccdd97-da60-4ac7-a640-c640c45648f7-client-ca\") pod \"route-controller-manager-5c76cbf655-294dc\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:13.497243 master-0 kubenswrapper[8731]: I1205 12:32:13.497171 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79ccdd97-da60-4ac7-a640-c640c45648f7-config\") pod \"route-controller-manager-5c76cbf655-294dc\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:13.561950 master-0 kubenswrapper[8731]: I1205 12:32:13.561892 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-77c99c46b8-44qrw"] Dec 05 12:32:13.562652 master-0 kubenswrapper[8731]: I1205 12:32:13.562628 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:32:13.565963 master-0 kubenswrapper[8731]: I1205 12:32:13.565928 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 12:32:13.566076 master-0 kubenswrapper[8731]: I1205 12:32:13.566013 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 12:32:13.566245 master-0 kubenswrapper[8731]: I1205 12:32:13.566140 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 12:32:13.567330 master-0 kubenswrapper[8731]: I1205 12:32:13.567300 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 12:32:13.575611 master-0 kubenswrapper[8731]: I1205 12:32:13.575569 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-77c99c46b8-44qrw"] Dec 05 12:32:13.598295 master-0 kubenswrapper[8731]: I1205 12:32:13.598235 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert\") pod \"route-controller-manager-5c76cbf655-294dc\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:13.598602 master-0 kubenswrapper[8731]: I1205 12:32:13.598320 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cf8247a1-703a-46b3-9a33-25a73b27ab99-signing-cabundle\") pod \"service-ca-77c99c46b8-44qrw\" (UID: \"cf8247a1-703a-46b3-9a33-25a73b27ab99\") " pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:32:13.598602 master-0 kubenswrapper[8731]: I1205 12:32:13.598413 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79ccdd97-da60-4ac7-a640-c640c45648f7-client-ca\") pod 
\"route-controller-manager-5c76cbf655-294dc\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:13.598602 master-0 kubenswrapper[8731]: I1205 12:32:13.598432 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79ccdd97-da60-4ac7-a640-c640c45648f7-config\") pod \"route-controller-manager-5c76cbf655-294dc\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:13.598602 master-0 kubenswrapper[8731]: I1205 12:32:13.598459 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cf8247a1-703a-46b3-9a33-25a73b27ab99-signing-key\") pod \"service-ca-77c99c46b8-44qrw\" (UID: \"cf8247a1-703a-46b3-9a33-25a73b27ab99\") " pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:32:13.598602 master-0 kubenswrapper[8731]: E1205 12:32:13.598471 8731 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:32:13.598602 master-0 kubenswrapper[8731]: E1205 12:32:13.598577 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert podName:79ccdd97-da60-4ac7-a640-c640c45648f7 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:14.098548715 +0000 UTC m=+32.402532882 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert") pod "route-controller-manager-5c76cbf655-294dc" (UID: "79ccdd97-da60-4ac7-a640-c640c45648f7") : secret "serving-cert" not found Dec 05 12:32:13.598815 master-0 kubenswrapper[8731]: I1205 12:32:13.598485 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn7rl\" (UniqueName: \"kubernetes.io/projected/79ccdd97-da60-4ac7-a640-c640c45648f7-kube-api-access-jn7rl\") pod \"route-controller-manager-5c76cbf655-294dc\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:13.600222 master-0 kubenswrapper[8731]: I1205 12:32:13.598843 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqdxl\" (UniqueName: \"kubernetes.io/projected/cf8247a1-703a-46b3-9a33-25a73b27ab99-kube-api-access-fqdxl\") pod \"service-ca-77c99c46b8-44qrw\" (UID: \"cf8247a1-703a-46b3-9a33-25a73b27ab99\") " pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:32:13.600222 master-0 kubenswrapper[8731]: I1205 12:32:13.599751 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79ccdd97-da60-4ac7-a640-c640c45648f7-client-ca\") pod \"route-controller-manager-5c76cbf655-294dc\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:13.600222 master-0 kubenswrapper[8731]: I1205 12:32:13.599813 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79ccdd97-da60-4ac7-a640-c640c45648f7-config\") pod \"route-controller-manager-5c76cbf655-294dc\" (UID: 
\"79ccdd97-da60-4ac7-a640-c640c45648f7\") " pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:13.617330 master-0 kubenswrapper[8731]: I1205 12:32:13.617253 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn7rl\" (UniqueName: \"kubernetes.io/projected/79ccdd97-da60-4ac7-a640-c640c45648f7-kube-api-access-jn7rl\") pod \"route-controller-manager-5c76cbf655-294dc\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:13.646459 master-0 kubenswrapper[8731]: I1205 12:32:13.645629 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"] Dec 05 12:32:13.646459 master-0 kubenswrapper[8731]: I1205 12:32:13.646121 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 05 12:32:13.648590 master-0 kubenswrapper[8731]: I1205 12:32:13.648543 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Dec 05 12:32:13.659603 master-0 kubenswrapper[8731]: I1205 12:32:13.659544 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Dec 05 12:32:13.700531 master-0 kubenswrapper[8731]: I1205 12:32:13.700371 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96fa3513-5467-4b0f-a03d-9279d36317bd-kube-api-access\") pod \"installer-1-master-0\" (UID: \"96fa3513-5467-4b0f-a03d-9279d36317bd\") " pod="openshift-etcd/installer-1-master-0" Dec 05 12:32:13.700531 master-0 kubenswrapper[8731]: I1205 12:32:13.700423 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cf8247a1-703a-46b3-9a33-25a73b27ab99-signing-key\") pod \"service-ca-77c99c46b8-44qrw\" (UID: \"cf8247a1-703a-46b3-9a33-25a73b27ab99\") " pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:32:13.700531 master-0 kubenswrapper[8731]: I1205 12:32:13.700464 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/96fa3513-5467-4b0f-a03d-9279d36317bd-var-lock\") pod \"installer-1-master-0\" (UID: \"96fa3513-5467-4b0f-a03d-9279d36317bd\") " pod="openshift-etcd/installer-1-master-0" Dec 05 12:32:13.700531 master-0 kubenswrapper[8731]: I1205 12:32:13.700513 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqdxl\" (UniqueName: \"kubernetes.io/projected/cf8247a1-703a-46b3-9a33-25a73b27ab99-kube-api-access-fqdxl\") pod \"service-ca-77c99c46b8-44qrw\" (UID: \"cf8247a1-703a-46b3-9a33-25a73b27ab99\") " pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:32:13.700531 master-0 kubenswrapper[8731]: I1205 12:32:13.700537 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-serving-cert\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:13.700930 master-0 kubenswrapper[8731]: I1205 12:32:13.700576 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/96fa3513-5467-4b0f-a03d-9279d36317bd-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"96fa3513-5467-4b0f-a03d-9279d36317bd\") " pod="openshift-etcd/installer-1-master-0" Dec 05 12:32:13.700930 master-0 kubenswrapper[8731]: I1205 12:32:13.700593 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-audit\") pod \"apiserver-8645f66975-h6htr\" (UID: \"67d4056d-507d-44a9-b238-300913e1b957\") " pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:13.700930 master-0 kubenswrapper[8731]: I1205 12:32:13.700642 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cf8247a1-703a-46b3-9a33-25a73b27ab99-signing-cabundle\") pod \"service-ca-77c99c46b8-44qrw\" (UID: \"cf8247a1-703a-46b3-9a33-25a73b27ab99\") " pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:32:13.701582 master-0 kubenswrapper[8731]: I1205 12:32:13.701551 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cf8247a1-703a-46b3-9a33-25a73b27ab99-signing-cabundle\") pod \"service-ca-77c99c46b8-44qrw\" (UID: \"cf8247a1-703a-46b3-9a33-25a73b27ab99\") " pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:32:13.701672 master-0 kubenswrapper[8731]: E1205 12:32:13.701652 8731 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 05 12:32:13.701741 master-0 kubenswrapper[8731]: E1205 12:32:13.701703 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-serving-cert podName:67d4056d-507d-44a9-b238-300913e1b957 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:17.701689105 +0000 UTC m=+36.005673272 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-serving-cert") pod "apiserver-8645f66975-h6htr" (UID: "67d4056d-507d-44a9-b238-300913e1b957") : secret "serving-cert" not found Dec 05 12:32:13.701793 master-0 kubenswrapper[8731]: E1205 12:32:13.701749 8731 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Dec 05 12:32:13.701793 master-0 kubenswrapper[8731]: E1205 12:32:13.701769 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-audit podName:67d4056d-507d-44a9-b238-300913e1b957 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:17.701763127 +0000 UTC m=+36.005747294 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-audit") pod "apiserver-8645f66975-h6htr" (UID: "67d4056d-507d-44a9-b238-300913e1b957") : configmap "audit-0" not found Dec 05 12:32:13.709973 master-0 kubenswrapper[8731]: I1205 12:32:13.709930 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cf8247a1-703a-46b3-9a33-25a73b27ab99-signing-key\") pod \"service-ca-77c99c46b8-44qrw\" (UID: \"cf8247a1-703a-46b3-9a33-25a73b27ab99\") " pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:32:13.722626 master-0 kubenswrapper[8731]: I1205 12:32:13.722572 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqdxl\" (UniqueName: \"kubernetes.io/projected/cf8247a1-703a-46b3-9a33-25a73b27ab99-kube-api-access-fqdxl\") pod \"service-ca-77c99c46b8-44qrw\" (UID: \"cf8247a1-703a-46b3-9a33-25a73b27ab99\") " pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:32:13.801566 master-0 kubenswrapper[8731]: I1205 12:32:13.801428 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96fa3513-5467-4b0f-a03d-9279d36317bd-kube-api-access\") pod \"installer-1-master-0\" (UID: \"96fa3513-5467-4b0f-a03d-9279d36317bd\") " pod="openshift-etcd/installer-1-master-0" Dec 05 12:32:13.801566 master-0 kubenswrapper[8731]: I1205 12:32:13.801515 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/96fa3513-5467-4b0f-a03d-9279d36317bd-var-lock\") pod \"installer-1-master-0\" (UID: \"96fa3513-5467-4b0f-a03d-9279d36317bd\") " pod="openshift-etcd/installer-1-master-0" Dec 05 12:32:13.801998 master-0 kubenswrapper[8731]: I1205 12:32:13.801937 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96fa3513-5467-4b0f-a03d-9279d36317bd-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"96fa3513-5467-4b0f-a03d-9279d36317bd\") " pod="openshift-etcd/installer-1-master-0" Dec 05 12:32:13.801998 master-0 kubenswrapper[8731]: I1205 12:32:13.801981 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/96fa3513-5467-4b0f-a03d-9279d36317bd-var-lock\") pod \"installer-1-master-0\" (UID: \"96fa3513-5467-4b0f-a03d-9279d36317bd\") " pod="openshift-etcd/installer-1-master-0" Dec 05 12:32:13.802190 master-0 kubenswrapper[8731]: I1205 12:32:13.802129 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96fa3513-5467-4b0f-a03d-9279d36317bd-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"96fa3513-5467-4b0f-a03d-9279d36317bd\") " pod="openshift-etcd/installer-1-master-0" Dec 05 12:32:13.822858 master-0 kubenswrapper[8731]: I1205 12:32:13.822797 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96fa3513-5467-4b0f-a03d-9279d36317bd-kube-api-access\") pod \"installer-1-master-0\" (UID: \"96fa3513-5467-4b0f-a03d-9279d36317bd\") " pod="openshift-etcd/installer-1-master-0" Dec 05 12:32:13.882379 master-0 kubenswrapper[8731]: I1205 12:32:13.882295 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:32:13.969358 master-0 kubenswrapper[8731]: I1205 12:32:13.969307 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 05 12:32:14.077950 master-0 kubenswrapper[8731]: I1205 12:32:14.077868 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-77c99c46b8-44qrw"] Dec 05 12:32:14.087640 master-0 kubenswrapper[8731]: W1205 12:32:14.087092 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf8247a1_703a_46b3_9a33_25a73b27ab99.slice/crio-9f1e76d4f58fcd22a9b3bb1871e5fda992687c0e5181ed09e4aadbd1b7953465 WatchSource:0}: Error finding container 9f1e76d4f58fcd22a9b3bb1871e5fda992687c0e5181ed09e4aadbd1b7953465: Status 404 returned error can't find the container with id 9f1e76d4f58fcd22a9b3bb1871e5fda992687c0e5181ed09e4aadbd1b7953465 Dec 05 12:32:14.106049 master-0 kubenswrapper[8731]: I1205 12:32:14.105986 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert\") pod \"route-controller-manager-5c76cbf655-294dc\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:14.106271 master-0 kubenswrapper[8731]: E1205 12:32:14.106241 8731 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:32:14.106340 master-0 kubenswrapper[8731]: E1205 12:32:14.106322 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert podName:79ccdd97-da60-4ac7-a640-c640c45648f7 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:15.106302955 +0000 UTC m=+33.410287122 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert") pod "route-controller-manager-5c76cbf655-294dc" (UID: "79ccdd97-da60-4ac7-a640-c640c45648f7") : secret "serving-cert" not found Dec 05 12:32:14.131525 master-0 kubenswrapper[8731]: I1205 12:32:14.131470 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:32:14.207145 master-0 kubenswrapper[8731]: I1205 12:32:14.207083 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-ca-certs\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:14.213338 master-0 kubenswrapper[8731]: I1205 12:32:14.213275 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-ca-certs\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:14.234703 master-0 kubenswrapper[8731]: I1205 12:32:14.234635 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" event={"ID":"cf8247a1-703a-46b3-9a33-25a73b27ab99","Type":"ContainerStarted","Data":"9f1e76d4f58fcd22a9b3bb1871e5fda992687c0e5181ed09e4aadbd1b7953465"} Dec 05 12:32:14.234703 master-0 kubenswrapper[8731]: I1205 12:32:14.234678 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-8645f66975-h6htr" Dec 05 12:32:14.340211 master-0 kubenswrapper[8731]: I1205 12:32:14.340077 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:14.511455 master-0 kubenswrapper[8731]: I1205 12:32:14.511279 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:14.511720 master-0 kubenswrapper[8731]: E1205 12:32:14.511556 8731 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Dec 05 12:32:14.511720 master-0 kubenswrapper[8731]: E1205 12:32:14.511680 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs podName:3b741029-0eb5-409b-b7f1-95e8385dc400 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:18.511652784 +0000 UTC m=+36.815636941 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs") pod "catalogd-controller-manager-7cc89f4c4c-n28z2" (UID: "3b741029-0eb5-409b-b7f1-95e8385dc400") : secret "catalogserver-cert" not found Dec 05 12:32:14.713906 master-0 kubenswrapper[8731]: I1205 12:32:14.713739 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:32:14.714394 master-0 kubenswrapper[8731]: E1205 12:32:14.714065 8731 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 05 12:32:14.714394 master-0 kubenswrapper[8731]: I1205 12:32:14.714237 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:32:14.714394 master-0 kubenswrapper[8731]: E1205 12:32:14.714267 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs podName:fb7003a6-4341-49eb-bec3-76ba8610fa12 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:46.714150544 +0000 UTC m=+65.018134901 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs") pod "network-metrics-daemon-99djw" (UID: "fb7003a6-4341-49eb-bec3-76ba8610fa12") : secret "metrics-daemon-secret" not found Dec 05 12:32:14.714394 master-0 kubenswrapper[8731]: E1205 12:32:14.714358 8731 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 05 12:32:14.714821 master-0 kubenswrapper[8731]: I1205 12:32:14.714397 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:32:14.714821 master-0 kubenswrapper[8731]: E1205 12:32:14.714428 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls podName:38941513-e968-45f1-9cb2-b63d40338f36 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:46.714408801 +0000 UTC m=+65.018392998 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls") pod "cluster-image-registry-operator-6fb9f88b7-sxxpq" (UID: "38941513-e968-45f1-9cb2-b63d40338f36") : secret "image-registry-operator-tls" not found Dec 05 12:32:14.714821 master-0 kubenswrapper[8731]: I1205 12:32:14.714650 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:32:14.714821 master-0 kubenswrapper[8731]: I1205 12:32:14.714700 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:32:14.714821 master-0 kubenswrapper[8731]: E1205 12:32:14.714653 8731 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:32:14.714821 master-0 kubenswrapper[8731]: I1205 12:32:14.714781 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:32:14.715280 master-0 kubenswrapper[8731]: E1205 12:32:14.714742 8731 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 05 12:32:14.715280 master-0 kubenswrapper[8731]: E1205 12:32:14.714858 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls podName:5f0c6889-0739-48a3-99cd-6db9d1f83242 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:46.714824661 +0000 UTC m=+65.018808868 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls") pod "dns-operator-7c56cf9b74-z9g7c" (UID: "5f0c6889-0739-48a3-99cd-6db9d1f83242") : secret "metrics-tls" not found Dec 05 12:32:14.715280 master-0 kubenswrapper[8731]: E1205 12:32:14.714917 8731 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 05 12:32:14.715280 master-0 kubenswrapper[8731]: E1205 12:32:14.714968 8731 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 05 12:32:14.715280 master-0 kubenswrapper[8731]: I1205 12:32:14.714968 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:32:14.715280 master-0 kubenswrapper[8731]: E1205 12:32:14.715037 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:46.715007376 +0000 UTC m=+65.018991583 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "performance-addon-operator-webhook-cert" not found Dec 05 12:32:14.715280 master-0 kubenswrapper[8731]: E1205 12:32:14.715074 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls podName:a2acba71-b9dc-4b85-be35-c995b8be2f19 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:46.715059277 +0000 UTC m=+65.019043484 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls") pod "cluster-node-tuning-operator-85cff47f46-p9xtc" (UID: "a2acba71-b9dc-4b85-be35-c995b8be2f19") : secret "node-tuning-operator-tls" not found Dec 05 12:32:14.715280 master-0 kubenswrapper[8731]: E1205 12:32:14.715111 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls podName:a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:46.715086868 +0000 UTC m=+65.019071075 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls") pod "ingress-operator-8649c48786-7xrk6" (UID: "a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7") : secret "metrics-tls" not found Dec 05 12:32:14.715280 master-0 kubenswrapper[8731]: I1205 12:32:14.715157 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:32:14.715906 master-0 kubenswrapper[8731]: E1205 12:32:14.715401 8731 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 05 12:32:14.715906 master-0 kubenswrapper[8731]: E1205 12:32:14.715446 8731 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 05 12:32:14.715906 master-0 kubenswrapper[8731]: E1205 12:32:14.715489 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls podName:58187662-b502-4d90-95ce-2aa91a81d256 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:46.715437558 +0000 UTC m=+65.019421765 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-7ff994598c-lgc7z" (UID: "58187662-b502-4d90-95ce-2aa91a81d256") : secret "cluster-monitoring-operator-tls" not found Dec 05 12:32:14.715906 master-0 kubenswrapper[8731]: I1205 12:32:14.715623 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:32:14.715906 master-0 kubenswrapper[8731]: E1205 12:32:14.715699 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert podName:29812c4b-48ac-488c-863c-1d52e39ea2ae nodeName:}" failed. No retries permitted until 2025-12-05 12:32:46.715676474 +0000 UTC m=+65.019660681 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert") pod "cluster-version-operator-77dfcc565f-2chqh" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae") : secret "cluster-version-operator-serving-cert" not found Dec 05 12:32:14.715906 master-0 kubenswrapper[8731]: E1205 12:32:14.715755 8731 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 05 12:32:14.715906 master-0 kubenswrapper[8731]: I1205 12:32:14.715775 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:32:14.715906 master-0 kubenswrapper[8731]: E1205 12:32:14.715807 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert podName:0dda6d9b-cb3a-413a-85af-ef08f15ea42e nodeName:}" failed. No retries permitted until 2025-12-05 12:32:46.715791117 +0000 UTC m=+65.019775324 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert") pod "package-server-manager-67477646d4-9vfxw" (UID: "0dda6d9b-cb3a-413a-85af-ef08f15ea42e") : secret "package-server-manager-serving-cert" not found Dec 05 12:32:14.715906 master-0 kubenswrapper[8731]: I1205 12:32:14.715852 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:32:14.715906 master-0 kubenswrapper[8731]: E1205 12:32:14.715879 8731 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 05 12:32:14.715906 master-0 kubenswrapper[8731]: E1205 12:32:14.715930 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics podName:1e6babfe-724a-4eab-bb3b-bc318bf57b70 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:46.71590353 +0000 UTC m=+65.019887697 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics") pod "marketplace-operator-f797b99b6-vwhxt" (UID: "1e6babfe-724a-4eab-bb3b-bc318bf57b70") : secret "marketplace-operator-metrics" not found Dec 05 12:32:14.716646 master-0 kubenswrapper[8731]: E1205 12:32:14.716047 8731 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 05 12:32:14.716646 master-0 kubenswrapper[8731]: E1205 12:32:14.716115 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs podName:cfc37275-4e59-4f73-8b08-c8ca8ec28bbb nodeName:}" failed. No retries permitted until 2025-12-05 12:32:46.716100175 +0000 UTC m=+65.020084392 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs") pod "multus-admission-controller-7dfc5b745f-xlrzq" (UID: "cfc37275-4e59-4f73-8b08-c8ca8ec28bbb") : secret "multus-admission-controller-secret" not found Dec 05 12:32:15.019064 master-0 kubenswrapper[8731]: I1205 12:32:15.018960 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:15.019447 master-0 kubenswrapper[8731]: E1205 12:32:15.019297 8731 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:32:15.019511 master-0 kubenswrapper[8731]: E1205 12:32:15.019463 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert podName:aa67674b-53bd-45d9-a217-915ed52ff870 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:19.019433174 +0000 UTC m=+37.323417551 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert") pod "controller-manager-58c47b4bcf-j2srw" (UID: "aa67674b-53bd-45d9-a217-915ed52ff870") : secret "serving-cert" not found Dec 05 12:32:15.121367 master-0 kubenswrapper[8731]: I1205 12:32:15.121246 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert\") pod \"route-controller-manager-5c76cbf655-294dc\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:15.121788 master-0 kubenswrapper[8731]: E1205 12:32:15.121456 8731 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:32:15.121788 master-0 kubenswrapper[8731]: E1205 12:32:15.121557 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert podName:79ccdd97-da60-4ac7-a640-c640c45648f7 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:17.121524806 +0000 UTC m=+35.425509013 (durationBeforeRetry 2s). 
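What SetUp is trying to produce for each of these volumes is just a directory of files under the pod's volumes path (the same /var/lib/kubelet/pods/<uid>/volumes tree the "Cleaned up orphaned pod volumes dir" entry later refers to): one file per Secret data key. Until the Secret exists there is nothing to project, hence the failures above. The sketch below is a simplified stand-in for that projection step, assuming the usual kubernetes.io~secret layout; it is not the kubelet's actual secret volume plugin, which additionally mounts a tmpfs, applies file modes from the pod spec, and swaps files in atomically.

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// projectSecret writes each data key of a Secret as a file under
// <base>/<podUID>/volumes/kubernetes.io~secret/<volume>, which is roughly
// what a successful MountVolume.SetUp leaves on disk.
func projectSecret(base, podUID, volume string, data map[string][]byte) (string, error) {
	dir := filepath.Join(base, podUID, "volumes/kubernetes.io~secret", volume)
	if err := os.MkdirAll(dir, 0o755); err != nil {
		return "", err
	}
	for key, value := range data {
		if err := os.WriteFile(filepath.Join(dir, key), value, 0o600); err != nil {
			return "", err
		}
	}
	return dir, nil
}

func main() {
	// Stand-in for /var/lib/kubelet/pods so the demo is harmless to run.
	base, _ := os.MkdirTemp("", "kubelet-pods-demo")
	// Hypothetical contents; in the log the real serving-cert secret appears later.
	dir, err := projectSecret(base, "aa67674b-53bd-45d9-a217-915ed52ff870", "serving-cert",
		map[string][]byte{"tls.crt": []byte("..."), "tls.key": []byte("...")})
	fmt.Println(dir, err)
}
```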
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert") pod "route-controller-manager-5c76cbf655-294dc" (UID: "79ccdd97-da60-4ac7-a640-c640c45648f7") : secret "serving-cert" not found Dec 05 12:32:16.660225 master-0 kubenswrapper[8731]: I1205 12:32:16.658027 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Dec 05 12:32:16.678860 master-0 kubenswrapper[8731]: W1205 12:32:16.678764 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod96fa3513_5467_4b0f_a03d_9279d36317bd.slice/crio-89350a747cdc136c0874cbdedf75ff768f3aa173665ba78cb7204afab7285a1e WatchSource:0}: Error finding container 89350a747cdc136c0874cbdedf75ff768f3aa173665ba78cb7204afab7285a1e: Status 404 returned error can't find the container with id 89350a747cdc136c0874cbdedf75ff768f3aa173665ba78cb7204afab7285a1e Dec 05 12:32:16.753211 master-0 kubenswrapper[8731]: I1205 12:32:16.749788 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k"] Dec 05 12:32:16.897079 master-0 kubenswrapper[8731]: I1205 12:32:16.896934 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-845d4454f8-kcq9s"] Dec 05 12:32:16.897708 master-0 kubenswrapper[8731]: I1205 12:32:16.897684 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:16.916689 master-0 kubenswrapper[8731]: I1205 12:32:16.908773 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 12:32:16.916689 master-0 kubenswrapper[8731]: I1205 12:32:16.908975 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 12:32:16.916689 master-0 kubenswrapper[8731]: I1205 12:32:16.909084 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 12:32:16.916689 master-0 kubenswrapper[8731]: I1205 12:32:16.909208 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 12:32:16.932695 master-0 kubenswrapper[8731]: I1205 12:32:16.929857 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 12:32:16.932695 master-0 kubenswrapper[8731]: I1205 12:32:16.930056 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 12:32:16.932695 master-0 kubenswrapper[8731]: I1205 12:32:16.930314 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 12:32:16.932695 master-0 kubenswrapper[8731]: I1205 12:32:16.931502 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 12:32:16.933673 master-0 kubenswrapper[8731]: I1205 12:32:16.933462 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 12:32:16.934736 master-0 kubenswrapper[8731]: I1205 12:32:16.934687 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-8645f66975-h6htr"] Dec 05 12:32:16.945971 master-0 kubenswrapper[8731]: I1205 12:32:16.945898 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-apiserver/apiserver-8645f66975-h6htr"] Dec 05 12:32:16.967382 master-0 kubenswrapper[8731]: I1205 12:32:16.967324 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh58c\" (UniqueName: \"kubernetes.io/projected/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-kube-api-access-dh58c\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:16.967382 master-0 kubenswrapper[8731]: I1205 12:32:16.967400 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-etcd-serving-ca\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:16.967765 master-0 kubenswrapper[8731]: I1205 12:32:16.967437 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-serving-cert\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:16.967765 master-0 kubenswrapper[8731]: I1205 12:32:16.967477 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-audit-dir\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:16.967765 master-0 kubenswrapper[8731]: I1205 12:32:16.967547 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-node-pullsecrets\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:16.967963 master-0 kubenswrapper[8731]: I1205 12:32:16.967761 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-audit\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:16.967963 master-0 kubenswrapper[8731]: I1205 12:32:16.967874 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-config\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:16.968137 master-0 kubenswrapper[8731]: I1205 12:32:16.968081 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-trusted-ca-bundle\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:16.968357 master-0 kubenswrapper[8731]: I1205 12:32:16.968310 8731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-encryption-config\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:16.968565 master-0 kubenswrapper[8731]: I1205 12:32:16.968528 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-etcd-client\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:16.968672 master-0 kubenswrapper[8731]: I1205 12:32:16.968635 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-image-import-ca\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:16.969138 master-0 kubenswrapper[8731]: I1205 12:32:16.969092 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 12:32:16.970868 master-0 kubenswrapper[8731]: I1205 12:32:16.970835 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-845d4454f8-kcq9s"] Dec 05 12:32:17.070805 master-0 kubenswrapper[8731]: I1205 12:32:17.070703 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-etcd-client\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.070957 master-0 kubenswrapper[8731]: I1205 12:32:17.070823 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-image-import-ca\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.070957 master-0 kubenswrapper[8731]: I1205 12:32:17.070877 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh58c\" (UniqueName: \"kubernetes.io/projected/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-kube-api-access-dh58c\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.071092 master-0 kubenswrapper[8731]: I1205 12:32:17.071062 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-etcd-serving-ca\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.071586 master-0 kubenswrapper[8731]: I1205 12:32:17.071541 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-serving-cert\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " 
pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.071646 master-0 kubenswrapper[8731]: I1205 12:32:17.071611 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-audit-dir\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.071646 master-0 kubenswrapper[8731]: I1205 12:32:17.071640 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-node-pullsecrets\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.071744 master-0 kubenswrapper[8731]: E1205 12:32:17.071708 8731 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 05 12:32:17.071844 master-0 kubenswrapper[8731]: E1205 12:32:17.071803 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-serving-cert podName:f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb nodeName:}" failed. No retries permitted until 2025-12-05 12:32:17.571764101 +0000 UTC m=+35.875748278 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-serving-cert") pod "apiserver-845d4454f8-kcq9s" (UID: "f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb") : secret "serving-cert" not found Dec 05 12:32:17.071844 master-0 kubenswrapper[8731]: I1205 12:32:17.071720 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-audit\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.071922 master-0 kubenswrapper[8731]: I1205 12:32:17.071874 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-node-pullsecrets\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.071969 master-0 kubenswrapper[8731]: I1205 12:32:17.071808 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-audit-dir\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.072005 master-0 kubenswrapper[8731]: I1205 12:32:17.071965 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-config\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.072208 master-0 kubenswrapper[8731]: I1205 12:32:17.072150 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-trusted-ca-bundle\") pod 
\"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.072337 master-0 kubenswrapper[8731]: I1205 12:32:17.072245 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-encryption-config\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.072337 master-0 kubenswrapper[8731]: I1205 12:32:17.072309 8731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67d4056d-507d-44a9-b238-300913e1b957-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:17.072337 master-0 kubenswrapper[8731]: I1205 12:32:17.072327 8731 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/67d4056d-507d-44a9-b238-300913e1b957-audit\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:17.072676 master-0 kubenswrapper[8731]: I1205 12:32:17.072643 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-etcd-serving-ca\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.072799 master-0 kubenswrapper[8731]: I1205 12:32:17.072762 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-image-import-ca\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.076492 master-0 kubenswrapper[8731]: I1205 12:32:17.072937 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-config\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.076492 master-0 kubenswrapper[8731]: I1205 12:32:17.073493 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-trusted-ca-bundle\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.076492 master-0 kubenswrapper[8731]: I1205 12:32:17.073575 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-audit\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.077823 master-0 kubenswrapper[8731]: I1205 12:32:17.077785 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-encryption-config\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.081993 master-0 kubenswrapper[8731]: I1205 12:32:17.081870 8731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-etcd-client\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.173262 master-0 kubenswrapper[8731]: I1205 12:32:17.173092 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert\") pod \"route-controller-manager-5c76cbf655-294dc\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:17.173572 master-0 kubenswrapper[8731]: E1205 12:32:17.173512 8731 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:32:17.173702 master-0 kubenswrapper[8731]: E1205 12:32:17.173676 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert podName:79ccdd97-da60-4ac7-a640-c640c45648f7 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:21.173639727 +0000 UTC m=+39.477623934 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert") pod "route-controller-manager-5c76cbf655-294dc" (UID: "79ccdd97-da60-4ac7-a640-c640c45648f7") : secret "serving-cert" not found Dec 05 12:32:17.255928 master-0 kubenswrapper[8731]: I1205 12:32:17.255249 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" event={"ID":"cf8247a1-703a-46b3-9a33-25a73b27ab99","Type":"ContainerStarted","Data":"48561b92390271bf5bcb9ad8430184be011980a59e84d647901af23e4a1dea25"} Dec 05 12:32:17.256587 master-0 kubenswrapper[8731]: I1205 12:32:17.256527 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"96fa3513-5467-4b0f-a03d-9279d36317bd","Type":"ContainerStarted","Data":"89350a747cdc136c0874cbdedf75ff768f3aa173665ba78cb7204afab7285a1e"} Dec 05 12:32:17.257576 master-0 kubenswrapper[8731]: I1205 12:32:17.257520 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" event={"ID":"153fec1f-a10b-4c6c-a997-60fa80c13a86","Type":"ContainerStarted","Data":"07bd9adb3dd2a54b1348564cac3ab912144772686d957ab49d9bf60d68718f5e"} Dec 05 12:32:17.318325 master-0 kubenswrapper[8731]: I1205 12:32:17.318242 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh58c\" (UniqueName: \"kubernetes.io/projected/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-kube-api-access-dh58c\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.579071 master-0 kubenswrapper[8731]: I1205 12:32:17.577552 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-serving-cert\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:17.579071 master-0 kubenswrapper[8731]: E1205 12:32:17.577826 8731 secret.go:189] 
Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 05 12:32:17.579071 master-0 kubenswrapper[8731]: E1205 12:32:17.577953 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-serving-cert podName:f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb nodeName:}" failed. No retries permitted until 2025-12-05 12:32:18.577928958 +0000 UTC m=+36.881913125 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-serving-cert") pod "apiserver-845d4454f8-kcq9s" (UID: "f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb") : secret "serving-cert" not found Dec 05 12:32:17.941314 master-0 kubenswrapper[8731]: I1205 12:32:17.940915 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67d4056d-507d-44a9-b238-300913e1b957" path="/var/lib/kubelet/pods/67d4056d-507d-44a9-b238-300913e1b957/volumes" Dec 05 12:32:18.264898 master-0 kubenswrapper[8731]: I1205 12:32:18.264697 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"96fa3513-5467-4b0f-a03d-9279d36317bd","Type":"ContainerStarted","Data":"0a7d145dbed8d32146e90821257e92134c8804dafe8896f59ec88530e6ad0c4e"} Dec 05 12:32:18.267446 master-0 kubenswrapper[8731]: I1205 12:32:18.267389 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" event={"ID":"153fec1f-a10b-4c6c-a997-60fa80c13a86","Type":"ContainerStarted","Data":"919a5a586c053c933b88b4faaf4716d63b0ce72dea0802a0de12305677effe13"} Dec 05 12:32:18.267877 master-0 kubenswrapper[8731]: I1205 12:32:18.267834 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" event={"ID":"153fec1f-a10b-4c6c-a997-60fa80c13a86","Type":"ContainerStarted","Data":"b02b74337c561023bb77d95397661e10a1ee5fc12d28b2fd7ee9556bbaba81e5"} Dec 05 12:32:18.329985 master-0 kubenswrapper[8731]: I1205 12:32:18.329895 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-1-master-0" podStartSLOduration=5.329869352 podStartE2EDuration="5.329869352s" podCreationTimestamp="2025-12-05 12:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:32:18.328969809 +0000 UTC m=+36.632953986" watchObservedRunningTime="2025-12-05 12:32:18.329869352 +0000 UTC m=+36.633853519" Dec 05 12:32:18.330342 master-0 kubenswrapper[8731]: I1205 12:32:18.330155 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" podStartSLOduration=5.330148039 podStartE2EDuration="5.330148039s" podCreationTimestamp="2025-12-05 12:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:32:17.321599204 +0000 UTC m=+35.625583371" watchObservedRunningTime="2025-12-05 12:32:18.330148039 +0000 UTC m=+36.634132206" Dec 05 12:32:18.364209 master-0 kubenswrapper[8731]: I1205 12:32:18.364103 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" podStartSLOduration=6.364085132 podStartE2EDuration="6.364085132s" 
podCreationTimestamp="2025-12-05 12:32:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:32:18.363907937 +0000 UTC m=+36.667892104" watchObservedRunningTime="2025-12-05 12:32:18.364085132 +0000 UTC m=+36.668069299" Dec 05 12:32:18.590764 master-0 kubenswrapper[8731]: I1205 12:32:18.590597 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-serving-cert\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:18.591022 master-0 kubenswrapper[8731]: I1205 12:32:18.590926 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:18.595827 master-0 kubenswrapper[8731]: I1205 12:32:18.595759 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-serving-cert\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:18.598316 master-0 kubenswrapper[8731]: I1205 12:32:18.597307 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:18.615578 master-0 kubenswrapper[8731]: I1205 12:32:18.610083 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 05 12:32:18.615578 master-0 kubenswrapper[8731]: I1205 12:32:18.610523 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-1-master-0" podUID="26b7da93-bb3a-48c9-a2dc-d91c73db5578" containerName="installer" containerID="cri-o://a96e1efb63148942b97853dca9df4ffab2547d1232ce0445c574be5258ea9b25" gracePeriod=30 Dec 05 12:32:18.734960 master-0 kubenswrapper[8731]: I1205 12:32:18.734850 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:18.790174 master-0 kubenswrapper[8731]: I1205 12:32:18.790070 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:18.961168 master-0 kubenswrapper[8731]: I1205 12:32:18.961084 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2"] Dec 05 12:32:18.968946 master-0 kubenswrapper[8731]: W1205 12:32:18.968879 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b741029_0eb5_409b_b7f1_95e8385dc400.slice/crio-029b733e2c6ad9f0e336ec7c4af189bd8388fe0d1d5f30c3280c2f24f4c1e475 WatchSource:0}: Error finding container 029b733e2c6ad9f0e336ec7c4af189bd8388fe0d1d5f30c3280c2f24f4c1e475: Status 404 returned error can't find the container with id 029b733e2c6ad9f0e336ec7c4af189bd8388fe0d1d5f30c3280c2f24f4c1e475 Dec 05 12:32:18.998799 master-0 kubenswrapper[8731]: I1205 12:32:18.998752 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-845d4454f8-kcq9s"] Dec 05 12:32:19.101558 master-0 kubenswrapper[8731]: I1205 12:32:19.101437 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:19.101789 master-0 kubenswrapper[8731]: E1205 12:32:19.101656 8731 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:32:19.101789 master-0 kubenswrapper[8731]: E1205 12:32:19.101745 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert podName:aa67674b-53bd-45d9-a217-915ed52ff870 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:27.101723199 +0000 UTC m=+45.405707376 (durationBeforeRetry 8s). 
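The durationBeforeRetry values attached to these failures (500ms, 1s, 2s, 4s, 8s, and the 32s seen earlier) show the per-operation exponential backoff applied in nestedpendingoperations: each consecutive failure of the same volume operation doubles the wait before the next SetUp attempt, which matches the 2s → 4s progression for the route-controller-manager serving-cert volume and the 4s → 8s progression for the controller-manager one. A minimal sketch of that doubling follows; the initial delay and the cap are assumptions about the kubelet's settings, since the log only confirms the doubling itself.

```go
package main

import (
	"fmt"
	"time"
)

// Doubling backoff matching the durationBeforeRetry progression in the log.
func main() {
	delay := 500 * time.Millisecond          // assumed initial delay
	maxDelay := 2*time.Minute + 2*time.Second // assumed cap
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("attempt %d failed: no retries permitted for %s\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```

Running it prints 500ms, 1s, 2s, 4s, 8s, 16s, 32s, 1m4s, i.e. the same intervals the reconciler reports above for volumes whose secrets keep failing to resolve.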
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert") pod "controller-manager-58c47b4bcf-j2srw" (UID: "aa67674b-53bd-45d9-a217-915ed52ff870") : secret "serving-cert" not found Dec 05 12:32:19.275024 master-0 kubenswrapper[8731]: I1205 12:32:19.274385 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" event={"ID":"3b741029-0eb5-409b-b7f1-95e8385dc400","Type":"ContainerStarted","Data":"48416d2553549ef2df4e4b21da938432c85035a334034a6b191574d20869a9df"} Dec 05 12:32:19.275024 master-0 kubenswrapper[8731]: I1205 12:32:19.274947 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" event={"ID":"3b741029-0eb5-409b-b7f1-95e8385dc400","Type":"ContainerStarted","Data":"029b733e2c6ad9f0e336ec7c4af189bd8388fe0d1d5f30c3280c2f24f4c1e475"} Dec 05 12:32:19.276511 master-0 kubenswrapper[8731]: I1205 12:32:19.276443 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" event={"ID":"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb","Type":"ContainerStarted","Data":"04f451fea9668a794e9e554df0005ce70f405943bf1c6d084959d7f333152fc6"} Dec 05 12:32:19.276980 master-0 kubenswrapper[8731]: I1205 12:32:19.276932 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:20.283065 master-0 kubenswrapper[8731]: I1205 12:32:20.282946 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" event={"ID":"3b741029-0eb5-409b-b7f1-95e8385dc400","Type":"ContainerStarted","Data":"73f6bfa12151c71020cd1cc8c48ebdf6c4c24dbf1a05b4873ce05f073bdcce94"} Dec 05 12:32:20.802861 master-0 kubenswrapper[8731]: I1205 12:32:20.802763 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" podStartSLOduration=11.802721987 podStartE2EDuration="11.802721987s" podCreationTimestamp="2025-12-05 12:32:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:32:20.299699362 +0000 UTC m=+38.603683529" watchObservedRunningTime="2025-12-05 12:32:20.802721987 +0000 UTC m=+39.106706154" Dec 05 12:32:20.804633 master-0 kubenswrapper[8731]: I1205 12:32:20.804599 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 05 12:32:20.805361 master-0 kubenswrapper[8731]: I1205 12:32:20.805324 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Dec 05 12:32:20.814659 master-0 kubenswrapper[8731]: I1205 12:32:20.814600 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 05 12:32:20.932356 master-0 kubenswrapper[8731]: I1205 12:32:20.932290 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89e6ecfe-a9da-45b5-b388-41cc8856934b-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"89e6ecfe-a9da-45b5-b388-41cc8856934b\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 05 12:32:20.932676 master-0 kubenswrapper[8731]: I1205 12:32:20.932412 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89e6ecfe-a9da-45b5-b388-41cc8856934b-kube-api-access\") pod \"installer-2-master-0\" (UID: \"89e6ecfe-a9da-45b5-b388-41cc8856934b\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 05 12:32:20.932676 master-0 kubenswrapper[8731]: I1205 12:32:20.932635 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/89e6ecfe-a9da-45b5-b388-41cc8856934b-var-lock\") pod \"installer-2-master-0\" (UID: \"89e6ecfe-a9da-45b5-b388-41cc8856934b\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 05 12:32:21.034171 master-0 kubenswrapper[8731]: I1205 12:32:21.034095 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/89e6ecfe-a9da-45b5-b388-41cc8856934b-var-lock\") pod \"installer-2-master-0\" (UID: \"89e6ecfe-a9da-45b5-b388-41cc8856934b\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 05 12:32:21.034456 master-0 kubenswrapper[8731]: I1205 12:32:21.034338 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/89e6ecfe-a9da-45b5-b388-41cc8856934b-var-lock\") pod \"installer-2-master-0\" (UID: \"89e6ecfe-a9da-45b5-b388-41cc8856934b\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 05 12:32:21.034640 master-0 kubenswrapper[8731]: I1205 12:32:21.034605 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89e6ecfe-a9da-45b5-b388-41cc8856934b-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"89e6ecfe-a9da-45b5-b388-41cc8856934b\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 05 12:32:21.034694 master-0 kubenswrapper[8731]: I1205 12:32:21.034679 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89e6ecfe-a9da-45b5-b388-41cc8856934b-kube-api-access\") pod \"installer-2-master-0\" (UID: \"89e6ecfe-a9da-45b5-b388-41cc8856934b\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 05 12:32:21.034779 master-0 kubenswrapper[8731]: I1205 12:32:21.034748 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89e6ecfe-a9da-45b5-b388-41cc8856934b-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"89e6ecfe-a9da-45b5-b388-41cc8856934b\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 05 12:32:21.056571 master-0 kubenswrapper[8731]: I1205 12:32:21.056440 8731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89e6ecfe-a9da-45b5-b388-41cc8856934b-kube-api-access\") pod \"installer-2-master-0\" (UID: \"89e6ecfe-a9da-45b5-b388-41cc8856934b\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 05 12:32:21.132989 master-0 kubenswrapper[8731]: I1205 12:32:21.132902 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Dec 05 12:32:21.250752 master-0 kubenswrapper[8731]: I1205 12:32:21.250101 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert\") pod \"route-controller-manager-5c76cbf655-294dc\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:21.250752 master-0 kubenswrapper[8731]: E1205 12:32:21.250367 8731 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:32:21.251147 master-0 kubenswrapper[8731]: E1205 12:32:21.250828 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert podName:79ccdd97-da60-4ac7-a640-c640c45648f7 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:29.250797918 +0000 UTC m=+47.554782085 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert") pod "route-controller-manager-5c76cbf655-294dc" (UID: "79ccdd97-da60-4ac7-a640-c640c45648f7") : secret "serving-cert" not found Dec 05 12:32:21.293501 master-0 kubenswrapper[8731]: I1205 12:32:21.293334 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:21.490319 master-0 kubenswrapper[8731]: I1205 12:32:21.490232 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 05 12:32:23.302957 master-0 kubenswrapper[8731]: I1205 12:32:23.302392 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"89e6ecfe-a9da-45b5-b388-41cc8856934b","Type":"ContainerStarted","Data":"25e71d5eddcafcd9829a5c66c2ef26fe2e32162f5afff5293fd0a08750acdc93"} Dec 05 12:32:24.310315 master-0 kubenswrapper[8731]: I1205 12:32:24.309855 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"89e6ecfe-a9da-45b5-b388-41cc8856934b","Type":"ContainerStarted","Data":"3bb4a9c911fb9f3218850f905848b2ec7f252465ca2760ba76b95367c063b67a"} Dec 05 12:32:24.312959 master-0 kubenswrapper[8731]: I1205 12:32:24.312696 8731 generic.go:334] "Generic (PLEG): container finished" podID="f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb" containerID="c9cbf8e5df58cf6c6aff3967b76368b2b683cdb47115f76abdee2db7c46ae76b" exitCode=0 Dec 05 12:32:24.312959 master-0 kubenswrapper[8731]: I1205 12:32:24.312768 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" event={"ID":"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb","Type":"ContainerDied","Data":"c9cbf8e5df58cf6c6aff3967b76368b2b683cdb47115f76abdee2db7c46ae76b"} Dec 05 12:32:24.326980 master-0 kubenswrapper[8731]: I1205 12:32:24.326903 
8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=4.32687822 podStartE2EDuration="4.32687822s" podCreationTimestamp="2025-12-05 12:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:32:24.325870464 +0000 UTC m=+42.629854641" watchObservedRunningTime="2025-12-05 12:32:24.32687822 +0000 UTC m=+42.630862387" Dec 05 12:32:24.344556 master-0 kubenswrapper[8731]: I1205 12:32:24.344321 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:32:25.319167 master-0 kubenswrapper[8731]: I1205 12:32:25.319065 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" event={"ID":"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb","Type":"ContainerStarted","Data":"47993b0f5c02b8432a4bbdcf73db57ba7e46c6e4e750f5d8d873140e16f0fa9e"} Dec 05 12:32:25.319167 master-0 kubenswrapper[8731]: I1205 12:32:25.319166 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" event={"ID":"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb","Type":"ContainerStarted","Data":"943143ef3973188af4783dcf40be99b719a3294d28a086cd4ae91e7bc36161f4"} Dec 05 12:32:27.142134 master-0 kubenswrapper[8731]: I1205 12:32:27.142051 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:27.151218 master-0 kubenswrapper[8731]: I1205 12:32:27.151151 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert\") pod \"controller-manager-58c47b4bcf-j2srw\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:27.191852 master-0 kubenswrapper[8731]: I1205 12:32:27.191788 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:27.840149 master-0 kubenswrapper[8731]: I1205 12:32:27.834434 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" podStartSLOduration=11.801351706 podStartE2EDuration="15.834408148s" podCreationTimestamp="2025-12-05 12:32:12 +0000 UTC" firstStartedPulling="2025-12-05 12:32:19.023536626 +0000 UTC m=+37.327520793" lastFinishedPulling="2025-12-05 12:32:23.056593058 +0000 UTC m=+41.360577235" observedRunningTime="2025-12-05 12:32:25.350367748 +0000 UTC m=+43.654351915" watchObservedRunningTime="2025-12-05 12:32:27.834408148 +0000 UTC m=+46.138392325" Dec 05 12:32:27.840149 master-0 kubenswrapper[8731]: I1205 12:32:27.835168 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58c47b4bcf-j2srw"] Dec 05 12:32:28.349239 master-0 kubenswrapper[8731]: I1205 12:32:28.349118 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" event={"ID":"aa67674b-53bd-45d9-a217-915ed52ff870","Type":"ContainerStarted","Data":"14479d42517f11fb838dbc9755cc660ffb3071d4a134877f141049d0dc8d4831"} Dec 05 12:32:28.740855 master-0 kubenswrapper[8731]: I1205 12:32:28.740773 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:32:28.791563 master-0 kubenswrapper[8731]: I1205 12:32:28.791468 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:28.791563 master-0 kubenswrapper[8731]: I1205 12:32:28.791564 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:29.273580 master-0 kubenswrapper[8731]: E1205 12:32:29.273048 8731 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 05 12:32:29.273960 master-0 kubenswrapper[8731]: E1205 12:32:29.273630 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert podName:79ccdd97-da60-4ac7-a640-c640c45648f7 nodeName:}" failed. No retries permitted until 2025-12-05 12:32:45.273599348 +0000 UTC m=+63.577583545 (durationBeforeRetry 16s). 
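The pod_startup_latency_tracker entries above report two numbers: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally excludes the time spent pulling images (lastFinishedPulling minus firstStartedPulling), which is why the two are equal for pods that pulled nothing (firstStartedPulling at the zero time). For the openshift-apiserver pod that works out to about 15.83s end to end and 11.80s SLO. A small check of that arithmetic, using the timestamps copied from the entry (the tracker works from monotonic readings, so the last digits differ slightly):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	// Timestamps from the "Observed pod startup duration" entry for
	// openshift-apiserver/apiserver-845d4454f8-kcq9s.
	created := parse("2025-12-05 12:32:12 +0000 UTC")
	firstPull := parse("2025-12-05 12:32:19.023536626 +0000 UTC")
	lastPull := parse("2025-12-05 12:32:23.056593058 +0000 UTC")
	running := parse("2025-12-05 12:32:27.834408148 +0000 UTC")

	e2e := running.Sub(created)
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println("podStartE2EDuration:", e2e) // 15.834408148s
	fmt.Println("podStartSLOduration:", slo) // ~11.8013517s
}
```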
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert") pod "route-controller-manager-5c76cbf655-294dc" (UID: "79ccdd97-da60-4ac7-a640-c640c45648f7") : secret "serving-cert" not found Dec 05 12:32:29.273960 master-0 kubenswrapper[8731]: I1205 12:32:29.272867 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert\") pod \"route-controller-manager-5c76cbf655-294dc\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:29.765484 master-0 kubenswrapper[8731]: I1205 12:32:29.765361 8731 patch_prober.go:28] interesting pod/apiserver-845d4454f8-kcq9s container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:32:29.765484 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:32:29.765484 master-0 kubenswrapper[8731]: [+]etcd ok Dec 05 12:32:29.765484 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:32:29.765484 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:32:29.765484 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:32:29.765484 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:32:29.765484 master-0 kubenswrapper[8731]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 05 12:32:29.765484 master-0 kubenswrapper[8731]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 05 12:32:29.765484 master-0 kubenswrapper[8731]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Dec 05 12:32:29.765484 master-0 kubenswrapper[8731]: [+]poststarthook/project.openshift.io-projectcache ok Dec 05 12:32:29.765484 master-0 kubenswrapper[8731]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 05 12:32:29.765484 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-startinformers ok Dec 05 12:32:29.765484 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 05 12:32:29.765484 master-0 kubenswrapper[8731]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 05 12:32:29.765484 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:32:29.767192 master-0 kubenswrapper[8731]: I1205 12:32:29.765472 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" podUID="f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:32:31.382449 master-0 kubenswrapper[8731]: I1205 12:32:31.382254 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58c47b4bcf-j2srw"] Dec 05 12:32:31.590164 master-0 kubenswrapper[8731]: I1205 12:32:31.589871 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc"] Dec 05 12:32:31.590622 master-0 kubenswrapper[8731]: E1205 12:32:31.590532 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" podUID="79ccdd97-da60-4ac7-a640-c640c45648f7" Dec 05 12:32:32.388707 master-0 kubenswrapper[8731]: I1205 12:32:32.388615 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:32.402645 master-0 kubenswrapper[8731]: I1205 12:32:32.402594 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:32.517675 master-0 kubenswrapper[8731]: I1205 12:32:32.517305 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79ccdd97-da60-4ac7-a640-c640c45648f7-config\") pod \"79ccdd97-da60-4ac7-a640-c640c45648f7\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " Dec 05 12:32:32.518020 master-0 kubenswrapper[8731]: I1205 12:32:32.517711 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79ccdd97-da60-4ac7-a640-c640c45648f7-client-ca\") pod \"79ccdd97-da60-4ac7-a640-c640c45648f7\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " Dec 05 12:32:32.518020 master-0 kubenswrapper[8731]: I1205 12:32:32.517774 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn7rl\" (UniqueName: \"kubernetes.io/projected/79ccdd97-da60-4ac7-a640-c640c45648f7-kube-api-access-jn7rl\") pod \"79ccdd97-da60-4ac7-a640-c640c45648f7\" (UID: \"79ccdd97-da60-4ac7-a640-c640c45648f7\") " Dec 05 12:32:32.518020 master-0 kubenswrapper[8731]: I1205 12:32:32.517867 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79ccdd97-da60-4ac7-a640-c640c45648f7-config" (OuterVolumeSpecName: "config") pod "79ccdd97-da60-4ac7-a640-c640c45648f7" (UID: "79ccdd97-da60-4ac7-a640-c640c45648f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:32:32.518198 master-0 kubenswrapper[8731]: I1205 12:32:32.518042 8731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79ccdd97-da60-4ac7-a640-c640c45648f7-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:32.518390 master-0 kubenswrapper[8731]: I1205 12:32:32.518335 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79ccdd97-da60-4ac7-a640-c640c45648f7-client-ca" (OuterVolumeSpecName: "client-ca") pod "79ccdd97-da60-4ac7-a640-c640c45648f7" (UID: "79ccdd97-da60-4ac7-a640-c640c45648f7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:32:32.525252 master-0 kubenswrapper[8731]: I1205 12:32:32.525149 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79ccdd97-da60-4ac7-a640-c640c45648f7-kube-api-access-jn7rl" (OuterVolumeSpecName: "kube-api-access-jn7rl") pod "79ccdd97-da60-4ac7-a640-c640c45648f7" (UID: "79ccdd97-da60-4ac7-a640-c640c45648f7"). InnerVolumeSpecName "kube-api-access-jn7rl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:32:32.618907 master-0 kubenswrapper[8731]: I1205 12:32:32.618831 8731 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/79ccdd97-da60-4ac7-a640-c640c45648f7-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:32.618907 master-0 kubenswrapper[8731]: I1205 12:32:32.618877 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn7rl\" (UniqueName: \"kubernetes.io/projected/79ccdd97-da60-4ac7-a640-c640c45648f7-kube-api-access-jn7rl\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:33.395671 master-0 kubenswrapper[8731]: I1205 12:32:33.395566 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc" Dec 05 12:32:33.397006 master-0 kubenswrapper[8731]: I1205 12:32:33.395573 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" event={"ID":"aa67674b-53bd-45d9-a217-915ed52ff870","Type":"ContainerStarted","Data":"60001bc0404d8de44828ec5e4b2c53caec988534f77ff46ee3c94718546080a9"} Dec 05 12:32:33.397006 master-0 kubenswrapper[8731]: I1205 12:32:33.396014 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" podUID="aa67674b-53bd-45d9-a217-915ed52ff870" containerName="controller-manager" containerID="cri-o://60001bc0404d8de44828ec5e4b2c53caec988534f77ff46ee3c94718546080a9" gracePeriod=30 Dec 05 12:32:37.193374 master-0 kubenswrapper[8731]: I1205 12:32:37.192771 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:37.200589 master-0 kubenswrapper[8731]: I1205 12:32:37.200461 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:37.650168 master-0 kubenswrapper[8731]: I1205 12:32:37.649970 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:37.663060 master-0 kubenswrapper[8731]: I1205 12:32:37.663006 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:32:37.667772 master-0 kubenswrapper[8731]: I1205 12:32:37.667664 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 05 12:32:37.668113 master-0 kubenswrapper[8731]: I1205 12:32:37.668021 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-2-master-0" podUID="89e6ecfe-a9da-45b5-b388-41cc8856934b" containerName="installer" containerID="cri-o://3bb4a9c911fb9f3218850f905848b2ec7f252465ca2760ba76b95367c063b67a" gracePeriod=30 Dec 05 12:32:38.433175 master-0 kubenswrapper[8731]: I1205 12:32:38.432068 8731 generic.go:334] "Generic (PLEG): container finished" podID="aa67674b-53bd-45d9-a217-915ed52ff870" containerID="60001bc0404d8de44828ec5e4b2c53caec988534f77ff46ee3c94718546080a9" exitCode=0 Dec 05 12:32:38.433175 master-0 kubenswrapper[8731]: I1205 12:32:38.433080 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" 
event={"ID":"aa67674b-53bd-45d9-a217-915ed52ff870","Type":"ContainerDied","Data":"60001bc0404d8de44828ec5e4b2c53caec988534f77ff46ee3c94718546080a9"} Dec 05 12:32:40.097494 master-0 kubenswrapper[8731]: I1205 12:32:40.097358 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:40.231081 master-0 kubenswrapper[8731]: I1205 12:32:40.230996 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert\") pod \"aa67674b-53bd-45d9-a217-915ed52ff870\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " Dec 05 12:32:40.231081 master-0 kubenswrapper[8731]: I1205 12:32:40.231096 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-proxy-ca-bundles\") pod \"aa67674b-53bd-45d9-a217-915ed52ff870\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " Dec 05 12:32:40.231654 master-0 kubenswrapper[8731]: I1205 12:32:40.231148 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-client-ca\") pod \"aa67674b-53bd-45d9-a217-915ed52ff870\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " Dec 05 12:32:40.231654 master-0 kubenswrapper[8731]: I1205 12:32:40.231208 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-config\") pod \"aa67674b-53bd-45d9-a217-915ed52ff870\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " Dec 05 12:32:40.231654 master-0 kubenswrapper[8731]: I1205 12:32:40.231239 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85qjz\" (UniqueName: \"kubernetes.io/projected/aa67674b-53bd-45d9-a217-915ed52ff870-kube-api-access-85qjz\") pod \"aa67674b-53bd-45d9-a217-915ed52ff870\" (UID: \"aa67674b-53bd-45d9-a217-915ed52ff870\") " Dec 05 12:32:40.232555 master-0 kubenswrapper[8731]: I1205 12:32:40.232437 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-client-ca" (OuterVolumeSpecName: "client-ca") pod "aa67674b-53bd-45d9-a217-915ed52ff870" (UID: "aa67674b-53bd-45d9-a217-915ed52ff870"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:32:40.239447 master-0 kubenswrapper[8731]: I1205 12:32:40.232914 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-config" (OuterVolumeSpecName: "config") pod "aa67674b-53bd-45d9-a217-915ed52ff870" (UID: "aa67674b-53bd-45d9-a217-915ed52ff870"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:32:40.239447 master-0 kubenswrapper[8731]: I1205 12:32:40.232997 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "aa67674b-53bd-45d9-a217-915ed52ff870" (UID: "aa67674b-53bd-45d9-a217-915ed52ff870"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:32:40.241972 master-0 kubenswrapper[8731]: I1205 12:32:40.241443 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "aa67674b-53bd-45d9-a217-915ed52ff870" (UID: "aa67674b-53bd-45d9-a217-915ed52ff870"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:32:40.243237 master-0 kubenswrapper[8731]: I1205 12:32:40.243162 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa67674b-53bd-45d9-a217-915ed52ff870-kube-api-access-85qjz" (OuterVolumeSpecName: "kube-api-access-85qjz") pod "aa67674b-53bd-45d9-a217-915ed52ff870" (UID: "aa67674b-53bd-45d9-a217-915ed52ff870"). InnerVolumeSpecName "kube-api-access-85qjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:32:40.345034 master-0 kubenswrapper[8731]: I1205 12:32:40.332903 8731 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:40.345034 master-0 kubenswrapper[8731]: I1205 12:32:40.332954 8731 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:40.345034 master-0 kubenswrapper[8731]: I1205 12:32:40.332964 8731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa67674b-53bd-45d9-a217-915ed52ff870-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:40.345034 master-0 kubenswrapper[8731]: I1205 12:32:40.332974 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85qjz\" (UniqueName: \"kubernetes.io/projected/aa67674b-53bd-45d9-a217-915ed52ff870-kube-api-access-85qjz\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:40.345034 master-0 kubenswrapper[8731]: I1205 12:32:40.332988 8731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa67674b-53bd-45d9-a217-915ed52ff870-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:40.433678 master-0 kubenswrapper[8731]: I1205 12:32:40.433489 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl"] Dec 05 12:32:40.433938 master-0 kubenswrapper[8731]: E1205 12:32:40.433767 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa67674b-53bd-45d9-a217-915ed52ff870" containerName="controller-manager" Dec 05 12:32:40.433938 master-0 kubenswrapper[8731]: I1205 12:32:40.433786 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa67674b-53bd-45d9-a217-915ed52ff870" containerName="controller-manager" Dec 05 12:32:40.433938 master-0 kubenswrapper[8731]: I1205 12:32:40.433877 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa67674b-53bd-45d9-a217-915ed52ff870" containerName="controller-manager" Dec 05 12:32:40.434434 master-0 kubenswrapper[8731]: I1205 12:32:40.434404 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:32:40.439636 master-0 kubenswrapper[8731]: I1205 12:32:40.439583 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc"] Dec 05 12:32:40.440086 master-0 kubenswrapper[8731]: I1205 12:32:40.440060 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 12:32:40.440371 master-0 kubenswrapper[8731]: I1205 12:32:40.440333 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 12:32:40.441261 master-0 kubenswrapper[8731]: I1205 12:32:40.440893 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 12:32:40.441261 master-0 kubenswrapper[8731]: I1205 12:32:40.440954 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 12:32:40.441261 master-0 kubenswrapper[8731]: I1205 12:32:40.440905 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 12:32:40.447103 master-0 kubenswrapper[8731]: I1205 12:32:40.447030 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_89e6ecfe-a9da-45b5-b388-41cc8856934b/installer/0.log" Dec 05 12:32:40.447103 master-0 kubenswrapper[8731]: I1205 12:32:40.447084 8731 generic.go:334] "Generic (PLEG): container finished" podID="89e6ecfe-a9da-45b5-b388-41cc8856934b" containerID="3bb4a9c911fb9f3218850f905848b2ec7f252465ca2760ba76b95367c063b67a" exitCode=1 Dec 05 12:32:40.447307 master-0 kubenswrapper[8731]: I1205 12:32:40.447224 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"89e6ecfe-a9da-45b5-b388-41cc8856934b","Type":"ContainerDied","Data":"3bb4a9c911fb9f3218850f905848b2ec7f252465ca2760ba76b95367c063b67a"} Dec 05 12:32:40.449740 master-0 kubenswrapper[8731]: I1205 12:32:40.449034 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" event={"ID":"aa67674b-53bd-45d9-a217-915ed52ff870","Type":"ContainerDied","Data":"14479d42517f11fb838dbc9755cc660ffb3071d4a134877f141049d0dc8d4831"} Dec 05 12:32:40.449740 master-0 kubenswrapper[8731]: I1205 12:32:40.449144 8731 scope.go:117] "RemoveContainer" containerID="60001bc0404d8de44828ec5e4b2c53caec988534f77ff46ee3c94718546080a9" Dec 05 12:32:40.449740 master-0 kubenswrapper[8731]: I1205 12:32:40.449166 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" Dec 05 12:32:40.463138 master-0 kubenswrapper[8731]: I1205 12:32:40.463087 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl"] Dec 05 12:32:40.535917 master-0 kubenswrapper[8731]: I1205 12:32:40.535845 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-config\") pod \"route-controller-manager-858598fd98-5xkcl\" (UID: \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\") " pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:32:40.535917 master-0 kubenswrapper[8731]: I1205 12:32:40.535910 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-client-ca\") pod \"route-controller-manager-858598fd98-5xkcl\" (UID: \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\") " pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:32:40.536221 master-0 kubenswrapper[8731]: I1205 12:32:40.536074 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-serving-cert\") pod \"route-controller-manager-858598fd98-5xkcl\" (UID: \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\") " pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:32:40.536221 master-0 kubenswrapper[8731]: I1205 12:32:40.536141 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8p45\" (UniqueName: \"kubernetes.io/projected/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-kube-api-access-t8p45\") pod \"route-controller-manager-858598fd98-5xkcl\" (UID: \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\") " pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:32:40.637075 master-0 kubenswrapper[8731]: I1205 12:32:40.637002 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-client-ca\") pod \"route-controller-manager-858598fd98-5xkcl\" (UID: \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\") " pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:32:40.637328 master-0 kubenswrapper[8731]: I1205 12:32:40.637109 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-serving-cert\") pod \"route-controller-manager-858598fd98-5xkcl\" (UID: \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\") " pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:32:40.637328 master-0 kubenswrapper[8731]: I1205 12:32:40.637216 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8p45\" (UniqueName: \"kubernetes.io/projected/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-kube-api-access-t8p45\") pod \"route-controller-manager-858598fd98-5xkcl\" (UID: \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\") " pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:32:40.637518 
master-0 kubenswrapper[8731]: I1205 12:32:40.637489 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-config\") pod \"route-controller-manager-858598fd98-5xkcl\" (UID: \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\") " pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:32:40.638645 master-0 kubenswrapper[8731]: I1205 12:32:40.638591 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-client-ca\") pod \"route-controller-manager-858598fd98-5xkcl\" (UID: \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\") " pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:32:40.638728 master-0 kubenswrapper[8731]: I1205 12:32:40.638674 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-config\") pod \"route-controller-manager-858598fd98-5xkcl\" (UID: \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\") " pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:32:40.646602 master-0 kubenswrapper[8731]: I1205 12:32:40.646542 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-serving-cert\") pod \"route-controller-manager-858598fd98-5xkcl\" (UID: \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\") " pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:32:40.782352 master-0 kubenswrapper[8731]: I1205 12:32:40.774246 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c76cbf655-294dc"] Dec 05 12:32:40.782352 master-0 kubenswrapper[8731]: I1205 12:32:40.775902 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 05 12:32:40.782352 master-0 kubenswrapper[8731]: I1205 12:32:40.776822 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Dec 05 12:32:40.782352 master-0 kubenswrapper[8731]: I1205 12:32:40.778553 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 05 12:32:40.786665 master-0 kubenswrapper[8731]: I1205 12:32:40.783786 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8p45\" (UniqueName: \"kubernetes.io/projected/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-kube-api-access-t8p45\") pod \"route-controller-manager-858598fd98-5xkcl\" (UID: \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\") " pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:32:40.800218 master-0 kubenswrapper[8731]: I1205 12:32:40.794069 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv"] Dec 05 12:32:40.800218 master-0 kubenswrapper[8731]: I1205 12:32:40.795308 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:40.800218 master-0 kubenswrapper[8731]: I1205 12:32:40.797368 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 12:32:40.800218 master-0 kubenswrapper[8731]: I1205 12:32:40.799087 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 12:32:40.800218 master-0 kubenswrapper[8731]: I1205 12:32:40.799445 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 12:32:40.800218 master-0 kubenswrapper[8731]: I1205 12:32:40.799589 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 12:32:40.800218 master-0 kubenswrapper[8731]: I1205 12:32:40.799742 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 12:32:40.800218 master-0 kubenswrapper[8731]: I1205 12:32:40.800078 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 12:32:40.801925 master-0 kubenswrapper[8731]: I1205 12:32:40.801252 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 12:32:40.801925 master-0 kubenswrapper[8731]: I1205 12:32:40.801430 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 12:32:40.928484 master-0 kubenswrapper[8731]: I1205 12:32:40.928435 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_89e6ecfe-a9da-45b5-b388-41cc8856934b/installer/0.log" Dec 05 12:32:40.928580 master-0 kubenswrapper[8731]: I1205 12:32:40.928520 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Dec 05 12:32:40.940636 master-0 kubenswrapper[8731]: I1205 12:32:40.940598 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-encryption-config\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:40.940718 master-0 kubenswrapper[8731]: I1205 12:32:40.940687 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa1512be-895a-47e0-abf5-0155c71500e3-kube-api-access\") pod \"installer-3-master-0\" (UID: \"fa1512be-895a-47e0-abf5-0155c71500e3\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 05 12:32:40.940775 master-0 kubenswrapper[8731]: I1205 12:32:40.940761 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-etcd-client\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:40.940825 master-0 kubenswrapper[8731]: I1205 12:32:40.940801 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjp62\" (UniqueName: \"kubernetes.io/projected/d72b2b71-27b2-4aff-bf69-7054a9556318-kube-api-access-wjp62\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:40.940873 master-0 kubenswrapper[8731]: I1205 12:32:40.940828 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-audit-policies\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:40.940873 master-0 kubenswrapper[8731]: I1205 12:32:40.940844 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-serving-cert\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:40.941121 master-0 kubenswrapper[8731]: I1205 12:32:40.941063 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa1512be-895a-47e0-abf5-0155c71500e3-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"fa1512be-895a-47e0-abf5-0155c71500e3\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 05 12:32:40.941206 master-0 kubenswrapper[8731]: I1205 12:32:40.941158 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-trusted-ca-bundle\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:40.941324 master-0 
kubenswrapper[8731]: I1205 12:32:40.941286 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-etcd-serving-ca\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:40.941373 master-0 kubenswrapper[8731]: I1205 12:32:40.941339 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fa1512be-895a-47e0-abf5-0155c71500e3-var-lock\") pod \"installer-3-master-0\" (UID: \"fa1512be-895a-47e0-abf5-0155c71500e3\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 05 12:32:40.941455 master-0 kubenswrapper[8731]: I1205 12:32:40.941430 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d72b2b71-27b2-4aff-bf69-7054a9556318-audit-dir\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:40.941587 master-0 kubenswrapper[8731]: I1205 12:32:40.941551 8731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79ccdd97-da60-4ac7-a640-c640c45648f7-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:41.042524 master-0 kubenswrapper[8731]: I1205 12:32:41.042285 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89e6ecfe-a9da-45b5-b388-41cc8856934b-kubelet-dir\") pod \"89e6ecfe-a9da-45b5-b388-41cc8856934b\" (UID: \"89e6ecfe-a9da-45b5-b388-41cc8856934b\") " Dec 05 12:32:41.042524 master-0 kubenswrapper[8731]: I1205 12:32:41.042437 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89e6ecfe-a9da-45b5-b388-41cc8856934b-kube-api-access\") pod \"89e6ecfe-a9da-45b5-b388-41cc8856934b\" (UID: \"89e6ecfe-a9da-45b5-b388-41cc8856934b\") " Dec 05 12:32:41.042524 master-0 kubenswrapper[8731]: I1205 12:32:41.042435 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89e6ecfe-a9da-45b5-b388-41cc8856934b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "89e6ecfe-a9da-45b5-b388-41cc8856934b" (UID: "89e6ecfe-a9da-45b5-b388-41cc8856934b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:32:41.042524 master-0 kubenswrapper[8731]: I1205 12:32:41.042515 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/89e6ecfe-a9da-45b5-b388-41cc8856934b-var-lock\") pod \"89e6ecfe-a9da-45b5-b388-41cc8856934b\" (UID: \"89e6ecfe-a9da-45b5-b388-41cc8856934b\") " Dec 05 12:32:41.043301 master-0 kubenswrapper[8731]: I1205 12:32:41.042742 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-etcd-client\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.043301 master-0 kubenswrapper[8731]: I1205 12:32:41.042764 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89e6ecfe-a9da-45b5-b388-41cc8856934b-var-lock" (OuterVolumeSpecName: "var-lock") pod "89e6ecfe-a9da-45b5-b388-41cc8856934b" (UID: "89e6ecfe-a9da-45b5-b388-41cc8856934b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:32:41.043301 master-0 kubenswrapper[8731]: I1205 12:32:41.042803 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjp62\" (UniqueName: \"kubernetes.io/projected/d72b2b71-27b2-4aff-bf69-7054a9556318-kube-api-access-wjp62\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.044376 master-0 kubenswrapper[8731]: I1205 12:32:41.043287 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-audit-policies\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.044376 master-0 kubenswrapper[8731]: I1205 12:32:41.043422 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-serving-cert\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.044376 master-0 kubenswrapper[8731]: I1205 12:32:41.043606 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa1512be-895a-47e0-abf5-0155c71500e3-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"fa1512be-895a-47e0-abf5-0155c71500e3\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 05 12:32:41.044376 master-0 kubenswrapper[8731]: I1205 12:32:41.043679 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-trusted-ca-bundle\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.044376 master-0 kubenswrapper[8731]: I1205 12:32:41.043774 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-etcd-serving-ca\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.044376 master-0 kubenswrapper[8731]: I1205 12:32:41.043826 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fa1512be-895a-47e0-abf5-0155c71500e3-var-lock\") pod \"installer-3-master-0\" (UID: \"fa1512be-895a-47e0-abf5-0155c71500e3\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 05 12:32:41.044376 master-0 kubenswrapper[8731]: I1205 12:32:41.043933 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d72b2b71-27b2-4aff-bf69-7054a9556318-audit-dir\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.044376 master-0 kubenswrapper[8731]: I1205 12:32:41.044045 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-encryption-config\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.044376 master-0 kubenswrapper[8731]: I1205 12:32:41.044083 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa1512be-895a-47e0-abf5-0155c71500e3-kube-api-access\") pod \"installer-3-master-0\" (UID: \"fa1512be-895a-47e0-abf5-0155c71500e3\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 05 12:32:41.044376 master-0 kubenswrapper[8731]: I1205 12:32:41.044216 8731 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/89e6ecfe-a9da-45b5-b388-41cc8856934b-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:41.044376 master-0 kubenswrapper[8731]: I1205 12:32:41.044251 8731 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/89e6ecfe-a9da-45b5-b388-41cc8856934b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:41.044376 master-0 kubenswrapper[8731]: I1205 12:32:41.044300 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d72b2b71-27b2-4aff-bf69-7054a9556318-audit-dir\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.044376 master-0 kubenswrapper[8731]: I1205 12:32:41.044365 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fa1512be-895a-47e0-abf5-0155c71500e3-var-lock\") pod \"installer-3-master-0\" (UID: \"fa1512be-895a-47e0-abf5-0155c71500e3\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 05 12:32:41.045640 master-0 kubenswrapper[8731]: I1205 12:32:41.044435 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa1512be-895a-47e0-abf5-0155c71500e3-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"fa1512be-895a-47e0-abf5-0155c71500e3\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 05 
12:32:41.045640 master-0 kubenswrapper[8731]: I1205 12:32:41.044930 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-audit-policies\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.045640 master-0 kubenswrapper[8731]: I1205 12:32:41.045140 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-trusted-ca-bundle\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.045946 master-0 kubenswrapper[8731]: I1205 12:32:41.045781 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-etcd-serving-ca\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.046597 master-0 kubenswrapper[8731]: I1205 12:32:41.046539 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-serving-cert\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.049460 master-0 kubenswrapper[8731]: I1205 12:32:41.046762 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89e6ecfe-a9da-45b5-b388-41cc8856934b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "89e6ecfe-a9da-45b5-b388-41cc8856934b" (UID: "89e6ecfe-a9da-45b5-b388-41cc8856934b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:32:41.049460 master-0 kubenswrapper[8731]: I1205 12:32:41.048692 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-etcd-client\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.072281 master-0 kubenswrapper[8731]: I1205 12:32:41.068839 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-encryption-config\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.072281 master-0 kubenswrapper[8731]: I1205 12:32:41.070598 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:32:41.155362 master-0 kubenswrapper[8731]: I1205 12:32:41.143687 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv"] Dec 05 12:32:41.155362 master-0 kubenswrapper[8731]: I1205 12:32:41.148994 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89e6ecfe-a9da-45b5-b388-41cc8856934b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:41.177256 master-0 kubenswrapper[8731]: I1205 12:32:41.172980 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjp62\" (UniqueName: \"kubernetes.io/projected/d72b2b71-27b2-4aff-bf69-7054a9556318-kube-api-access-wjp62\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.177256 master-0 kubenswrapper[8731]: I1205 12:32:41.173759 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa1512be-895a-47e0-abf5-0155c71500e3-kube-api-access\") pod \"installer-3-master-0\" (UID: \"fa1512be-895a-47e0-abf5-0155c71500e3\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 05 12:32:41.229245 master-0 kubenswrapper[8731]: I1205 12:32:41.228462 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-58c47b4bcf-j2srw" podStartSLOduration=26.64076402 podStartE2EDuration="31.228443864s" podCreationTimestamp="2025-12-05 12:32:10 +0000 UTC" firstStartedPulling="2025-12-05 12:32:27.85315016 +0000 UTC m=+46.157134327" lastFinishedPulling="2025-12-05 12:32:32.440830004 +0000 UTC m=+50.744814171" observedRunningTime="2025-12-05 12:32:41.144083059 +0000 UTC m=+59.448067226" watchObservedRunningTime="2025-12-05 12:32:41.228443864 +0000 UTC m=+59.532428031" Dec 05 12:32:41.438000 master-0 kubenswrapper[8731]: I1205 12:32:41.437485 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Dec 05 12:32:41.458562 master-0 kubenswrapper[8731]: I1205 12:32:41.458347 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:41.464695 master-0 kubenswrapper[8731]: I1205 12:32:41.464631 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_89e6ecfe-a9da-45b5-b388-41cc8856934b/installer/0.log" Dec 05 12:32:41.464793 master-0 kubenswrapper[8731]: I1205 12:32:41.464725 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"89e6ecfe-a9da-45b5-b388-41cc8856934b","Type":"ContainerDied","Data":"25e71d5eddcafcd9829a5c66c2ef26fe2e32162f5afff5293fd0a08750acdc93"} Dec 05 12:32:41.464793 master-0 kubenswrapper[8731]: I1205 12:32:41.464779 8731 scope.go:117] "RemoveContainer" containerID="3bb4a9c911fb9f3218850f905848b2ec7f252465ca2760ba76b95367c063b67a" Dec 05 12:32:41.464927 master-0 kubenswrapper[8731]: I1205 12:32:41.464845 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Dec 05 12:32:41.944219 master-0 kubenswrapper[8731]: I1205 12:32:41.944135 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79ccdd97-da60-4ac7-a640-c640c45648f7" path="/var/lib/kubelet/pods/79ccdd97-da60-4ac7-a640-c640c45648f7/volumes" Dec 05 12:32:42.105968 master-0 kubenswrapper[8731]: I1205 12:32:42.105908 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv"] Dec 05 12:32:42.105968 master-0 kubenswrapper[8731]: I1205 12:32:42.105974 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 05 12:32:42.121239 master-0 kubenswrapper[8731]: I1205 12:32:42.119373 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl"] Dec 05 12:32:42.201195 master-0 kubenswrapper[8731]: I1205 12:32:42.197287 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58c47b4bcf-j2srw"] Dec 05 12:32:42.230214 master-0 kubenswrapper[8731]: I1205 12:32:42.213259 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-58c47b4bcf-j2srw"] Dec 05 12:32:42.238689 master-0 kubenswrapper[8731]: I1205 12:32:42.238323 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 05 12:32:42.246650 master-0 kubenswrapper[8731]: I1205 12:32:42.244645 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 05 12:32:42.490083 master-0 kubenswrapper[8731]: I1205 12:32:42.489999 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" event={"ID":"d72b2b71-27b2-4aff-bf69-7054a9556318","Type":"ContainerStarted","Data":"9abf289d98169b2aa959495298e72df522e02a710723a8c85b99355af8b7eae3"} Dec 05 12:32:42.491593 master-0 kubenswrapper[8731]: I1205 12:32:42.491531 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" event={"ID":"bb7dd3e9-5a59-4741-970e-aa41c4e078cc","Type":"ContainerStarted","Data":"5c4af08f057f648c818372db0ec480f0be2db13e0c4e8fe00fc9f59a56ca06ec"} Dec 05 12:32:42.493663 master-0 kubenswrapper[8731]: I1205 12:32:42.493608 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"fa1512be-895a-47e0-abf5-0155c71500e3","Type":"ContainerStarted","Data":"a0700728063122d7318a3238bf1b7f099537df8a3022348c540ee8f3798feac2"} Dec 05 12:32:43.203993 master-0 kubenswrapper[8731]: I1205 12:32:43.203551 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_26b7da93-bb3a-48c9-a2dc-d91c73db5578/installer/0.log" Dec 05 12:32:43.203993 master-0 kubenswrapper[8731]: I1205 12:32:43.203993 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Dec 05 12:32:43.259241 master-0 kubenswrapper[8731]: I1205 12:32:43.259149 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 05 12:32:43.288384 master-0 kubenswrapper[8731]: I1205 12:32:43.288280 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/26b7da93-bb3a-48c9-a2dc-d91c73db5578-var-lock\") pod \"26b7da93-bb3a-48c9-a2dc-d91c73db5578\" (UID: \"26b7da93-bb3a-48c9-a2dc-d91c73db5578\") " Dec 05 12:32:43.288384 master-0 kubenswrapper[8731]: I1205 12:32:43.288386 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26b7da93-bb3a-48c9-a2dc-d91c73db5578-kube-api-access\") pod \"26b7da93-bb3a-48c9-a2dc-d91c73db5578\" (UID: \"26b7da93-bb3a-48c9-a2dc-d91c73db5578\") " Dec 05 12:32:43.288692 master-0 kubenswrapper[8731]: I1205 12:32:43.288437 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/26b7da93-bb3a-48c9-a2dc-d91c73db5578-kubelet-dir\") pod \"26b7da93-bb3a-48c9-a2dc-d91c73db5578\" (UID: \"26b7da93-bb3a-48c9-a2dc-d91c73db5578\") " Dec 05 12:32:43.289343 master-0 kubenswrapper[8731]: I1205 12:32:43.289308 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26b7da93-bb3a-48c9-a2dc-d91c73db5578-var-lock" (OuterVolumeSpecName: "var-lock") pod "26b7da93-bb3a-48c9-a2dc-d91c73db5578" (UID: "26b7da93-bb3a-48c9-a2dc-d91c73db5578"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:32:43.289499 master-0 kubenswrapper[8731]: I1205 12:32:43.289394 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26b7da93-bb3a-48c9-a2dc-d91c73db5578-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "26b7da93-bb3a-48c9-a2dc-d91c73db5578" (UID: "26b7da93-bb3a-48c9-a2dc-d91c73db5578"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:32:43.289855 master-0 kubenswrapper[8731]: I1205 12:32:43.289818 8731 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/26b7da93-bb3a-48c9-a2dc-d91c73db5578-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:43.289890 master-0 kubenswrapper[8731]: I1205 12:32:43.289854 8731 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/26b7da93-bb3a-48c9-a2dc-d91c73db5578-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:43.297490 master-0 kubenswrapper[8731]: I1205 12:32:43.297433 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26b7da93-bb3a-48c9-a2dc-d91c73db5578-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "26b7da93-bb3a-48c9-a2dc-d91c73db5578" (UID: "26b7da93-bb3a-48c9-a2dc-d91c73db5578"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:32:43.391411 master-0 kubenswrapper[8731]: I1205 12:32:43.391349 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/26b7da93-bb3a-48c9-a2dc-d91c73db5578-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:43.478378 master-0 kubenswrapper[8731]: I1205 12:32:43.475436 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Dec 05 12:32:43.478378 master-0 kubenswrapper[8731]: E1205 12:32:43.475653 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89e6ecfe-a9da-45b5-b388-41cc8856934b" containerName="installer" Dec 05 12:32:43.478378 master-0 kubenswrapper[8731]: I1205 12:32:43.475668 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="89e6ecfe-a9da-45b5-b388-41cc8856934b" containerName="installer" Dec 05 12:32:43.478378 master-0 kubenswrapper[8731]: E1205 12:32:43.475689 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b7da93-bb3a-48c9-a2dc-d91c73db5578" containerName="installer" Dec 05 12:32:43.478378 master-0 kubenswrapper[8731]: I1205 12:32:43.475695 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b7da93-bb3a-48c9-a2dc-d91c73db5578" containerName="installer" Dec 05 12:32:43.478378 master-0 kubenswrapper[8731]: I1205 12:32:43.475757 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="89e6ecfe-a9da-45b5-b388-41cc8856934b" containerName="installer" Dec 05 12:32:43.478378 master-0 kubenswrapper[8731]: I1205 12:32:43.475775 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="26b7da93-bb3a-48c9-a2dc-d91c73db5578" containerName="installer" Dec 05 12:32:43.478378 master-0 kubenswrapper[8731]: I1205 12:32:43.476134 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 05 12:32:43.483303 master-0 kubenswrapper[8731]: I1205 12:32:43.479455 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 12:32:43.501419 master-0 kubenswrapper[8731]: I1205 12:32:43.493657 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Dec 05 12:32:43.501419 master-0 kubenswrapper[8731]: I1205 12:32:43.494882 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d627fcf3-2a80-4739-add9-e21ad4efc6eb-kube-api-access\") pod \"installer-1-master-0\" (UID: \"d627fcf3-2a80-4739-add9-e21ad4efc6eb\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 05 12:32:43.501419 master-0 kubenswrapper[8731]: I1205 12:32:43.494957 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d627fcf3-2a80-4739-add9-e21ad4efc6eb-var-lock\") pod \"installer-1-master-0\" (UID: \"d627fcf3-2a80-4739-add9-e21ad4efc6eb\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 05 12:32:43.501419 master-0 kubenswrapper[8731]: I1205 12:32:43.494984 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d627fcf3-2a80-4739-add9-e21ad4efc6eb-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"d627fcf3-2a80-4739-add9-e21ad4efc6eb\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 05 12:32:43.508319 master-0 kubenswrapper[8731]: I1205 12:32:43.507427 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_26b7da93-bb3a-48c9-a2dc-d91c73db5578/installer/0.log" Dec 05 12:32:43.508319 master-0 kubenswrapper[8731]: I1205 12:32:43.507500 8731 generic.go:334] "Generic (PLEG): container finished" podID="26b7da93-bb3a-48c9-a2dc-d91c73db5578" containerID="a96e1efb63148942b97853dca9df4ffab2547d1232ce0445c574be5258ea9b25" exitCode=1 Dec 05 12:32:43.508319 master-0 kubenswrapper[8731]: I1205 12:32:43.507563 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Dec 05 12:32:43.508319 master-0 kubenswrapper[8731]: I1205 12:32:43.507603 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"26b7da93-bb3a-48c9-a2dc-d91c73db5578","Type":"ContainerDied","Data":"a96e1efb63148942b97853dca9df4ffab2547d1232ce0445c574be5258ea9b25"} Dec 05 12:32:43.508319 master-0 kubenswrapper[8731]: I1205 12:32:43.507699 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"26b7da93-bb3a-48c9-a2dc-d91c73db5578","Type":"ContainerDied","Data":"30ed2a2a29bc0515d570bdea00d443e316a56c4e00d683ca90b32cc9841c20a6"} Dec 05 12:32:43.508319 master-0 kubenswrapper[8731]: I1205 12:32:43.507730 8731 scope.go:117] "RemoveContainer" containerID="a96e1efb63148942b97853dca9df4ffab2547d1232ce0445c574be5258ea9b25" Dec 05 12:32:43.512130 master-0 kubenswrapper[8731]: I1205 12:32:43.511130 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"fa1512be-895a-47e0-abf5-0155c71500e3","Type":"ContainerStarted","Data":"110f5cbd6fb7d97032c4e064ec4baa2f0dbba15dc7401b2a692a82bbe3335c93"} Dec 05 12:32:43.527885 master-0 kubenswrapper[8731]: I1205 12:32:43.526655 8731 scope.go:117] "RemoveContainer" containerID="a96e1efb63148942b97853dca9df4ffab2547d1232ce0445c574be5258ea9b25" Dec 05 12:32:43.529212 master-0 kubenswrapper[8731]: E1205 12:32:43.529152 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a96e1efb63148942b97853dca9df4ffab2547d1232ce0445c574be5258ea9b25\": container with ID starting with a96e1efb63148942b97853dca9df4ffab2547d1232ce0445c574be5258ea9b25 not found: ID does not exist" containerID="a96e1efb63148942b97853dca9df4ffab2547d1232ce0445c574be5258ea9b25" Dec 05 12:32:43.529306 master-0 kubenswrapper[8731]: I1205 12:32:43.529215 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a96e1efb63148942b97853dca9df4ffab2547d1232ce0445c574be5258ea9b25"} err="failed to get container status \"a96e1efb63148942b97853dca9df4ffab2547d1232ce0445c574be5258ea9b25\": rpc error: code = NotFound desc = could not find container \"a96e1efb63148942b97853dca9df4ffab2547d1232ce0445c574be5258ea9b25\": container with ID starting with a96e1efb63148942b97853dca9df4ffab2547d1232ce0445c574be5258ea9b25 not found: ID does not exist" Dec 05 12:32:43.544338 master-0 kubenswrapper[8731]: I1205 12:32:43.544137 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=3.544086849 podStartE2EDuration="3.544086849s" podCreationTimestamp="2025-12-05 12:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:32:43.529983143 +0000 UTC m=+61.833967310" watchObservedRunningTime="2025-12-05 12:32:43.544086849 +0000 UTC m=+61.848071016" Dec 05 12:32:43.547098 master-0 kubenswrapper[8731]: I1205 12:32:43.545844 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 05 12:32:43.548990 master-0 kubenswrapper[8731]: I1205 12:32:43.548913 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 05 12:32:43.596000 master-0 kubenswrapper[8731]: I1205 
12:32:43.595930 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d627fcf3-2a80-4739-add9-e21ad4efc6eb-var-lock\") pod \"installer-1-master-0\" (UID: \"d627fcf3-2a80-4739-add9-e21ad4efc6eb\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 05 12:32:43.596377 master-0 kubenswrapper[8731]: I1205 12:32:43.596111 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d627fcf3-2a80-4739-add9-e21ad4efc6eb-var-lock\") pod \"installer-1-master-0\" (UID: \"d627fcf3-2a80-4739-add9-e21ad4efc6eb\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 05 12:32:43.596458 master-0 kubenswrapper[8731]: I1205 12:32:43.596399 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d627fcf3-2a80-4739-add9-e21ad4efc6eb-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"d627fcf3-2a80-4739-add9-e21ad4efc6eb\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 05 12:32:43.596513 master-0 kubenswrapper[8731]: I1205 12:32:43.596490 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d627fcf3-2a80-4739-add9-e21ad4efc6eb-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"d627fcf3-2a80-4739-add9-e21ad4efc6eb\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 05 12:32:43.596759 master-0 kubenswrapper[8731]: I1205 12:32:43.596730 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d627fcf3-2a80-4739-add9-e21ad4efc6eb-kube-api-access\") pod \"installer-1-master-0\" (UID: \"d627fcf3-2a80-4739-add9-e21ad4efc6eb\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 05 12:32:43.615847 master-0 kubenswrapper[8731]: I1205 12:32:43.615780 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d627fcf3-2a80-4739-add9-e21ad4efc6eb-kube-api-access\") pod \"installer-1-master-0\" (UID: \"d627fcf3-2a80-4739-add9-e21ad4efc6eb\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 05 12:32:43.806535 master-0 kubenswrapper[8731]: I1205 12:32:43.806387 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 05 12:32:43.943945 master-0 kubenswrapper[8731]: I1205 12:32:43.943875 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26b7da93-bb3a-48c9-a2dc-d91c73db5578" path="/var/lib/kubelet/pods/26b7da93-bb3a-48c9-a2dc-d91c73db5578/volumes" Dec 05 12:32:43.944507 master-0 kubenswrapper[8731]: I1205 12:32:43.944475 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89e6ecfe-a9da-45b5-b388-41cc8856934b" path="/var/lib/kubelet/pods/89e6ecfe-a9da-45b5-b388-41cc8856934b/volumes" Dec 05 12:32:43.945309 master-0 kubenswrapper[8731]: I1205 12:32:43.945277 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa67674b-53bd-45d9-a217-915ed52ff870" path="/var/lib/kubelet/pods/aa67674b-53bd-45d9-a217-915ed52ff870/volumes" Dec 05 12:32:44.517439 master-0 kubenswrapper[8731]: I1205 12:32:44.517333 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-3-master-0" podUID="fa1512be-895a-47e0-abf5-0155c71500e3" containerName="installer" containerID="cri-o://110f5cbd6fb7d97032c4e064ec4baa2f0dbba15dc7401b2a692a82bbe3335c93" gracePeriod=30 Dec 05 12:32:45.051623 master-0 kubenswrapper[8731]: I1205 12:32:45.049887 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Dec 05 12:32:45.076165 master-0 kubenswrapper[8731]: W1205 12:32:45.076091 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd627fcf3_2a80_4739_add9_e21ad4efc6eb.slice/crio-f05e889af7263b1ba07fc81648bcfe8b4d672794681d2558bab4fed4dcbd28ca WatchSource:0}: Error finding container f05e889af7263b1ba07fc81648bcfe8b4d672794681d2558bab4fed4dcbd28ca: Status 404 returned error can't find the container with id f05e889af7263b1ba07fc81648bcfe8b4d672794681d2558bab4fed4dcbd28ca Dec 05 12:32:45.465107 master-0 kubenswrapper[8731]: I1205 12:32:45.464682 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Dec 05 12:32:45.466016 master-0 kubenswrapper[8731]: I1205 12:32:45.465962 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 05 12:32:45.476233 master-0 kubenswrapper[8731]: I1205 12:32:45.475934 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Dec 05 12:32:45.523321 master-0 kubenswrapper[8731]: I1205 12:32:45.523222 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/076dafdf-a5d2-4e2d-9c38-6932910f7327-var-lock\") pod \"installer-4-master-0\" (UID: \"076dafdf-a5d2-4e2d-9c38-6932910f7327\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 05 12:32:45.524755 master-0 kubenswrapper[8731]: I1205 12:32:45.523599 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/076dafdf-a5d2-4e2d-9c38-6932910f7327-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"076dafdf-a5d2-4e2d-9c38-6932910f7327\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 05 12:32:45.524755 master-0 kubenswrapper[8731]: I1205 12:32:45.523706 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/076dafdf-a5d2-4e2d-9c38-6932910f7327-kube-api-access\") pod \"installer-4-master-0\" (UID: \"076dafdf-a5d2-4e2d-9c38-6932910f7327\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 05 12:32:45.525469 master-0 kubenswrapper[8731]: I1205 12:32:45.525395 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" event={"ID":"bb7dd3e9-5a59-4741-970e-aa41c4e078cc","Type":"ContainerStarted","Data":"d12e1c8bf264de03492186948f2fcb8fa30acf3e5c6ac0dd00637ed1e75cfa31"} Dec 05 12:32:45.526264 master-0 kubenswrapper[8731]: I1205 12:32:45.526207 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:32:45.529026 master-0 kubenswrapper[8731]: I1205 12:32:45.528966 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"d627fcf3-2a80-4739-add9-e21ad4efc6eb","Type":"ContainerStarted","Data":"8654b600b7307ea1bcd3fe84275fb56084c5722cbe5ccf524025cea2bfa3d8cd"} Dec 05 12:32:45.529026 master-0 kubenswrapper[8731]: I1205 12:32:45.529010 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"d627fcf3-2a80-4739-add9-e21ad4efc6eb","Type":"ContainerStarted","Data":"f05e889af7263b1ba07fc81648bcfe8b4d672794681d2558bab4fed4dcbd28ca"} Dec 05 12:32:45.531682 master-0 kubenswrapper[8731]: I1205 12:32:45.531601 8731 generic.go:334] "Generic (PLEG): container finished" podID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerID="836113a149a4eefb4c2ce8d65a7d2c1b43cd3294cab879526b98ff307bc6e81d" exitCode=0 Dec 05 12:32:45.531813 master-0 kubenswrapper[8731]: I1205 12:32:45.531682 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" event={"ID":"d72b2b71-27b2-4aff-bf69-7054a9556318","Type":"ContainerDied","Data":"836113a149a4eefb4c2ce8d65a7d2c1b43cd3294cab879526b98ff307bc6e81d"} Dec 05 12:32:45.533936 master-0 kubenswrapper[8731]: I1205 12:32:45.533863 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:32:45.546966 master-0 kubenswrapper[8731]: I1205 12:32:45.546892 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" podStartSLOduration=11.826844536 podStartE2EDuration="14.546873972s" podCreationTimestamp="2025-12-05 12:32:31 +0000 UTC" firstStartedPulling="2025-12-05 12:32:42.145315737 +0000 UTC m=+60.449299924" lastFinishedPulling="2025-12-05 12:32:44.865345193 +0000 UTC m=+63.169329360" observedRunningTime="2025-12-05 12:32:45.545036575 +0000 UTC m=+63.849020752" watchObservedRunningTime="2025-12-05 12:32:45.546873972 +0000 UTC m=+63.850858149" Dec 05 12:32:45.571402 master-0 kubenswrapper[8731]: I1205 12:32:45.571283 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=2.571253153 podStartE2EDuration="2.571253153s" podCreationTimestamp="2025-12-05 12:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:32:45.565430382 +0000 UTC m=+63.869414549" watchObservedRunningTime="2025-12-05 12:32:45.571253153 +0000 UTC m=+63.875237330" Dec 05 12:32:45.631213 master-0 kubenswrapper[8731]: I1205 12:32:45.627461 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/076dafdf-a5d2-4e2d-9c38-6932910f7327-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"076dafdf-a5d2-4e2d-9c38-6932910f7327\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 05 12:32:45.631213 master-0 kubenswrapper[8731]: I1205 12:32:45.627584 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/076dafdf-a5d2-4e2d-9c38-6932910f7327-kube-api-access\") pod \"installer-4-master-0\" (UID: \"076dafdf-a5d2-4e2d-9c38-6932910f7327\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 05 12:32:45.631213 master-0 kubenswrapper[8731]: I1205 12:32:45.627862 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/076dafdf-a5d2-4e2d-9c38-6932910f7327-var-lock\") pod \"installer-4-master-0\" (UID: \"076dafdf-a5d2-4e2d-9c38-6932910f7327\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 05 12:32:45.631213 master-0 kubenswrapper[8731]: I1205 12:32:45.629391 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/076dafdf-a5d2-4e2d-9c38-6932910f7327-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"076dafdf-a5d2-4e2d-9c38-6932910f7327\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 05 12:32:45.644211 master-0 kubenswrapper[8731]: I1205 12:32:45.633351 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/076dafdf-a5d2-4e2d-9c38-6932910f7327-var-lock\") pod \"installer-4-master-0\" (UID: \"076dafdf-a5d2-4e2d-9c38-6932910f7327\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 05 12:32:45.687206 master-0 kubenswrapper[8731]: I1205 12:32:45.686525 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/076dafdf-a5d2-4e2d-9c38-6932910f7327-kube-api-access\") pod 
\"installer-4-master-0\" (UID: \"076dafdf-a5d2-4e2d-9c38-6932910f7327\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 05 12:32:45.796093 master-0 kubenswrapper[8731]: I1205 12:32:45.796015 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 05 12:32:46.043655 master-0 kubenswrapper[8731]: I1205 12:32:46.041678 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Dec 05 12:32:46.415834 master-0 kubenswrapper[8731]: I1205 12:32:46.415764 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw"] Dec 05 12:32:46.416589 master-0 kubenswrapper[8731]: I1205 12:32:46.416532 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.419487 master-0 kubenswrapper[8731]: I1205 12:32:46.419412 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 12:32:46.421354 master-0 kubenswrapper[8731]: I1205 12:32:46.421293 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 12:32:46.421486 master-0 kubenswrapper[8731]: I1205 12:32:46.421470 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 12:32:46.421714 master-0 kubenswrapper[8731]: I1205 12:32:46.421688 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 12:32:46.422126 master-0 kubenswrapper[8731]: I1205 12:32:46.422074 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 12:32:46.430944 master-0 kubenswrapper[8731]: I1205 12:32:46.430884 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw"] Dec 05 12:32:46.431664 master-0 kubenswrapper[8731]: I1205 12:32:46.431596 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 12:32:46.439773 master-0 kubenswrapper[8731]: I1205 12:32:46.439699 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-proxy-ca-bundles\") pod \"controller-manager-5c4497cd6c-rg6xw\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.439902 master-0 kubenswrapper[8731]: I1205 12:32:46.439804 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-client-ca\") pod \"controller-manager-5c4497cd6c-rg6xw\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.439991 master-0 kubenswrapper[8731]: I1205 12:32:46.439954 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-config\") pod \"controller-manager-5c4497cd6c-rg6xw\" (UID: 
\"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.440231 master-0 kubenswrapper[8731]: I1205 12:32:46.440141 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-serving-cert\") pod \"controller-manager-5c4497cd6c-rg6xw\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.440341 master-0 kubenswrapper[8731]: I1205 12:32:46.440304 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzft9\" (UniqueName: \"kubernetes.io/projected/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-kube-api-access-vzft9\") pod \"controller-manager-5c4497cd6c-rg6xw\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.539084 master-0 kubenswrapper[8731]: I1205 12:32:46.539024 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" event={"ID":"d72b2b71-27b2-4aff-bf69-7054a9556318","Type":"ContainerStarted","Data":"bbc65050e19c8e05efbd98764627a92089e068c4fefa760a423cea0a25acab48"} Dec 05 12:32:46.541441 master-0 kubenswrapper[8731]: I1205 12:32:46.541388 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-proxy-ca-bundles\") pod \"controller-manager-5c4497cd6c-rg6xw\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.542674 master-0 kubenswrapper[8731]: I1205 12:32:46.541616 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-client-ca\") pod \"controller-manager-5c4497cd6c-rg6xw\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.542674 master-0 kubenswrapper[8731]: I1205 12:32:46.541710 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-config\") pod \"controller-manager-5c4497cd6c-rg6xw\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.542674 master-0 kubenswrapper[8731]: I1205 12:32:46.541793 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-serving-cert\") pod \"controller-manager-5c4497cd6c-rg6xw\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.542674 master-0 kubenswrapper[8731]: I1205 12:32:46.541829 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzft9\" (UniqueName: \"kubernetes.io/projected/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-kube-api-access-vzft9\") pod \"controller-manager-5c4497cd6c-rg6xw\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.542928 
master-0 kubenswrapper[8731]: I1205 12:32:46.542876 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-client-ca\") pod \"controller-manager-5c4497cd6c-rg6xw\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.543325 master-0 kubenswrapper[8731]: I1205 12:32:46.543222 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"076dafdf-a5d2-4e2d-9c38-6932910f7327","Type":"ContainerStarted","Data":"f8dc47e77bee6411ef3a450c0123b8279b91a4729700211ae01112ac79fa1d1e"} Dec 05 12:32:46.543395 master-0 kubenswrapper[8731]: I1205 12:32:46.543358 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"076dafdf-a5d2-4e2d-9c38-6932910f7327","Type":"ContainerStarted","Data":"26722ad2bd6e7ca8bda35211d0d46cd57e0c0ba5a29870576dae6f8264697434"} Dec 05 12:32:46.544003 master-0 kubenswrapper[8731]: I1205 12:32:46.543941 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-config\") pod \"controller-manager-5c4497cd6c-rg6xw\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.544079 master-0 kubenswrapper[8731]: I1205 12:32:46.543948 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-proxy-ca-bundles\") pod \"controller-manager-5c4497cd6c-rg6xw\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.553220 master-0 kubenswrapper[8731]: I1205 12:32:46.550405 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-serving-cert\") pod \"controller-manager-5c4497cd6c-rg6xw\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.565224 master-0 kubenswrapper[8731]: I1205 12:32:46.563772 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzft9\" (UniqueName: \"kubernetes.io/projected/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-kube-api-access-vzft9\") pod \"controller-manager-5c4497cd6c-rg6xw\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.590195 master-0 kubenswrapper[8731]: I1205 12:32:46.590059 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podStartSLOduration=3.862089024 podStartE2EDuration="6.590021285s" podCreationTimestamp="2025-12-05 12:32:40 +0000 UTC" firstStartedPulling="2025-12-05 12:32:42.142908225 +0000 UTC m=+60.446892392" lastFinishedPulling="2025-12-05 12:32:44.870840486 +0000 UTC m=+63.174824653" observedRunningTime="2025-12-05 12:32:46.568816226 +0000 UTC m=+64.872800423" watchObservedRunningTime="2025-12-05 12:32:46.590021285 +0000 UTC m=+64.894005452" Dec 05 12:32:46.590429 master-0 kubenswrapper[8731]: I1205 12:32:46.590238 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler/installer-4-master-0" podStartSLOduration=1.59023316 podStartE2EDuration="1.59023316s" podCreationTimestamp="2025-12-05 12:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:32:46.588405563 +0000 UTC m=+64.892389730" watchObservedRunningTime="2025-12-05 12:32:46.59023316 +0000 UTC m=+64.894217327" Dec 05 12:32:46.617771 master-0 kubenswrapper[8731]: I1205 12:32:46.617609 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw"] Dec 05 12:32:46.618131 master-0 kubenswrapper[8731]: I1205 12:32:46.618091 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:46.644071 master-0 kubenswrapper[8731]: I1205 12:32:46.641225 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl"] Dec 05 12:32:46.746087 master-0 kubenswrapper[8731]: I1205 12:32:46.745941 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:32:46.746087 master-0 kubenswrapper[8731]: I1205 12:32:46.746031 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:32:46.746087 master-0 kubenswrapper[8731]: I1205 12:32:46.746063 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:32:46.746087 master-0 kubenswrapper[8731]: I1205 12:32:46.746095 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:32:46.747338 master-0 kubenswrapper[8731]: I1205 12:32:46.746276 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:32:46.747338 master-0 kubenswrapper[8731]: I1205 12:32:46.746356 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:32:46.747338 master-0 kubenswrapper[8731]: I1205 12:32:46.746410 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:32:46.747338 master-0 kubenswrapper[8731]: I1205 12:32:46.746430 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:32:46.747338 master-0 kubenswrapper[8731]: I1205 12:32:46.746480 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:32:46.747338 master-0 kubenswrapper[8731]: I1205 12:32:46.746518 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:32:46.747338 master-0 kubenswrapper[8731]: I1205 12:32:46.746958 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:32:46.750801 master-0 kubenswrapper[8731]: I1205 12:32:46.750763 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:32:46.750801 master-0 kubenswrapper[8731]: I1205 12:32:46.750795 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:32:46.751077 master-0 kubenswrapper[8731]: I1205 12:32:46.750770 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"cluster-version-operator-77dfcc565f-2chqh\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:32:46.751077 master-0 kubenswrapper[8731]: I1205 12:32:46.750830 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:32:46.751077 master-0 kubenswrapper[8731]: I1205 12:32:46.750857 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:32:46.751077 master-0 kubenswrapper[8731]: I1205 12:32:46.750979 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:32:46.751469 master-0 kubenswrapper[8731]: I1205 12:32:46.751422 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") pod \"multus-admission-controller-7dfc5b745f-xlrzq\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:32:46.752370 master-0 kubenswrapper[8731]: I1205 12:32:46.752326 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:32:46.752899 master-0 kubenswrapper[8731]: I1205 12:32:46.752834 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:32:46.752899 master-0 kubenswrapper[8731]: I1205 12:32:46.752878 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:32:46.753055 master-0 kubenswrapper[8731]: I1205 12:32:46.753013 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:32:46.770491 master-0 kubenswrapper[8731]: I1205 12:32:46.770421 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:32:46.771528 master-0 kubenswrapper[8731]: I1205 12:32:46.771313 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:32:46.772162 master-0 kubenswrapper[8731]: I1205 12:32:46.772124 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:32:46.772162 master-0 kubenswrapper[8731]: I1205 12:32:46.772151 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:32:46.772455 master-0 kubenswrapper[8731]: I1205 12:32:46.772419 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:32:46.772525 master-0 kubenswrapper[8731]: I1205 12:32:46.772484 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:32:46.772679 master-0 kubenswrapper[8731]: I1205 12:32:46.772619 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:32:46.772731 master-0 kubenswrapper[8731]: I1205 12:32:46.772629 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:32:46.781919 master-0 kubenswrapper[8731]: I1205 12:32:46.781238 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:32:46.782522 master-0 kubenswrapper[8731]: I1205 12:32:46.782465 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:32:46.813639 master-0 kubenswrapper[8731]: I1205 12:32:46.813577 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw"] Dec 05 12:32:46.840894 master-0 kubenswrapper[8731]: W1205 12:32:46.840826 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29812c4b_48ac_488c_863c_1d52e39ea2ae.slice/crio-16660d02bb2781827fb05b56da3da55397e61aedd1747341b89ed543b687f8e3 WatchSource:0}: Error finding container 16660d02bb2781827fb05b56da3da55397e61aedd1747341b89ed543b687f8e3: Status 404 returned error can't find the container with id 16660d02bb2781827fb05b56da3da55397e61aedd1747341b89ed543b687f8e3 Dec 05 12:32:46.845173 master-0 kubenswrapper[8731]: W1205 12:32:46.845040 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ede7946_e35c_4f7d_bb9f_9e6cc518eaa8.slice/crio-11f4bd44744862a4784027907096a8da7ef03fcfcded0ae25155a811e3329f1b WatchSource:0}: Error finding container 11f4bd44744862a4784027907096a8da7ef03fcfcded0ae25155a811e3329f1b: Status 404 returned error can't find the container with id 11f4bd44744862a4784027907096a8da7ef03fcfcded0ae25155a811e3329f1b Dec 05 12:32:47.126657 master-0 kubenswrapper[8731]: I1205 12:32:47.125876 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z"] Dec 05 12:32:47.261962 master-0 kubenswrapper[8731]: I1205 12:32:47.261339 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 05 12:32:47.262285 master-0 kubenswrapper[8731]: I1205 12:32:47.262000 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Dec 05 12:32:47.266578 master-0 kubenswrapper[8731]: I1205 12:32:47.266538 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 05 12:32:47.275489 master-0 kubenswrapper[8731]: I1205 12:32:47.275397 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 05 12:32:47.356394 master-0 kubenswrapper[8731]: I1205 12:32:47.356320 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-f797b99b6-vwhxt"] Dec 05 12:32:47.357477 master-0 kubenswrapper[8731]: I1205 12:32:47.357412 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/565d5ef6-b0e7-4f04-9460-61f1d3903d37-kube-api-access\") pod \"installer-1-master-0\" (UID: \"565d5ef6-b0e7-4f04-9460-61f1d3903d37\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 05 12:32:47.357554 master-0 kubenswrapper[8731]: I1205 12:32:47.357531 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/565d5ef6-b0e7-4f04-9460-61f1d3903d37-var-lock\") pod \"installer-1-master-0\" (UID: \"565d5ef6-b0e7-4f04-9460-61f1d3903d37\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 05 12:32:47.357618 master-0 kubenswrapper[8731]: I1205 12:32:47.357568 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/565d5ef6-b0e7-4f04-9460-61f1d3903d37-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"565d5ef6-b0e7-4f04-9460-61f1d3903d37\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 05 12:32:47.370779 master-0 kubenswrapper[8731]: W1205 12:32:47.370147 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e6babfe_724a_4eab_bb3b_bc318bf57b70.slice/crio-9cdc542e09a2b9f60d00a132f1101f8d7a3bb737b3bcc4086c2409ef17b05c7e WatchSource:0}: Error finding container 9cdc542e09a2b9f60d00a132f1101f8d7a3bb737b3bcc4086c2409ef17b05c7e: Status 404 returned error can't find the container with id 9cdc542e09a2b9f60d00a132f1101f8d7a3bb737b3bcc4086c2409ef17b05c7e Dec 05 12:32:47.436618 master-0 kubenswrapper[8731]: I1205 12:32:47.436544 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc"] Dec 05 12:32:47.440811 master-0 kubenswrapper[8731]: I1205 12:32:47.440772 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-8649c48786-7xrk6"] Dec 05 12:32:47.449142 master-0 kubenswrapper[8731]: I1205 12:32:47.449098 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c"] Dec 05 12:32:47.449219 master-0 kubenswrapper[8731]: I1205 12:32:47.449161 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq"] Dec 05 12:32:47.451262 master-0 kubenswrapper[8731]: I1205 12:32:47.450767 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq"] Dec 05 12:32:47.452069 master-0 
kubenswrapper[8731]: I1205 12:32:47.452001 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-99djw"] Dec 05 12:32:47.454513 master-0 kubenswrapper[8731]: I1205 12:32:47.454448 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw"] Dec 05 12:32:47.459050 master-0 kubenswrapper[8731]: I1205 12:32:47.458948 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/565d5ef6-b0e7-4f04-9460-61f1d3903d37-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"565d5ef6-b0e7-4f04-9460-61f1d3903d37\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 05 12:32:47.459050 master-0 kubenswrapper[8731]: I1205 12:32:47.459018 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/565d5ef6-b0e7-4f04-9460-61f1d3903d37-kube-api-access\") pod \"installer-1-master-0\" (UID: \"565d5ef6-b0e7-4f04-9460-61f1d3903d37\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 05 12:32:47.459217 master-0 kubenswrapper[8731]: I1205 12:32:47.459100 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/565d5ef6-b0e7-4f04-9460-61f1d3903d37-var-lock\") pod \"installer-1-master-0\" (UID: \"565d5ef6-b0e7-4f04-9460-61f1d3903d37\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 05 12:32:47.459262 master-0 kubenswrapper[8731]: I1205 12:32:47.459220 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/565d5ef6-b0e7-4f04-9460-61f1d3903d37-var-lock\") pod \"installer-1-master-0\" (UID: \"565d5ef6-b0e7-4f04-9460-61f1d3903d37\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 05 12:32:47.459385 master-0 kubenswrapper[8731]: I1205 12:32:47.459280 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/565d5ef6-b0e7-4f04-9460-61f1d3903d37-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"565d5ef6-b0e7-4f04-9460-61f1d3903d37\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 05 12:32:47.469102 master-0 kubenswrapper[8731]: W1205 12:32:47.469019 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f0c6889_0739_48a3_99cd_6db9d1f83242.slice/crio-92eddccae7e06f02f48401d5d5f367dae7b9b78b2bbc84b00b68ec03e90321c1 WatchSource:0}: Error finding container 92eddccae7e06f02f48401d5d5f367dae7b9b78b2bbc84b00b68ec03e90321c1: Status 404 returned error can't find the container with id 92eddccae7e06f02f48401d5d5f367dae7b9b78b2bbc84b00b68ec03e90321c1 Dec 05 12:32:47.485717 master-0 kubenswrapper[8731]: I1205 12:32:47.484997 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/565d5ef6-b0e7-4f04-9460-61f1d3903d37-kube-api-access\") pod \"installer-1-master-0\" (UID: \"565d5ef6-b0e7-4f04-9460-61f1d3903d37\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 05 12:32:47.569339 master-0 kubenswrapper[8731]: I1205 12:32:47.569270 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" 
event={"ID":"58187662-b502-4d90-95ce-2aa91a81d256","Type":"ContainerStarted","Data":"0b9e8ef8efad8c6e16cd6e6a39269d9f5b02a38a45cb5b422afaa90713381fcb"} Dec 05 12:32:47.576906 master-0 kubenswrapper[8731]: I1205 12:32:47.573013 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" event={"ID":"0dda6d9b-cb3a-413a-85af-ef08f15ea42e","Type":"ContainerStarted","Data":"9a083a2de33da77d47cd60a3708aaf6bb8591ce81eba8d8e42788e2c8c58ecd3"} Dec 05 12:32:47.576906 master-0 kubenswrapper[8731]: I1205 12:32:47.574716 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" event={"ID":"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7","Type":"ContainerStarted","Data":"c44264ca51ad61ed3b05ffa4c975691fd7debf64dbafd9a640308d225a077e0b"} Dec 05 12:32:47.576906 master-0 kubenswrapper[8731]: I1205 12:32:47.576840 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" event={"ID":"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8","Type":"ContainerStarted","Data":"858705d99de76b8cfa9db6cecfef2d5726cbdbc70ea2eca34544ac49f7bc75f0"} Dec 05 12:32:47.576906 master-0 kubenswrapper[8731]: I1205 12:32:47.576891 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" event={"ID":"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8","Type":"ContainerStarted","Data":"11f4bd44744862a4784027907096a8da7ef03fcfcded0ae25155a811e3329f1b"} Dec 05 12:32:47.577170 master-0 kubenswrapper[8731]: I1205 12:32:47.577121 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" podUID="1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8" containerName="controller-manager" containerID="cri-o://858705d99de76b8cfa9db6cecfef2d5726cbdbc70ea2eca34544ac49f7bc75f0" gracePeriod=30 Dec 05 12:32:47.577985 master-0 kubenswrapper[8731]: I1205 12:32:47.577961 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:47.579206 master-0 kubenswrapper[8731]: I1205 12:32:47.579092 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-99djw" event={"ID":"fb7003a6-4341-49eb-bec3-76ba8610fa12","Type":"ContainerStarted","Data":"e67f95f822c645d6f2dd2098e7e055983609569dd0acfdc0e0bea037bf8d6c03"} Dec 05 12:32:47.584295 master-0 kubenswrapper[8731]: I1205 12:32:47.584237 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" event={"ID":"38941513-e968-45f1-9cb2-b63d40338f36","Type":"ContainerStarted","Data":"065b5ff0754f03af8b21df75fad6ff50fe29b9c92ca5f839b6b57c232043c975"} Dec 05 12:32:47.584640 master-0 kubenswrapper[8731]: I1205 12:32:47.584487 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:47.585818 master-0 kubenswrapper[8731]: I1205 12:32:47.585782 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" event={"ID":"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb","Type":"ContainerStarted","Data":"323592a10d8975a94a7a25bad1c995c5959062afe0321ce857efdc2c6ccb6ebc"} Dec 05 12:32:47.586819 master-0 kubenswrapper[8731]: I1205 12:32:47.586754 8731 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" event={"ID":"29812c4b-48ac-488c-863c-1d52e39ea2ae","Type":"ContainerStarted","Data":"16660d02bb2781827fb05b56da3da55397e61aedd1747341b89ed543b687f8e3"} Dec 05 12:32:47.592051 master-0 kubenswrapper[8731]: I1205 12:32:47.592005 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" event={"ID":"5f0c6889-0739-48a3-99cd-6db9d1f83242","Type":"ContainerStarted","Data":"92eddccae7e06f02f48401d5d5f367dae7b9b78b2bbc84b00b68ec03e90321c1"} Dec 05 12:32:47.597369 master-0 kubenswrapper[8731]: I1205 12:32:47.593149 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" event={"ID":"1e6babfe-724a-4eab-bb3b-bc318bf57b70","Type":"ContainerStarted","Data":"9cdc542e09a2b9f60d00a132f1101f8d7a3bb737b3bcc4086c2409ef17b05c7e"} Dec 05 12:32:47.597369 master-0 kubenswrapper[8731]: I1205 12:32:47.596766 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" event={"ID":"a2acba71-b9dc-4b85-be35-c995b8be2f19","Type":"ContainerStarted","Data":"743ece8bb6e404056a2fb9957949cb0a30330d99bb6dbc633553c08d0fb45759"} Dec 05 12:32:47.601865 master-0 kubenswrapper[8731]: I1205 12:32:47.601828 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Dec 05 12:32:47.606094 master-0 kubenswrapper[8731]: I1205 12:32:47.606024 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" podStartSLOduration=16.605999154 podStartE2EDuration="16.605999154s" podCreationTimestamp="2025-12-05 12:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:32:47.604542367 +0000 UTC m=+65.908526534" watchObservedRunningTime="2025-12-05 12:32:47.605999154 +0000 UTC m=+65.909983321" Dec 05 12:32:47.809112 master-0 kubenswrapper[8731]: I1205 12:32:47.809039 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 05 12:32:47.818494 master-0 kubenswrapper[8731]: W1205 12:32:47.818440 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod565d5ef6_b0e7_4f04_9460_61f1d3903d37.slice/crio-ffedcf7b097d85236cfda3f347741e8721f2f2b5597465d279b812038c00b460 WatchSource:0}: Error finding container ffedcf7b097d85236cfda3f347741e8721f2f2b5597465d279b812038c00b460: Status 404 returned error can't find the container with id ffedcf7b097d85236cfda3f347741e8721f2f2b5597465d279b812038c00b460 Dec 05 12:32:47.931974 master-0 kubenswrapper[8731]: I1205 12:32:47.931630 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:47.967018 master-0 kubenswrapper[8731]: I1205 12:32:47.966412 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-client-ca\") pod \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " Dec 05 12:32:47.967018 master-0 kubenswrapper[8731]: I1205 12:32:47.966500 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-proxy-ca-bundles\") pod \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " Dec 05 12:32:47.967018 master-0 kubenswrapper[8731]: I1205 12:32:47.966592 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzft9\" (UniqueName: \"kubernetes.io/projected/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-kube-api-access-vzft9\") pod \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " Dec 05 12:32:47.967018 master-0 kubenswrapper[8731]: I1205 12:32:47.966640 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-config\") pod \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " Dec 05 12:32:47.969540 master-0 kubenswrapper[8731]: I1205 12:32:47.969485 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-675db9579f-4dcg8"] Dec 05 12:32:47.969996 master-0 kubenswrapper[8731]: E1205 12:32:47.969774 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8" containerName="controller-manager" Dec 05 12:32:47.969996 master-0 kubenswrapper[8731]: I1205 12:32:47.969795 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8" containerName="controller-manager" Dec 05 12:32:47.969996 master-0 kubenswrapper[8731]: I1205 12:32:47.969852 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-client-ca" (OuterVolumeSpecName: "client-ca") pod "1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8" (UID: "1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:32:47.969996 master-0 kubenswrapper[8731]: I1205 12:32:47.969880 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8" containerName="controller-manager" Dec 05 12:32:47.970140 master-0 kubenswrapper[8731]: I1205 12:32:47.970034 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8" (UID: "1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:32:47.970979 master-0 kubenswrapper[8731]: I1205 12:32:47.970918 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-config" (OuterVolumeSpecName: "config") pod "1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8" (UID: "1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:32:47.972696 master-0 kubenswrapper[8731]: I1205 12:32:47.972576 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:32:47.979511 master-0 kubenswrapper[8731]: I1205 12:32:47.979473 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-kube-api-access-vzft9" (OuterVolumeSpecName: "kube-api-access-vzft9") pod "1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8" (UID: "1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8"). InnerVolumeSpecName "kube-api-access-vzft9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:32:47.983913 master-0 kubenswrapper[8731]: I1205 12:32:47.983675 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-675db9579f-4dcg8"] Dec 05 12:32:48.067779 master-0 kubenswrapper[8731]: I1205 12:32:48.067705 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-serving-cert\") pod \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\" (UID: \"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8\") " Dec 05 12:32:48.068124 master-0 kubenswrapper[8731]: I1205 12:32:48.068087 8731 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:48.068124 master-0 kubenswrapper[8731]: I1205 12:32:48.068109 8731 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:48.068124 master-0 kubenswrapper[8731]: I1205 12:32:48.068124 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzft9\" (UniqueName: \"kubernetes.io/projected/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-kube-api-access-vzft9\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:48.068295 master-0 kubenswrapper[8731]: I1205 12:32:48.068135 8731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:48.073071 master-0 kubenswrapper[8731]: I1205 12:32:48.073042 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8" (UID: "1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:32:48.168893 master-0 kubenswrapper[8731]: I1205 12:32:48.168822 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-proxy-ca-bundles\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:32:48.169162 master-0 kubenswrapper[8731]: I1205 12:32:48.168933 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-config\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:32:48.169162 master-0 kubenswrapper[8731]: I1205 12:32:48.168968 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e562fda-e695-4218-a9cf-4179b8d456db-serving-cert\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:32:48.169162 master-0 kubenswrapper[8731]: I1205 12:32:48.169011 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bll66\" (UniqueName: \"kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:32:48.169162 master-0 kubenswrapper[8731]: I1205 12:32:48.169040 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-client-ca\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:32:48.169315 master-0 kubenswrapper[8731]: I1205 12:32:48.169223 8731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:32:48.213538 master-0 kubenswrapper[8731]: I1205 12:32:48.213399 8731 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0-master-0"] Dec 05 12:32:48.215187 master-0 kubenswrapper[8731]: I1205 12:32:48.215103 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="cc0396a9a2689b3e8c132c12640cbe83" containerName="etcdctl" containerID="cri-o://bc0f8f75cee3cab2f35245110c53e2d7aee426e9d1f8fd832cda99c84f270715" gracePeriod=30 Dec 05 12:32:48.215247 master-0 kubenswrapper[8731]: I1205 12:32:48.215139 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="cc0396a9a2689b3e8c132c12640cbe83" containerName="etcd" containerID="cri-o://296f9752095436403474f93df276faa705635dd48e13c86d863312c7d94b3954" gracePeriod=30 Dec 05 12:32:48.215679 master-0 kubenswrapper[8731]: I1205 12:32:48.215454 
8731 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Dec 05 12:32:48.215755 master-0 kubenswrapper[8731]: E1205 12:32:48.215686 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0396a9a2689b3e8c132c12640cbe83" containerName="etcd" Dec 05 12:32:48.215755 master-0 kubenswrapper[8731]: I1205 12:32:48.215702 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0396a9a2689b3e8c132c12640cbe83" containerName="etcd" Dec 05 12:32:48.215755 master-0 kubenswrapper[8731]: E1205 12:32:48.215719 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc0396a9a2689b3e8c132c12640cbe83" containerName="etcdctl" Dec 05 12:32:48.215755 master-0 kubenswrapper[8731]: I1205 12:32:48.215726 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc0396a9a2689b3e8c132c12640cbe83" containerName="etcdctl" Dec 05 12:32:48.215971 master-0 kubenswrapper[8731]: I1205 12:32:48.215825 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc0396a9a2689b3e8c132c12640cbe83" containerName="etcdctl" Dec 05 12:32:48.215971 master-0 kubenswrapper[8731]: I1205 12:32:48.215847 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc0396a9a2689b3e8c132c12640cbe83" containerName="etcd" Dec 05 12:32:48.217297 master-0 kubenswrapper[8731]: I1205 12:32:48.217241 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.271403 master-0 kubenswrapper[8731]: I1205 12:32:48.270855 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-config\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:32:48.271403 master-0 kubenswrapper[8731]: I1205 12:32:48.270930 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e562fda-e695-4218-a9cf-4179b8d456db-serving-cert\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:32:48.271403 master-0 kubenswrapper[8731]: I1205 12:32:48.270989 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bll66\" (UniqueName: \"kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:32:48.271403 master-0 kubenswrapper[8731]: I1205 12:32:48.271022 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-client-ca\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:32:48.271403 master-0 kubenswrapper[8731]: I1205 12:32:48.271055 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-proxy-ca-bundles\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " 
pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:32:48.272727 master-0 kubenswrapper[8731]: I1205 12:32:48.272352 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-proxy-ca-bundles\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:32:48.272727 master-0 kubenswrapper[8731]: I1205 12:32:48.272580 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-config\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:32:48.272893 master-0 kubenswrapper[8731]: I1205 12:32:48.272849 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-client-ca\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:32:48.278125 master-0 kubenswrapper[8731]: I1205 12:32:48.277894 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e562fda-e695-4218-a9cf-4179b8d456db-serving-cert\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:32:48.374315 master-0 kubenswrapper[8731]: I1205 12:32:48.374043 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-resource-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.374315 master-0 kubenswrapper[8731]: I1205 12:32:48.374170 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-static-pod-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.374315 master-0 kubenswrapper[8731]: I1205 12:32:48.374239 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-data-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.374796 master-0 kubenswrapper[8731]: I1205 12:32:48.374341 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-log-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.374796 master-0 kubenswrapper[8731]: I1205 12:32:48.374381 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-cert-dir\") pod 
\"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.374796 master-0 kubenswrapper[8731]: I1205 12:32:48.374417 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-usr-local-bin\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.476097 master-0 kubenswrapper[8731]: I1205 12:32:48.476023 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-cert-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.476097 master-0 kubenswrapper[8731]: I1205 12:32:48.476089 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-usr-local-bin\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.476481 master-0 kubenswrapper[8731]: I1205 12:32:48.476126 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-resource-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.476481 master-0 kubenswrapper[8731]: I1205 12:32:48.476291 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-usr-local-bin\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.476481 master-0 kubenswrapper[8731]: I1205 12:32:48.476422 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-resource-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.476481 master-0 kubenswrapper[8731]: I1205 12:32:48.476452 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-cert-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.476633 master-0 kubenswrapper[8731]: I1205 12:32:48.476501 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-static-pod-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.476633 master-0 kubenswrapper[8731]: I1205 12:32:48.476556 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-data-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.476749 master-0 kubenswrapper[8731]: I1205 12:32:48.476730 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-log-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.476879 master-0 kubenswrapper[8731]: I1205 12:32:48.476856 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-log-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.476943 master-0 kubenswrapper[8731]: I1205 12:32:48.476891 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-static-pod-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.476943 master-0 kubenswrapper[8731]: I1205 12:32:48.476912 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-data-dir\") pod \"etcd-master-0\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:32:48.604688 master-0 kubenswrapper[8731]: I1205 12:32:48.604604 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"565d5ef6-b0e7-4f04-9460-61f1d3903d37","Type":"ContainerStarted","Data":"1cb443e02b64a65178050b34e99e50f308c86d2ef5b4e7e730bfa0faf58cc53e"} Dec 05 12:32:48.604688 master-0 kubenswrapper[8731]: I1205 12:32:48.604670 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"565d5ef6-b0e7-4f04-9460-61f1d3903d37","Type":"ContainerStarted","Data":"ffedcf7b097d85236cfda3f347741e8721f2f2b5597465d279b812038c00b460"} Dec 05 12:32:48.608106 master-0 kubenswrapper[8731]: I1205 12:32:48.607986 8731 generic.go:334] "Generic (PLEG): container finished" podID="1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8" containerID="858705d99de76b8cfa9db6cecfef2d5726cbdbc70ea2eca34544ac49f7bc75f0" exitCode=0 Dec 05 12:32:48.608106 master-0 kubenswrapper[8731]: I1205 12:32:48.608057 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" event={"ID":"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8","Type":"ContainerDied","Data":"858705d99de76b8cfa9db6cecfef2d5726cbdbc70ea2eca34544ac49f7bc75f0"} Dec 05 12:32:48.608106 master-0 kubenswrapper[8731]: I1205 12:32:48.608085 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" event={"ID":"1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8","Type":"ContainerDied","Data":"11f4bd44744862a4784027907096a8da7ef03fcfcded0ae25155a811e3329f1b"} Dec 05 12:32:48.608106 master-0 kubenswrapper[8731]: I1205 12:32:48.608108 8731 scope.go:117] "RemoveContainer" containerID="858705d99de76b8cfa9db6cecfef2d5726cbdbc70ea2eca34544ac49f7bc75f0" Dec 05 12:32:48.608341 master-0 kubenswrapper[8731]: I1205 12:32:48.608264 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw" Dec 05 12:32:48.615541 master-0 kubenswrapper[8731]: I1205 12:32:48.615410 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" event={"ID":"0dda6d9b-cb3a-413a-85af-ef08f15ea42e","Type":"ContainerStarted","Data":"3de4ddaa09ada567848564877e7c542bbe9c6a292970b0f8cf886f5ba9fa75db"} Dec 05 12:32:48.615760 master-0 kubenswrapper[8731]: I1205 12:32:48.615509 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" podUID="bb7dd3e9-5a59-4741-970e-aa41c4e078cc" containerName="route-controller-manager" containerID="cri-o://d12e1c8bf264de03492186948f2fcb8fa30acf3e5c6ac0dd00637ed1e75cfa31" gracePeriod=30 Dec 05 12:32:51.072575 master-0 kubenswrapper[8731]: I1205 12:32:51.072046 8731 patch_prober.go:28] interesting pod/route-controller-manager-858598fd98-5xkcl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.39:8443/healthz\": dial tcp 10.128.0.39:8443: connect: connection refused" start-of-body= Dec 05 12:32:51.073565 master-0 kubenswrapper[8731]: I1205 12:32:51.072606 8731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" podUID="bb7dd3e9-5a59-4741-970e-aa41c4e078cc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.39:8443/healthz\": dial tcp 10.128.0.39:8443: connect: connection refused" Dec 05 12:32:51.459396 master-0 kubenswrapper[8731]: I1205 12:32:51.459310 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:51.459836 master-0 kubenswrapper[8731]: I1205 12:32:51.459774 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:32:59.539747 master-0 kubenswrapper[8731]: I1205 12:32:59.539255 8731 scope.go:117] "RemoveContainer" containerID="858705d99de76b8cfa9db6cecfef2d5726cbdbc70ea2eca34544ac49f7bc75f0" Dec 05 12:32:59.540633 master-0 kubenswrapper[8731]: E1205 12:32:59.540501 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"858705d99de76b8cfa9db6cecfef2d5726cbdbc70ea2eca34544ac49f7bc75f0\": container with ID starting with 858705d99de76b8cfa9db6cecfef2d5726cbdbc70ea2eca34544ac49f7bc75f0 not found: ID does not exist" containerID="858705d99de76b8cfa9db6cecfef2d5726cbdbc70ea2eca34544ac49f7bc75f0" Dec 05 12:32:59.540633 master-0 kubenswrapper[8731]: I1205 12:32:59.540536 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"858705d99de76b8cfa9db6cecfef2d5726cbdbc70ea2eca34544ac49f7bc75f0"} err="failed to get container status \"858705d99de76b8cfa9db6cecfef2d5726cbdbc70ea2eca34544ac49f7bc75f0\": rpc error: code = NotFound desc = could not find container \"858705d99de76b8cfa9db6cecfef2d5726cbdbc70ea2eca34544ac49f7bc75f0\": container with ID starting with 858705d99de76b8cfa9db6cecfef2d5726cbdbc70ea2eca34544ac49f7bc75f0 not found: ID does not exist" Dec 05 12:33:00.466370 master-0 kubenswrapper[8731]: I1205 12:33:00.466308 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv 
container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:33:00.466370 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:33:00.466370 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:33:00.466370 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:33:00.466370 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:33:00.466370 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:33:00.466370 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:33:00.466370 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:33:00.466370 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:33:00.466370 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:33:00.466370 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:33:00.466370 master-0 kubenswrapper[8731]: I1205 12:33:00.466387 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:33:01.072978 master-0 kubenswrapper[8731]: I1205 12:33:01.072876 8731 patch_prober.go:28] interesting pod/route-controller-manager-858598fd98-5xkcl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.39:8443/healthz\": dial tcp 10.128.0.39:8443: connect: connection refused" start-of-body= Dec 05 12:33:01.073592 master-0 kubenswrapper[8731]: I1205 12:33:01.072988 8731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" podUID="bb7dd3e9-5a59-4741-970e-aa41c4e078cc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.39:8443/healthz\": dial tcp 10.128.0.39:8443: connect: connection refused" Dec 05 12:33:01.259287 master-0 kubenswrapper[8731]: E1205 12:33:01.259201 8731 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Dec 05 12:33:01.259696 master-0 kubenswrapper[8731]: I1205 12:33:01.259668 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 05 12:33:01.772838 master-0 kubenswrapper[8731]: W1205 12:33:01.772715 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24e01603234fe8003f8aae8171b0065.slice/crio-a8ba14d580fe2fb80b55befd2c9e6b4b2c8930c4e5539e40f3d53b13b976e898 WatchSource:0}: Error finding container a8ba14d580fe2fb80b55befd2c9e6b4b2c8930c4e5539e40f3d53b13b976e898: Status 404 returned error can't find the container with id a8ba14d580fe2fb80b55befd2c9e6b4b2c8930c4e5539e40f3d53b13b976e898 Dec 05 12:33:02.697241 master-0 kubenswrapper[8731]: I1205 12:33:02.696249 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" event={"ID":"29812c4b-48ac-488c-863c-1d52e39ea2ae","Type":"ContainerStarted","Data":"611db95b41286905ea53cbc63db74b99b42b65b967c7368704d8de37b85458a0"} Dec 05 12:33:02.699000 master-0 kubenswrapper[8731]: I1205 12:33:02.698837 8731 generic.go:334] "Generic (PLEG): container finished" podID="96fa3513-5467-4b0f-a03d-9279d36317bd" containerID="0a7d145dbed8d32146e90821257e92134c8804dafe8896f59ec88530e6ad0c4e" exitCode=0 Dec 05 12:33:02.699000 master-0 kubenswrapper[8731]: I1205 12:33:02.698968 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"96fa3513-5467-4b0f-a03d-9279d36317bd","Type":"ContainerDied","Data":"0a7d145dbed8d32146e90821257e92134c8804dafe8896f59ec88530e6ad0c4e"} Dec 05 12:33:02.701538 master-0 kubenswrapper[8731]: I1205 12:33:02.701472 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" event={"ID":"a2acba71-b9dc-4b85-be35-c995b8be2f19","Type":"ContainerStarted","Data":"0ef8d356dac19c1922c065326d7809108046e5a2cd059d5d50b5229acd7007ec"} Dec 05 12:33:02.704193 master-0 kubenswrapper[8731]: I1205 12:33:02.704104 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" event={"ID":"38941513-e968-45f1-9cb2-b63d40338f36","Type":"ContainerStarted","Data":"418f3d79b0988ff7f7ba36537b8459867264703e9c8f702bbd93e4dee2835882"} Dec 05 12:33:02.708624 master-0 kubenswrapper[8731]: I1205 12:33:02.708571 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-99djw" event={"ID":"fb7003a6-4341-49eb-bec3-76ba8610fa12","Type":"ContainerStarted","Data":"f7e070e3835422f37986b17613bb2a923a628ccc634c0641f7b2911fd3c07111"} Dec 05 12:33:02.708766 master-0 kubenswrapper[8731]: I1205 12:33:02.708640 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-99djw" event={"ID":"fb7003a6-4341-49eb-bec3-76ba8610fa12","Type":"ContainerStarted","Data":"e148c0e6308743ecf579bef0b88df088d99461b29256ac158a317b333df0b195"} Dec 05 12:33:02.711008 master-0 kubenswrapper[8731]: I1205 12:33:02.710968 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" event={"ID":"58187662-b502-4d90-95ce-2aa91a81d256","Type":"ContainerStarted","Data":"d0c256d51be6b67ce11d61e05ebacd6a747bab028d852541d977f5d77734ba1a"} Dec 05 12:33:02.713895 master-0 kubenswrapper[8731]: I1205 12:33:02.713848 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" 
event={"ID":"0dda6d9b-cb3a-413a-85af-ef08f15ea42e","Type":"ContainerStarted","Data":"494ce5c3826b0b94b974fd41d16b7ba6517fb0d007e27462a2d7b33e01aa4443"} Dec 05 12:33:02.714060 master-0 kubenswrapper[8731]: I1205 12:33:02.714012 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:33:02.716987 master-0 kubenswrapper[8731]: I1205 12:33:02.716939 8731 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18" exitCode=1 Dec 05 12:33:02.717060 master-0 kubenswrapper[8731]: I1205 12:33:02.717017 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerDied","Data":"f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18"} Dec 05 12:33:02.717117 master-0 kubenswrapper[8731]: I1205 12:33:02.717091 8731 scope.go:117] "RemoveContainer" containerID="123ca114b6002ab3cd24848ba210c8015d871a3bf5c2f6653a7daa022e0dea48" Dec 05 12:33:02.717737 master-0 kubenswrapper[8731]: I1205 12:33:02.717696 8731 scope.go:117] "RemoveContainer" containerID="f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18" Dec 05 12:33:02.719787 master-0 kubenswrapper[8731]: I1205 12:33:02.719444 8731 generic.go:334] "Generic (PLEG): container finished" podID="c24e01603234fe8003f8aae8171b0065" containerID="e6fb13503e825480506895b04ab6a86f432b8d4ca2560cfbca6f20c4af8b50db" exitCode=0 Dec 05 12:33:02.719787 master-0 kubenswrapper[8731]: I1205 12:33:02.719530 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerDied","Data":"e6fb13503e825480506895b04ab6a86f432b8d4ca2560cfbca6f20c4af8b50db"} Dec 05 12:33:02.719787 master-0 kubenswrapper[8731]: I1205 12:33:02.719603 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerStarted","Data":"a8ba14d580fe2fb80b55befd2c9e6b4b2c8930c4e5539e40f3d53b13b976e898"} Dec 05 12:33:02.724150 master-0 kubenswrapper[8731]: I1205 12:33:02.724018 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" event={"ID":"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7","Type":"ContainerStarted","Data":"8a2315b172a2f4696d36566ac0967bac2a393e7df33c410eb47c73827f2cb352"} Dec 05 12:33:02.724150 master-0 kubenswrapper[8731]: I1205 12:33:02.724100 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" event={"ID":"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7","Type":"ContainerStarted","Data":"25a1113bac1425c0d6b5254d5067b012732c090d8f467edda97019523a2d47be"} Dec 05 12:33:02.727060 master-0 kubenswrapper[8731]: I1205 12:33:02.727002 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" event={"ID":"5f0c6889-0739-48a3-99cd-6db9d1f83242","Type":"ContainerStarted","Data":"d324f3be47b40d64f2eb275a06a3da375cc2d17a721be2f87def91dc6fec78c2"} Dec 05 12:33:02.727285 master-0 kubenswrapper[8731]: I1205 12:33:02.727067 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" 
event={"ID":"5f0c6889-0739-48a3-99cd-6db9d1f83242","Type":"ContainerStarted","Data":"cac6f03a0427fe3f821f5cb9684613bbc6f78a43198e6a2ef1b43d626c97b8ba"} Dec 05 12:33:02.740500 master-0 kubenswrapper[8731]: I1205 12:33:02.740429 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" event={"ID":"1e6babfe-724a-4eab-bb3b-bc318bf57b70","Type":"ContainerStarted","Data":"9059626ad4510705fe438e1803257849f89596beec2662512048f0044416af28"} Dec 05 12:33:02.741411 master-0 kubenswrapper[8731]: I1205 12:33:02.741372 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:33:02.744657 master-0 kubenswrapper[8731]: I1205 12:33:02.744597 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" event={"ID":"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb","Type":"ContainerStarted","Data":"ec7cd7b19e08539b7cab80696c72c19f718ae2a85d4adde460623354d34db0e3"} Dec 05 12:33:02.744768 master-0 kubenswrapper[8731]: I1205 12:33:02.744663 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" event={"ID":"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb","Type":"ContainerStarted","Data":"a6d8ffe90701aad701ac1d29ce8f42eac206024de7e62e03f130cba9a76b048e"} Dec 05 12:33:02.757915 master-0 kubenswrapper[8731]: I1205 12:33:02.757856 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:33:03.774449 master-0 kubenswrapper[8731]: I1205 12:33:03.773219 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8"} Dec 05 12:33:03.956452 master-0 kubenswrapper[8731]: E1205 12:33:03.956314 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:32:53Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:32:53Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:32:53Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:32:53Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e3e65409fc2b27ad0aaeb500a39e264663d2980821f099b830b551785ce4ce8b\\\"],\\\"sizeBytes\\\":1631758507},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9014f384de5f9a0b7418d5869ad349abb9588d16bd09ed650a163c045315dbff\\\"],\\\"sizeBytes\\\":1232140918},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b472823604757237c2d16bd6f6221f4cf562aa3b05942c7f602e1e8b2e55a7c6\\\"],\\\"sizeBytes\\\":983705650},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\\\"],\\\"sizeBytes\\\":938303566},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:631a3798b749fecc041a99929eb946618df723e15055e805ff752a1a1273481c\\\"],\\\"sizeBytes\\\":870567329},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b12f830c3316aa4dc061c2d00c74126282b3e2bcccc301eab00d57fff3c4c7c\\\"],\\\"sizeBytes\\\":767284906},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cb3ec61f9a932a9ad13bdeb44bcf9477a8d5f728151d7f19ed3ef7d4b02b3a82\\\"],\\\"sizeBytes\\\":682371258},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:916566bb9d0143352324233d460ad94697719c11c8c9158e3aea8f475941751f\\\"],\\\"sizeBytes\\\":677523572},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9724d2036305cbd729e1f484c5bad89971de977fff8a6723fef1873858dd1123\\\"],\\\"sizeBytes\\\":616108962},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:df606f3b71d4376d1a2108c09f0d3dab455fc30bcb67c60e91590c105e9025bf\\\"],\\\"sizeBytes\\\":583836304},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:79f99fd6cce984287932edf0d009660bb488d663081f3d62ec3b23bc8bfbf6c2\\\"],\\\"sizeBytes\\\":576619763},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eddedae7578d79b5a3f748000ae5c00b9f14a04710f9f9ec7b52fc569be5dfb8\\\"],\\\"sizeBytes\\\":552673986},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:188637a52cafee61ec461e92fb0c605e28be325b9ac1f2ac8a37d68e97654718\\\"],\\\"sizeBytes\\\":532719167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cfde59e48cd5dee3721f34d249cb119cc3259fd857965d34f9c7ed83b0c363a1\\\"],\\\"sizeBytes\\\":532402162},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f4d4282cb53325e737ad68abbfcb70687ae04fb50353f4f0ba0ba5703b15009a\\\"],\\\"sizeBytes\\\":512838054},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0f43c31aa3359159d4557dad3cfaf812d8ce44db9cb9ae970e06d3479070b660\\\"],\\\"sizeBytes\\\":509437356},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e85850a4ae1a1e3ec2c590a4936d640882b6550124da22031c85b526afbf52df\\\"],\\\"sizeBytes\\\":507687221},{\\\"names\\\":[\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8375671da86aa527ee7e291d86971b0baa823ffc7663b5a983084456e76c0f59\\\"],\\\"sizeBytes\\\":506741476},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:86af77350cfe6fd69280157e4162aa0147873d9431c641ae4ad3e881ff768a73\\\"],\\\"sizeBytes\\\":505628211},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a824e468cf8dd61d347e35b2ee5bc2f815666957647098e21a1bb56ff613e5b9\\\"],\\\"sizeBytes\\\":503340749},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8139ed65c0a0a4b0f253b715c11cc52be027efe8a4774da9ccce35c78ef439da\\\"],\\\"sizeBytes\\\":503011144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8eabac819f289e29d75c7ab172d8124554849a47f0b00770928c3eb19a5a31c4\\\"],\\\"sizeBytes\\\":502436444},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10e57ca7611f79710f05777dc6a8f31c7e04eb09da4d8d793a5acfbf0e4692d7\\\"],\\\"sizeBytes\\\":500943492},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f042fa25014f3d37f3ea967d21f361d2a11833ae18f2c750318101b25d2497ce\\\"],\\\"sizeBytes\\\":500848684},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91af633e585621630c40d14f188e37d36b44678d0a59e582d850bf8d593d3a0c\\\"],\\\"sizeBytes\\\":499798563},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d64c13fe7663a0b4ae61d103b1b7598adcf317a01826f296bcb66b1a2de83c96\\\"],\\\"sizeBytes\\\":499705918},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:75d996f6147edb88c09fd1a052099de66638590d7d03a735006244bc9e19f898\\\"],\\\"sizeBytes\\\":499082775},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f952cec1e5332b84bdffa249cd426f39087058d6544ddcec650a414c15a9b68\\\"],\\\"sizeBytes\\\":489528665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c416b201d480bddb5a4960ec42f4740761a1335001cf84ba5ae19ad6857771b1\\\"],\\\"sizeBytes\\\":481559117},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\\\"],\\\"sizeBytes\\\":459552216},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3ce2cbf1032ad0f24f204db73687002fcf302e86ebde3945801c74351b64576\\\"],\\\"sizeBytes\\\":458169255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f0aa9cd04713acc5c6fea721bd849e1500da8ae945e0b32000887f34d786e0b\\\"],\\\"sizeBytes\\\":442509555},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e438b814f8e16f00b3fc4b69991af80eee79ae111d2a707f34aa64b2ccbb6eb\\\"],\\\"sizeBytes\\\":437737925},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a3d37aa7a22c68afa963ecfb4b43c52cccf152580cd66e4d5382fb69e4037cc\\\"],\\\"sizeBytes\\\":406053031},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9432c13d76bd4ba4eb9197c050cf88c0d701fa2055eeb59257e2e23901f9fdff\\\"],\\\"sizeBytes\\\":401810450},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a70b2a95140d1e90978f36cc9889013ae34bd232662c5424002274385669ed9\\\"],\\\"sizeBytes\\\":390989693}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:33:03.991413 master-0 kubenswrapper[8731]: I1205 12:33:03.991384 8731 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 05 12:33:03.999027 master-0 kubenswrapper[8731]: I1205 12:33:03.999004 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96fa3513-5467-4b0f-a03d-9279d36317bd-kube-api-access\") pod \"96fa3513-5467-4b0f-a03d-9279d36317bd\" (UID: \"96fa3513-5467-4b0f-a03d-9279d36317bd\") " Dec 05 12:33:03.999133 master-0 kubenswrapper[8731]: I1205 12:33:03.999042 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/96fa3513-5467-4b0f-a03d-9279d36317bd-var-lock\") pod \"96fa3513-5467-4b0f-a03d-9279d36317bd\" (UID: \"96fa3513-5467-4b0f-a03d-9279d36317bd\") " Dec 05 12:33:03.999133 master-0 kubenswrapper[8731]: I1205 12:33:03.999068 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96fa3513-5467-4b0f-a03d-9279d36317bd-kubelet-dir\") pod \"96fa3513-5467-4b0f-a03d-9279d36317bd\" (UID: \"96fa3513-5467-4b0f-a03d-9279d36317bd\") " Dec 05 12:33:03.999323 master-0 kubenswrapper[8731]: I1205 12:33:03.999262 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96fa3513-5467-4b0f-a03d-9279d36317bd-var-lock" (OuterVolumeSpecName: "var-lock") pod "96fa3513-5467-4b0f-a03d-9279d36317bd" (UID: "96fa3513-5467-4b0f-a03d-9279d36317bd"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:33:03.999422 master-0 kubenswrapper[8731]: I1205 12:33:03.999373 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96fa3513-5467-4b0f-a03d-9279d36317bd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "96fa3513-5467-4b0f-a03d-9279d36317bd" (UID: "96fa3513-5467-4b0f-a03d-9279d36317bd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:33:03.999654 master-0 kubenswrapper[8731]: I1205 12:33:03.999613 8731 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/96fa3513-5467-4b0f-a03d-9279d36317bd-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:33:03.999654 master-0 kubenswrapper[8731]: I1205 12:33:03.999648 8731 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96fa3513-5467-4b0f-a03d-9279d36317bd-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:33:04.005241 master-0 kubenswrapper[8731]: I1205 12:33:04.005163 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96fa3513-5467-4b0f-a03d-9279d36317bd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "96fa3513-5467-4b0f-a03d-9279d36317bd" (UID: "96fa3513-5467-4b0f-a03d-9279d36317bd"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:33:04.100916 master-0 kubenswrapper[8731]: I1205 12:33:04.100653 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96fa3513-5467-4b0f-a03d-9279d36317bd-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:33:04.779803 master-0 kubenswrapper[8731]: I1205 12:33:04.779729 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"96fa3513-5467-4b0f-a03d-9279d36317bd","Type":"ContainerDied","Data":"89350a747cdc136c0874cbdedf75ff768f3aa173665ba78cb7204afab7285a1e"} Dec 05 12:33:04.779803 master-0 kubenswrapper[8731]: I1205 12:33:04.779786 8731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89350a747cdc136c0874cbdedf75ff768f3aa173665ba78cb7204afab7285a1e" Dec 05 12:33:04.779803 master-0 kubenswrapper[8731]: I1205 12:33:04.779742 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 05 12:33:05.242996 master-0 kubenswrapper[8731]: I1205 12:33:05.242619 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:33:05.788954 master-0 kubenswrapper[8731]: I1205 12:33:05.788888 8731 generic.go:334] "Generic (PLEG): container finished" podID="5e09e2af7200e6f9be469dbfd9bb1127" containerID="8c7e83119fdbf7fba596a8756e22362ec175fbd883171a7a50b5c673c4302ba8" exitCode=1 Dec 05 12:33:05.788954 master-0 kubenswrapper[8731]: I1205 12:33:05.788953 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerDied","Data":"8c7e83119fdbf7fba596a8756e22362ec175fbd883171a7a50b5c673c4302ba8"} Dec 05 12:33:05.789992 master-0 kubenswrapper[8731]: I1205 12:33:05.789521 8731 scope.go:117] "RemoveContainer" containerID="8c7e83119fdbf7fba596a8756e22362ec175fbd883171a7a50b5c673c4302ba8" Dec 05 12:33:06.796725 master-0 kubenswrapper[8731]: I1205 12:33:06.796633 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerStarted","Data":"ea798cf6cf2e0e8f9ed09f878b5232d0740a5bbae085c7d7f2ee3609a0190f95"} Dec 05 12:33:08.234628 master-0 kubenswrapper[8731]: E1205 12:33:08.234502 8731 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:33:09.471250 master-0 kubenswrapper[8731]: I1205 12:33:09.471068 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:33:09.471250 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:33:09.471250 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:33:09.471250 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:33:09.471250 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:33:09.471250 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:33:09.471250 master-0 
kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:33:09.471250 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:33:09.471250 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:33:09.471250 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:33:09.471250 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:33:09.471250 master-0 kubenswrapper[8731]: I1205 12:33:09.471155 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:33:10.516023 master-0 kubenswrapper[8731]: I1205 12:33:10.515890 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:33:11.066430 master-0 kubenswrapper[8731]: I1205 12:33:11.066329 8731 patch_prober.go:28] interesting pod/authentication-operator-6c968fdfdf-xxmfp container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Dec 05 12:33:11.066845 master-0 kubenswrapper[8731]: I1205 12:33:11.066458 8731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" podUID="ba095394-1873-4793-969d-3be979fa0771" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Dec 05 12:33:11.071921 master-0 kubenswrapper[8731]: I1205 12:33:11.071877 8731 patch_prober.go:28] interesting pod/route-controller-manager-858598fd98-5xkcl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.39:8443/healthz\": dial tcp 10.128.0.39:8443: connect: connection refused" start-of-body= Dec 05 12:33:11.073264 master-0 kubenswrapper[8731]: I1205 12:33:11.073218 8731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" podUID="bb7dd3e9-5a59-4741-970e-aa41c4e078cc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.39:8443/healthz\": dial tcp 10.128.0.39:8443: connect: connection refused" Dec 05 12:33:13.516839 master-0 kubenswrapper[8731]: I1205 12:33:13.516418 8731 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 12:33:13.723025 master-0 kubenswrapper[8731]: I1205 12:33:13.722964 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_fa1512be-895a-47e0-abf5-0155c71500e3/installer/0.log" Dec 05 12:33:13.723362 master-0 kubenswrapper[8731]: I1205 12:33:13.723058 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Dec 05 12:33:13.836522 master-0 kubenswrapper[8731]: I1205 12:33:13.836454 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_fa1512be-895a-47e0-abf5-0155c71500e3/installer/0.log" Dec 05 12:33:13.836522 master-0 kubenswrapper[8731]: I1205 12:33:13.836516 8731 generic.go:334] "Generic (PLEG): container finished" podID="fa1512be-895a-47e0-abf5-0155c71500e3" containerID="110f5cbd6fb7d97032c4e064ec4baa2f0dbba15dc7401b2a692a82bbe3335c93" exitCode=1 Dec 05 12:33:13.837021 master-0 kubenswrapper[8731]: I1205 12:33:13.836554 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"fa1512be-895a-47e0-abf5-0155c71500e3","Type":"ContainerDied","Data":"110f5cbd6fb7d97032c4e064ec4baa2f0dbba15dc7401b2a692a82bbe3335c93"} Dec 05 12:33:13.837021 master-0 kubenswrapper[8731]: I1205 12:33:13.836592 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"fa1512be-895a-47e0-abf5-0155c71500e3","Type":"ContainerDied","Data":"a0700728063122d7318a3238bf1b7f099537df8a3022348c540ee8f3798feac2"} Dec 05 12:33:13.837021 master-0 kubenswrapper[8731]: I1205 12:33:13.836615 8731 scope.go:117] "RemoveContainer" containerID="110f5cbd6fb7d97032c4e064ec4baa2f0dbba15dc7401b2a692a82bbe3335c93" Dec 05 12:33:13.837021 master-0 kubenswrapper[8731]: I1205 12:33:13.836659 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Dec 05 12:33:13.863975 master-0 kubenswrapper[8731]: I1205 12:33:13.863928 8731 scope.go:117] "RemoveContainer" containerID="110f5cbd6fb7d97032c4e064ec4baa2f0dbba15dc7401b2a692a82bbe3335c93" Dec 05 12:33:13.864671 master-0 kubenswrapper[8731]: E1205 12:33:13.864626 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"110f5cbd6fb7d97032c4e064ec4baa2f0dbba15dc7401b2a692a82bbe3335c93\": container with ID starting with 110f5cbd6fb7d97032c4e064ec4baa2f0dbba15dc7401b2a692a82bbe3335c93 not found: ID does not exist" containerID="110f5cbd6fb7d97032c4e064ec4baa2f0dbba15dc7401b2a692a82bbe3335c93" Dec 05 12:33:13.864740 master-0 kubenswrapper[8731]: I1205 12:33:13.864666 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"110f5cbd6fb7d97032c4e064ec4baa2f0dbba15dc7401b2a692a82bbe3335c93"} err="failed to get container status \"110f5cbd6fb7d97032c4e064ec4baa2f0dbba15dc7401b2a692a82bbe3335c93\": rpc error: code = NotFound desc = could not find container \"110f5cbd6fb7d97032c4e064ec4baa2f0dbba15dc7401b2a692a82bbe3335c93\": container with ID starting with 110f5cbd6fb7d97032c4e064ec4baa2f0dbba15dc7401b2a692a82bbe3335c93 not found: ID does not exist" Dec 05 12:33:13.886319 master-0 kubenswrapper[8731]: I1205 12:33:13.886258 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fa1512be-895a-47e0-abf5-0155c71500e3-var-lock\") pod \"fa1512be-895a-47e0-abf5-0155c71500e3\" (UID: \"fa1512be-895a-47e0-abf5-0155c71500e3\") " Dec 05 12:33:13.886319 master-0 kubenswrapper[8731]: I1205 12:33:13.886326 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa1512be-895a-47e0-abf5-0155c71500e3-kubelet-dir\") pod 
\"fa1512be-895a-47e0-abf5-0155c71500e3\" (UID: \"fa1512be-895a-47e0-abf5-0155c71500e3\") " Dec 05 12:33:13.886734 master-0 kubenswrapper[8731]: I1205 12:33:13.886394 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa1512be-895a-47e0-abf5-0155c71500e3-kube-api-access\") pod \"fa1512be-895a-47e0-abf5-0155c71500e3\" (UID: \"fa1512be-895a-47e0-abf5-0155c71500e3\") " Dec 05 12:33:13.886734 master-0 kubenswrapper[8731]: I1205 12:33:13.886468 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1512be-895a-47e0-abf5-0155c71500e3-var-lock" (OuterVolumeSpecName: "var-lock") pod "fa1512be-895a-47e0-abf5-0155c71500e3" (UID: "fa1512be-895a-47e0-abf5-0155c71500e3"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:33:13.886734 master-0 kubenswrapper[8731]: I1205 12:33:13.886499 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa1512be-895a-47e0-abf5-0155c71500e3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fa1512be-895a-47e0-abf5-0155c71500e3" (UID: "fa1512be-895a-47e0-abf5-0155c71500e3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:33:13.886734 master-0 kubenswrapper[8731]: I1205 12:33:13.886635 8731 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fa1512be-895a-47e0-abf5-0155c71500e3-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:33:13.886734 master-0 kubenswrapper[8731]: I1205 12:33:13.886653 8731 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fa1512be-895a-47e0-abf5-0155c71500e3-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:33:13.894107 master-0 kubenswrapper[8731]: I1205 12:33:13.894044 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa1512be-895a-47e0-abf5-0155c71500e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fa1512be-895a-47e0-abf5-0155c71500e3" (UID: "fa1512be-895a-47e0-abf5-0155c71500e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:33:13.957578 master-0 kubenswrapper[8731]: E1205 12:33:13.957512 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:33:13.987928 master-0 kubenswrapper[8731]: I1205 12:33:13.987859 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fa1512be-895a-47e0-abf5-0155c71500e3-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:33:15.729579 master-0 kubenswrapper[8731]: E1205 12:33:15.729474 8731 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Dec 05 12:33:15.852161 master-0 kubenswrapper[8731]: I1205 12:33:15.852089 8731 generic.go:334] "Generic (PLEG): container finished" podID="cc0396a9a2689b3e8c132c12640cbe83" containerID="296f9752095436403474f93df276faa705635dd48e13c86d863312c7d94b3954" exitCode=0 Dec 05 12:33:16.861306 master-0 kubenswrapper[8731]: I1205 12:33:16.861094 8731 generic.go:334] "Generic (PLEG): container finished" podID="c24e01603234fe8003f8aae8171b0065" containerID="bbac3062d171e964c6a10b8a9a51c923e56d399e294dc2e11516a9c8232774c1" exitCode=0 Dec 05 12:33:16.861306 master-0 kubenswrapper[8731]: I1205 12:33:16.861251 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerDied","Data":"bbac3062d171e964c6a10b8a9a51c923e56d399e294dc2e11516a9c8232774c1"} Dec 05 12:33:16.865010 master-0 kubenswrapper[8731]: I1205 12:33:16.864963 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-546vz_d53a4886-db25-43a1-825a-66a9a9a58590/openshift-controller-manager-operator/0.log" Dec 05 12:33:16.865010 master-0 kubenswrapper[8731]: I1205 12:33:16.865010 8731 generic.go:334] "Generic (PLEG): container finished" podID="d53a4886-db25-43a1-825a-66a9a9a58590" containerID="e559a82c0b834d05036d8b7d7e391db63a90fb95bbf21aef7a0e62a675b47072" exitCode=1 Dec 05 12:33:16.865146 master-0 kubenswrapper[8731]: I1205 12:33:16.865035 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" event={"ID":"d53a4886-db25-43a1-825a-66a9a9a58590","Type":"ContainerDied","Data":"e559a82c0b834d05036d8b7d7e391db63a90fb95bbf21aef7a0e62a675b47072"} Dec 05 12:33:16.865475 master-0 kubenswrapper[8731]: I1205 12:33:16.865442 8731 scope.go:117] "RemoveContainer" containerID="e559a82c0b834d05036d8b7d7e391db63a90fb95bbf21aef7a0e62a675b47072" Dec 05 12:33:17.883423 master-0 kubenswrapper[8731]: I1205 12:33:17.883315 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-546vz_d53a4886-db25-43a1-825a-66a9a9a58590/openshift-controller-manager-operator/0.log" Dec 05 12:33:17.883423 master-0 kubenswrapper[8731]: I1205 12:33:17.883430 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" 
event={"ID":"d53a4886-db25-43a1-825a-66a9a9a58590","Type":"ContainerStarted","Data":"0ca651057443d22827f48087f13a7a3218451ee691e2f2aee7a07437d8b2d6ee"} Dec 05 12:33:18.235743 master-0 kubenswrapper[8731]: E1205 12:33:18.235672 8731 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:33:18.320540 master-0 kubenswrapper[8731]: I1205 12:33:18.320453 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_cc0396a9a2689b3e8c132c12640cbe83/etcdctl/0.log" Dec 05 12:33:18.320777 master-0 kubenswrapper[8731]: I1205 12:33:18.320584 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:33:18.444009 master-0 kubenswrapper[8731]: I1205 12:33:18.443895 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs\") pod \"cc0396a9a2689b3e8c132c12640cbe83\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " Dec 05 12:33:18.444009 master-0 kubenswrapper[8731]: I1205 12:33:18.443990 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir\") pod \"cc0396a9a2689b3e8c132c12640cbe83\" (UID: \"cc0396a9a2689b3e8c132c12640cbe83\") " Dec 05 12:33:18.444394 master-0 kubenswrapper[8731]: I1205 12:33:18.444143 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs" (OuterVolumeSpecName: "certs") pod "cc0396a9a2689b3e8c132c12640cbe83" (UID: "cc0396a9a2689b3e8c132c12640cbe83"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:33:18.444394 master-0 kubenswrapper[8731]: I1205 12:33:18.444287 8731 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-certs\") on node \"master-0\" DevicePath \"\"" Dec 05 12:33:18.444394 master-0 kubenswrapper[8731]: I1205 12:33:18.444297 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir" (OuterVolumeSpecName: "data-dir") pod "cc0396a9a2689b3e8c132c12640cbe83" (UID: "cc0396a9a2689b3e8c132c12640cbe83"). InnerVolumeSpecName "data-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:33:18.476160 master-0 kubenswrapper[8731]: I1205 12:33:18.475947 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:33:18.476160 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:33:18.476160 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:33:18.476160 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:33:18.476160 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:33:18.476160 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:33:18.476160 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:33:18.476160 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:33:18.476160 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:33:18.476160 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:33:18.476160 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:33:18.476160 master-0 kubenswrapper[8731]: I1205 12:33:18.476043 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:33:18.545554 master-0 kubenswrapper[8731]: I1205 12:33:18.545477 8731 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/cc0396a9a2689b3e8c132c12640cbe83-data-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:33:18.891983 master-0 kubenswrapper[8731]: I1205 12:33:18.891918 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_cc0396a9a2689b3e8c132c12640cbe83/etcdctl/0.log" Dec 05 12:33:18.891983 master-0 kubenswrapper[8731]: I1205 12:33:18.891967 8731 generic.go:334] "Generic (PLEG): container finished" podID="cc0396a9a2689b3e8c132c12640cbe83" containerID="bc0f8f75cee3cab2f35245110c53e2d7aee426e9d1f8fd832cda99c84f270715" exitCode=137 Dec 05 12:33:18.893247 master-0 kubenswrapper[8731]: I1205 12:33:18.892028 8731 scope.go:117] "RemoveContainer" containerID="296f9752095436403474f93df276faa705635dd48e13c86d863312c7d94b3954" Dec 05 12:33:18.893247 master-0 kubenswrapper[8731]: I1205 12:33:18.892135 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:33:18.896567 master-0 kubenswrapper[8731]: I1205 12:33:18.896497 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-858598fd98-5xkcl_bb7dd3e9-5a59-4741-970e-aa41c4e078cc/route-controller-manager/0.log" Dec 05 12:33:18.896676 master-0 kubenswrapper[8731]: I1205 12:33:18.896624 8731 generic.go:334] "Generic (PLEG): container finished" podID="bb7dd3e9-5a59-4741-970e-aa41c4e078cc" containerID="d12e1c8bf264de03492186948f2fcb8fa30acf3e5c6ac0dd00637ed1e75cfa31" exitCode=137 Dec 05 12:33:18.896744 master-0 kubenswrapper[8731]: I1205 12:33:18.896694 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" event={"ID":"bb7dd3e9-5a59-4741-970e-aa41c4e078cc","Type":"ContainerDied","Data":"d12e1c8bf264de03492186948f2fcb8fa30acf3e5c6ac0dd00637ed1e75cfa31"} Dec 05 12:33:18.896813 master-0 kubenswrapper[8731]: I1205 12:33:18.896759 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" event={"ID":"bb7dd3e9-5a59-4741-970e-aa41c4e078cc","Type":"ContainerDied","Data":"5c4af08f057f648c818372db0ec480f0be2db13e0c4e8fe00fc9f59a56ca06ec"} Dec 05 12:33:18.896813 master-0 kubenswrapper[8731]: I1205 12:33:18.896803 8731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c4af08f057f648c818372db0ec480f0be2db13e0c4e8fe00fc9f59a56ca06ec" Dec 05 12:33:18.906865 master-0 kubenswrapper[8731]: I1205 12:33:18.906813 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-858598fd98-5xkcl_bb7dd3e9-5a59-4741-970e-aa41c4e078cc/route-controller-manager/0.log" Dec 05 12:33:18.906989 master-0 kubenswrapper[8731]: I1205 12:33:18.906940 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:33:18.910552 master-0 kubenswrapper[8731]: I1205 12:33:18.910503 8731 scope.go:117] "RemoveContainer" containerID="bc0f8f75cee3cab2f35245110c53e2d7aee426e9d1f8fd832cda99c84f270715" Dec 05 12:33:18.937031 master-0 kubenswrapper[8731]: I1205 12:33:18.936915 8731 scope.go:117] "RemoveContainer" containerID="296f9752095436403474f93df276faa705635dd48e13c86d863312c7d94b3954" Dec 05 12:33:18.937648 master-0 kubenswrapper[8731]: E1205 12:33:18.937601 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"296f9752095436403474f93df276faa705635dd48e13c86d863312c7d94b3954\": container with ID starting with 296f9752095436403474f93df276faa705635dd48e13c86d863312c7d94b3954 not found: ID does not exist" containerID="296f9752095436403474f93df276faa705635dd48e13c86d863312c7d94b3954" Dec 05 12:33:18.937648 master-0 kubenswrapper[8731]: I1205 12:33:18.937637 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"296f9752095436403474f93df276faa705635dd48e13c86d863312c7d94b3954"} err="failed to get container status \"296f9752095436403474f93df276faa705635dd48e13c86d863312c7d94b3954\": rpc error: code = NotFound desc = could not find container \"296f9752095436403474f93df276faa705635dd48e13c86d863312c7d94b3954\": container with ID starting with 296f9752095436403474f93df276faa705635dd48e13c86d863312c7d94b3954 not found: ID does not exist" Dec 05 12:33:18.937805 master-0 kubenswrapper[8731]: I1205 12:33:18.937660 8731 scope.go:117] "RemoveContainer" containerID="bc0f8f75cee3cab2f35245110c53e2d7aee426e9d1f8fd832cda99c84f270715" Dec 05 12:33:18.938204 master-0 kubenswrapper[8731]: E1205 12:33:18.938134 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc0f8f75cee3cab2f35245110c53e2d7aee426e9d1f8fd832cda99c84f270715\": container with ID starting with bc0f8f75cee3cab2f35245110c53e2d7aee426e9d1f8fd832cda99c84f270715 not found: ID does not exist" containerID="bc0f8f75cee3cab2f35245110c53e2d7aee426e9d1f8fd832cda99c84f270715" Dec 05 12:33:18.938204 master-0 kubenswrapper[8731]: I1205 12:33:18.938167 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc0f8f75cee3cab2f35245110c53e2d7aee426e9d1f8fd832cda99c84f270715"} err="failed to get container status \"bc0f8f75cee3cab2f35245110c53e2d7aee426e9d1f8fd832cda99c84f270715\": rpc error: code = NotFound desc = could not find container \"bc0f8f75cee3cab2f35245110c53e2d7aee426e9d1f8fd832cda99c84f270715\": container with ID starting with bc0f8f75cee3cab2f35245110c53e2d7aee426e9d1f8fd832cda99c84f270715 not found: ID does not exist" Dec 05 12:33:19.051134 master-0 kubenswrapper[8731]: I1205 12:33:19.050922 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-config\") pod \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\" (UID: \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\") " Dec 05 12:33:19.051134 master-0 kubenswrapper[8731]: I1205 12:33:19.050975 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8p45\" (UniqueName: \"kubernetes.io/projected/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-kube-api-access-t8p45\") pod \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\" (UID: 
\"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\") " Dec 05 12:33:19.051134 master-0 kubenswrapper[8731]: I1205 12:33:19.051030 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-client-ca\") pod \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\" (UID: \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\") " Dec 05 12:33:19.051134 master-0 kubenswrapper[8731]: I1205 12:33:19.051088 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-serving-cert\") pod \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\" (UID: \"bb7dd3e9-5a59-4741-970e-aa41c4e078cc\") " Dec 05 12:33:19.052495 master-0 kubenswrapper[8731]: I1205 12:33:19.052413 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-client-ca" (OuterVolumeSpecName: "client-ca") pod "bb7dd3e9-5a59-4741-970e-aa41c4e078cc" (UID: "bb7dd3e9-5a59-4741-970e-aa41c4e078cc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:33:19.052623 master-0 kubenswrapper[8731]: I1205 12:33:19.052470 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-config" (OuterVolumeSpecName: "config") pod "bb7dd3e9-5a59-4741-970e-aa41c4e078cc" (UID: "bb7dd3e9-5a59-4741-970e-aa41c4e078cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:33:19.059283 master-0 kubenswrapper[8731]: I1205 12:33:19.059171 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-kube-api-access-t8p45" (OuterVolumeSpecName: "kube-api-access-t8p45") pod "bb7dd3e9-5a59-4741-970e-aa41c4e078cc" (UID: "bb7dd3e9-5a59-4741-970e-aa41c4e078cc"). InnerVolumeSpecName "kube-api-access-t8p45". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:33:19.059728 master-0 kubenswrapper[8731]: I1205 12:33:19.059681 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bb7dd3e9-5a59-4741-970e-aa41c4e078cc" (UID: "bb7dd3e9-5a59-4741-970e-aa41c4e078cc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:33:19.153080 master-0 kubenswrapper[8731]: I1205 12:33:19.152974 8731 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:33:19.153080 master-0 kubenswrapper[8731]: I1205 12:33:19.153037 8731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:33:19.153080 master-0 kubenswrapper[8731]: I1205 12:33:19.153056 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8p45\" (UniqueName: \"kubernetes.io/projected/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-kube-api-access-t8p45\") on node \"master-0\" DevicePath \"\"" Dec 05 12:33:19.153080 master-0 kubenswrapper[8731]: I1205 12:33:19.153075 8731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7dd3e9-5a59-4741-970e-aa41c4e078cc-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:33:19.904483 master-0 kubenswrapper[8731]: I1205 12:33:19.904399 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl" Dec 05 12:33:19.943869 master-0 kubenswrapper[8731]: I1205 12:33:19.943774 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc0396a9a2689b3e8c132c12640cbe83" path="/var/lib/kubelet/pods/cc0396a9a2689b3e8c132c12640cbe83/volumes" Dec 05 12:33:19.944515 master-0 kubenswrapper[8731]: I1205 12:33:19.944460 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 05 12:33:21.066244 master-0 kubenswrapper[8731]: I1205 12:33:21.066103 8731 patch_prober.go:28] interesting pod/authentication-operator-6c968fdfdf-xxmfp container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Dec 05 12:33:21.066244 master-0 kubenswrapper[8731]: I1205 12:33:21.066213 8731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" podUID="ba095394-1873-4793-969d-3be979fa0771" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Dec 05 12:33:22.230375 master-0 kubenswrapper[8731]: E1205 12:33:22.230124 8731 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0-master-0.187e51bb260f8e81 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:cc0396a9a2689b3e8c132c12640cbe83,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:32:48.215101057 +0000 UTC m=+66.519085224,LastTimestamp:2025-12-05 12:32:48.215101057 +0000 UTC m=+66.519085224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:33:22.275486 master-0 kubenswrapper[8731]: E1205 12:33:22.275326 8731 projected.go:194] Error preparing data for projected volume kube-api-access-bll66 for pod openshift-controller-manager/controller-manager-675db9579f-4dcg8: failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Dec 05 12:33:22.275704 master-0 kubenswrapper[8731]: E1205 12:33:22.275490 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66 podName:7e562fda-e695-4218-a9cf-4179b8d456db nodeName:}" failed. No retries permitted until 2025-12-05 12:33:22.775448542 +0000 UTC m=+101.079432749 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-bll66" (UniqueName: "kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66") pod "controller-manager-675db9579f-4dcg8" (UID: "7e562fda-e695-4218-a9cf-4179b8d456db") : failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Dec 05 12:33:22.798746 master-0 kubenswrapper[8731]: I1205 12:33:22.798612 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bll66\" (UniqueName: \"kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:33:23.517446 master-0 kubenswrapper[8731]: I1205 12:33:23.517301 8731 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 12:33:23.958289 master-0 kubenswrapper[8731]: E1205 12:33:23.958134 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:33:27.484949 master-0 kubenswrapper[8731]: I1205 12:33:27.484367 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:33:27.484949 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:33:27.484949 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:33:27.484949 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:33:27.484949 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:33:27.484949 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:33:27.484949 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:33:27.484949 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:33:27.484949 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 
12:33:27.484949 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:33:27.484949 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:33:27.484949 master-0 kubenswrapper[8731]: I1205 12:33:27.484525 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:33:28.236659 master-0 kubenswrapper[8731]: E1205 12:33:28.236515 8731 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:33:28.968742 master-0 kubenswrapper[8731]: I1205 12:33:28.968658 8731 generic.go:334] "Generic (PLEG): container finished" podID="7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9" containerID="eae74267bbff7388ad43e0bcb0a8a1a5c6694e5d3fab6387145bf64deb29417d" exitCode=0 Dec 05 12:33:29.869618 master-0 kubenswrapper[8731]: E1205 12:33:29.869493 8731 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Dec 05 12:33:29.976772 master-0 kubenswrapper[8731]: I1205 12:33:29.976667 8731 generic.go:334] "Generic (PLEG): container finished" podID="1871a9d6-6369-4d08-816f-9c6310b61ddf" containerID="4f8a59bfccc80caaa9ccb9172563888264ac2bfba8642d650c783edb02a956b7" exitCode=0 Dec 05 12:33:30.984362 master-0 kubenswrapper[8731]: I1205 12:33:30.984251 8731 generic.go:334] "Generic (PLEG): container finished" podID="c24e01603234fe8003f8aae8171b0065" containerID="49ca67aa7902f9104b46e18f411e1fcfcd3bd696757b09b6ab811180664a0848" exitCode=0 Dec 05 12:33:30.986214 master-0 kubenswrapper[8731]: I1205 12:33:30.986152 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_d627fcf3-2a80-4739-add9-e21ad4efc6eb/installer/0.log" Dec 05 12:33:30.986291 master-0 kubenswrapper[8731]: I1205 12:33:30.986224 8731 generic.go:334] "Generic (PLEG): container finished" podID="d627fcf3-2a80-4739-add9-e21ad4efc6eb" containerID="8654b600b7307ea1bcd3fe84275fb56084c5722cbe5ccf524025cea2bfa3d8cd" exitCode=1 Dec 05 12:33:31.065379 master-0 kubenswrapper[8731]: I1205 12:33:31.065265 8731 patch_prober.go:28] interesting pod/authentication-operator-6c968fdfdf-xxmfp container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Dec 05 12:33:31.065379 master-0 kubenswrapper[8731]: I1205 12:33:31.065347 8731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" podUID="ba095394-1873-4793-969d-3be979fa0771" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Dec 05 12:33:33.517105 master-0 kubenswrapper[8731]: I1205 12:33:33.516532 8731 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" probeResult="failure" output="Get 
\"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 12:33:33.958965 master-0 kubenswrapper[8731]: E1205 12:33:33.958811 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:33:36.493541 master-0 kubenswrapper[8731]: I1205 12:33:36.493450 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:33:36.493541 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:33:36.493541 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:33:36.493541 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:33:36.493541 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:33:36.493541 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:33:36.493541 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:33:36.493541 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:33:36.493541 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:33:36.493541 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:33:36.493541 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:33:36.494567 master-0 kubenswrapper[8731]: I1205 12:33:36.493561 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:33:37.021383 master-0 kubenswrapper[8731]: I1205 12:33:37.021268 8731 generic.go:334] "Generic (PLEG): container finished" podID="4b7f0d8d-a2bf-4550-b6e6-1c56adae827e" containerID="401643c70c405d6156a16a3ab17611e0b06471ba9931da499a2092a2a6caa1f3" exitCode=0 Dec 05 12:33:38.028728 master-0 kubenswrapper[8731]: I1205 12:33:38.028624 8731 generic.go:334] "Generic (PLEG): container finished" podID="f119ffe4-16bd-49eb-916d-b18ba0d79b54" containerID="47752e8beaf9f853c41667ca645eb6d00a5917c9b6cb4206f48e1b5596bdcc79" exitCode=0 Dec 05 12:33:38.030745 master-0 kubenswrapper[8731]: I1205 12:33:38.030665 8731 generic.go:334] "Generic (PLEG): container finished" podID="ba095394-1873-4793-969d-3be979fa0771" containerID="a4430062c5adda1c62354e9a698c163c97a33327be32fd67d0fc627123050dbf" exitCode=0 Dec 05 12:33:38.237262 master-0 kubenswrapper[8731]: E1205 12:33:38.237084 8731 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:33:39.038790 master-0 kubenswrapper[8731]: I1205 12:33:39.038718 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-79767b7ff9-h8qkj_5efad170-c154-42ec-a7c0-b36a98d2bfcc/network-operator/0.log" Dec 05 12:33:39.038790 master-0 kubenswrapper[8731]: 
I1205 12:33:39.038778 8731 generic.go:334] "Generic (PLEG): container finished" podID="5efad170-c154-42ec-a7c0-b36a98d2bfcc" containerID="0caaca757a34c0215195111520c95615b587485cd660ccd63c3b233f466666bb" exitCode=255 Dec 05 12:33:43.960130 master-0 kubenswrapper[8731]: E1205 12:33:43.959554 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:33:43.960130 master-0 kubenswrapper[8731]: E1205 12:33:43.960094 8731 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 12:33:45.500655 master-0 kubenswrapper[8731]: I1205 12:33:45.500485 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:33:45.500655 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:33:45.500655 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:33:45.500655 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:33:45.500655 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:33:45.500655 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:33:45.500655 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:33:45.500655 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:33:45.500655 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:33:45.500655 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:33:45.500655 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:33:45.500655 master-0 kubenswrapper[8731]: I1205 12:33:45.500594 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:33:48.237715 master-0 kubenswrapper[8731]: E1205 12:33:48.237570 8731 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:33:48.237715 master-0 kubenswrapper[8731]: I1205 12:33:48.237737 8731 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 05 12:33:48.607798 master-0 kubenswrapper[8731]: I1205 12:33:48.607589 8731 status_manager.go:851] "Failed to get status for pod" podUID="565d5ef6-b0e7-4f04-9460-61f1d3903d37" pod="openshift-kube-controller-manager/installer-1-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-1-master-0)" Dec 05 12:33:50.098674 master-0 kubenswrapper[8731]: I1205 12:33:50.098611 8731 generic.go:334] "Generic (PLEG): container finished" podID="594aaded-5615-4bed-87ee-6173059a73be" containerID="b351d2f70dc6ca77a15619a3104c4ce47b9bc5e14772befd2755648b695c45dd" exitCode=0 Dec 
05 12:33:51.106795 master-0 kubenswrapper[8731]: I1205 12:33:51.106667 8731 generic.go:334] "Generic (PLEG): container finished" podID="807d9093-aa67-4840-b5be-7f3abcc1beed" containerID="f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0" exitCode=0 Dec 05 12:33:51.677671 master-0 kubenswrapper[8731]: I1205 12:33:51.677544 8731 patch_prober.go:28] interesting pod/etcd-operator-5bf4d88c6f-dxd24 container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" start-of-body= Dec 05 12:33:51.677671 master-0 kubenswrapper[8731]: I1205 12:33:51.677649 8731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" podUID="f119ffe4-16bd-49eb-916d-b18ba0d79b54" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" Dec 05 12:33:53.120585 master-0 kubenswrapper[8731]: I1205 12:33:53.120383 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-xwx26_b8233dad-bd19-4842-a4d5-cfa84f1feb83/approver/0.log" Dec 05 12:33:53.121644 master-0 kubenswrapper[8731]: I1205 12:33:53.120752 8731 generic.go:334] "Generic (PLEG): container finished" podID="b8233dad-bd19-4842-a4d5-cfa84f1feb83" containerID="41718b57d6d2e36d2cb94e43774b239e600e6619dc10d3c14a0345e610d821c2" exitCode=1 Dec 05 12:33:53.948924 master-0 kubenswrapper[8731]: E1205 12:33:53.948829 8731 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:33:53.949374 master-0 kubenswrapper[8731]: E1205 12:33:53.949285 8731 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.016s" Dec 05 12:33:53.949374 master-0 kubenswrapper[8731]: I1205 12:33:53.949312 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:33:53.950282 master-0 kubenswrapper[8731]: I1205 12:33:53.950232 8731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 05 12:33:53.950407 master-0 kubenswrapper[8731]: I1205 12:33:53.950380 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" containerID="cri-o://dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8" gracePeriod=30 Dec 05 12:33:53.955435 master-0 kubenswrapper[8731]: I1205 12:33:53.955320 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 05 12:33:54.132037 master-0 kubenswrapper[8731]: I1205 12:33:54.131965 8731 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8" exitCode=2 Dec 05 12:33:54.509012 master-0 kubenswrapper[8731]: 
I1205 12:33:54.508839 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:33:54.509012 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:33:54.509012 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:33:54.509012 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:33:54.509012 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:33:54.509012 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:33:54.509012 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:33:54.509012 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:33:54.509012 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:33:54.509012 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:33:54.509012 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:33:54.509012 master-0 kubenswrapper[8731]: I1205 12:33:54.508915 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:33:56.234122 master-0 kubenswrapper[8731]: E1205 12:33:56.233880 8731 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{installer-1-master-0.187e51bb28555376 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:installer-1-master-0,UID:565d5ef6-b0e7-4f04-9460-61f1d3903d37,APIVersion:v1,ResourceVersion:6929,FieldPath:spec.containers{installer},},Reason:Created,Message:Created container: installer,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:32:48.253227894 +0000 UTC m=+66.557212061,LastTimestamp:2025-12-05 12:32:48.253227894 +0000 UTC m=+66.557212061,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:33:56.802656 master-0 kubenswrapper[8731]: E1205 12:33:56.802530 8731 projected.go:194] Error preparing data for projected volume kube-api-access-bll66 for pod openshift-controller-manager/controller-manager-675db9579f-4dcg8: failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Dec 05 12:33:56.802656 master-0 kubenswrapper[8731]: E1205 12:33:56.802667 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66 podName:7e562fda-e695-4218-a9cf-4179b8d456db nodeName:}" failed. No retries permitted until 2025-12-05 12:33:57.8026309 +0000 UTC m=+136.106615097 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-bll66" (UniqueName: "kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66") pod "controller-manager-675db9579f-4dcg8" (UID: "7e562fda-e695-4218-a9cf-4179b8d456db") : failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Dec 05 12:33:57.813514 master-0 kubenswrapper[8731]: I1205 12:33:57.813373 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bll66\" (UniqueName: \"kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:33:58.238901 master-0 kubenswrapper[8731]: E1205 12:33:58.238803 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Dec 05 12:34:03.515387 master-0 kubenswrapper[8731]: I1205 12:34:03.514755 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:34:03.515387 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:34:03.515387 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:34:03.515387 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:34:03.515387 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:34:03.515387 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:34:03.515387 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:34:03.515387 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:34:03.515387 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:34:03.515387 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:34:03.515387 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:34:03.517650 master-0 kubenswrapper[8731]: I1205 12:34:03.515433 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:34:04.021614 master-0 kubenswrapper[8731]: E1205 12:34:04.021387 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:33:54Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:33:54Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:33:54Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:33:54Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e3e65409fc2b27ad0aaeb500a39e264663d2980821f099b830b551785ce4ce8b\\\"],\\\"sizeBytes\\\":1631758507},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9014f384de5f9a0b7418d5869ad349abb9588d16bd09ed650a163c045315dbff\\\"],\\\"sizeBytes\\\":1232140918},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b472823604757237c2d16bd6f6221f4cf562aa3b05942c7f602e1e8b2e55a7c6\\\"],\\\"sizeBytes\\\":983705650},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\\\"],\\\"sizeBytes\\\":938303566},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:631a3798b749fecc041a99929eb946618df723e15055e805ff752a1a1273481c\\\"],\\\"sizeBytes\\\":870567329},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f1ca78c423f43f89a0411e40393642f64e4f8df9e5f61c25e31047c4cce170f9\\\"],\\\"sizeBytes\\\":857069957},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b12f830c3316aa4dc061c2d00c74126282b3e2bcccc301eab00d57fff3c4c7c\\\"],\\\"sizeBytes\\\":767284906},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cb3ec61f9a932a9ad13bdeb44bcf9477a8d5f728151d7f19ed3ef7d4b02b3a82\\\"],\\\"sizeBytes\\\":682371258},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:916566bb9d0143352324233d460ad94697719c11c8c9158e3aea8f475941751f\\\"],\\\"sizeBytes\\\":677523572},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5451aa441e5b8d8689c032405d410c8049a849ef2edf77e5b6a5ce2838c6569b\\\"],\\\"sizeBytes\\\":672407260},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9724d2036305cbd729e1f484c5bad89971de977fff8a6723fef1873858dd1123\\\"],\\\"sizeBytes\\\":616108962},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:df606f3b71d4376d1a2108c09f0d3dab455fc30bcb67c60e91590c105e9025bf\\\"],\\\"sizeBytes\\\":583836304},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:79f99fd6cce984287932edf0d009660bb488d663081f3d62ec3b23bc8bfbf6c2\\\"],\\\"sizeBytes\\\":576619763},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eddedae7578d79b5a3f748000ae5c00b9f14a04710f9f9ec7b52fc569be5dfb8\\\"],\\\"sizeBytes\\\":552673986},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa24edce3d740f84c40018e94cdbf2bc7375268d13d57c2d664e43a46ccea3fc\\\"],\\\"sizeBytes\\\":543227406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:188637a52cafee61ec461e92fb0c605e28be325b9ac1f2ac8a37d68e97654718\\\"],\\\"sizeBytes\\\":532719167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cfde59e48cd5dee3721f34d249cb119cc3259fd857965d34f9c7ed83b0c363a1\\\"],\\\"sizeBytes\\\":532402162},{\\\"names\\\":[\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f4d4282cb53325e737ad68abbfcb70687ae04fb50353f4f0ba0ba5703b15009a\\\"],\\\"sizeBytes\\\":512838054},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:8c885ea0b3c5124989f0a9b93eba98eb9fca6bbd0262772d85d90bf713a4d572\\\"],\\\"sizeBytes\\\":512452153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0f43c31aa3359159d4557dad3cfaf812d8ce44db9cb9ae970e06d3479070b660\\\"],\\\"sizeBytes\\\":509437356},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e85850a4ae1a1e3ec2c590a4936d640882b6550124da22031c85b526afbf52df\\\"],\\\"sizeBytes\\\":507687221},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8375671da86aa527ee7e291d86971b0baa823ffc7663b5a983084456e76c0f59\\\"],\\\"sizeBytes\\\":506741476},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:831f30660844091d6154e2674d3a9da6f34271bf8a2c40b56f7416066318742b\\\"],\\\"sizeBytes\\\":505649178},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:86af77350cfe6fd69280157e4162aa0147873d9431c641ae4ad3e881ff768a73\\\"],\\\"sizeBytes\\\":505628211},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a824e468cf8dd61d347e35b2ee5bc2f815666957647098e21a1bb56ff613e5b9\\\"],\\\"sizeBytes\\\":503340749},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8139ed65c0a0a4b0f253b715c11cc52be027efe8a4774da9ccce35c78ef439da\\\"],\\\"sizeBytes\\\":503011144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8eabac819f289e29d75c7ab172d8124554849a47f0b00770928c3eb19a5a31c4\\\"],\\\"sizeBytes\\\":502436444},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10e57ca7611f79710f05777dc6a8f31c7e04eb09da4d8d793a5acfbf0e4692d7\\\"],\\\"sizeBytes\\\":500943492},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f042fa25014f3d37f3ea967d21f361d2a11833ae18f2c750318101b25d2497ce\\\"],\\\"sizeBytes\\\":500848684},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91af633e585621630c40d14f188e37d36b44678d0a59e582d850bf8d593d3a0c\\\"],\\\"sizeBytes\\\":499798563},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d64c13fe7663a0b4ae61d103b1b7598adcf317a01826f296bcb66b1a2de83c96\\\"],\\\"sizeBytes\\\":499705918},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:75d996f6147edb88c09fd1a052099de66638590d7d03a735006244bc9e19f898\\\"],\\\"sizeBytes\\\":499082775},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f952cec1e5332b84bdffa249cd426f39087058d6544ddcec650a414c15a9b68\\\"],\\\"sizeBytes\\\":489528665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c416b201d480bddb5a4960ec42f4740761a1335001cf84ba5ae19ad6857771b1\\\"],\\\"sizeBytes\\\":481559117},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3a77aa4d03b89ea284e3467a268e5989a77a2ef63e685eb1d5c5ea5b3922b7a\\\"],\\\"sizeBytes\\\":478917802},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3c1edf52f70bf9b1d1457e0c4111bc79cdaa1edd659ddbdb9d8176eff8b46956\\\"],\\\"sizeBytes\\\":462727837},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\\\"],\\\"sizeBytes\\\":459552216},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3ce2cbf1032ad0f24f204db73687002fcf302e86ebde394
5801c74351b64576\\\"],\\\"sizeBytes\\\":458169255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7664a2d4cb10e82ed32abbf95799f43fc3d10135d7dd94799730de504a89680a\\\"],\\\"sizeBytes\\\":452589750},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4ecc5bac651ff1942865baee5159582e9602c89b47eeab18400a32abcba8f690\\\"],\\\"sizeBytes\\\":451039520},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2632d7f05d5a992e91038ded81c715898f3fe803420a9b67a0201e9fd8075213\\\"],\\\"sizeBytes\\\":443291941},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f0aa9cd04713acc5c6fea721bd849e1500da8ae945e0b32000887f34d786e0b\\\"],\\\"sizeBytes\\\":442509555},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e438b814f8e16f00b3fc4b69991af80eee79ae111d2a707f34aa64b2ccbb6eb\\\"],\\\"sizeBytes\\\":437737925},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a3d37aa7a22c68afa963ecfb4b43c52cccf152580cd66e4d5382fb69e4037cc\\\"],\\\"sizeBytes\\\":406053031},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9432c13d76bd4ba4eb9197c050cf88c0d701fa2055eeb59257e2e23901f9fdff\\\"],\\\"sizeBytes\\\":401810450},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a70b2a95140d1e90978f36cc9889013ae34bd232662c5424002274385669ed9\\\"],\\\"sizeBytes\\\":390989693}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:34:05.194283 master-0 kubenswrapper[8731]: I1205 12:34:05.194171 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_076dafdf-a5d2-4e2d-9c38-6932910f7327/installer/0.log" Dec 05 12:34:05.194283 master-0 kubenswrapper[8731]: I1205 12:34:05.194281 8731 generic.go:334] "Generic (PLEG): container finished" podID="076dafdf-a5d2-4e2d-9c38-6932910f7327" containerID="f8dc47e77bee6411ef3a450c0123b8279b91a4729700211ae01112ac79fa1d1e" exitCode=1 Dec 05 12:34:08.440659 master-0 kubenswrapper[8731]: E1205 12:34:08.440545 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms" Dec 05 12:34:12.525312 master-0 kubenswrapper[8731]: I1205 12:34:12.524800 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:34:12.525312 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:34:12.525312 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:34:12.525312 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:34:12.525312 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:34:12.525312 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:34:12.525312 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:34:12.525312 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:34:12.525312 master-0 kubenswrapper[8731]: 
[+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:34:12.525312 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:34:12.525312 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:34:12.526760 master-0 kubenswrapper[8731]: I1205 12:34:12.525339 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:34:14.022440 master-0 kubenswrapper[8731]: E1205 12:34:14.022235 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:34:18.842665 master-0 kubenswrapper[8731]: E1205 12:34:18.842508 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Dec 05 12:34:21.531972 master-0 kubenswrapper[8731]: I1205 12:34:21.531883 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:34:21.531972 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:34:21.531972 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:34:21.531972 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:34:21.531972 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:34:21.531972 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:34:21.531972 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:34:21.531972 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:34:21.531972 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:34:21.531972 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:34:21.531972 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:34:21.531972 master-0 kubenswrapper[8731]: I1205 12:34:21.531965 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:34:24.023864 master-0 kubenswrapper[8731]: E1205 12:34:24.023377 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:34:27.958981 master-0 kubenswrapper[8731]: E1205 12:34:27.958869 8731 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:34:27.960585 master-0 
kubenswrapper[8731]: E1205 12:34:27.959117 8731 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.01s" Dec 05 12:34:27.960585 master-0 kubenswrapper[8731]: I1205 12:34:27.959510 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:34:27.960585 master-0 kubenswrapper[8731]: I1205 12:34:27.959541 8731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:34:27.962367 master-0 kubenswrapper[8731]: I1205 12:34:27.962285 8731 scope.go:117] "RemoveContainer" containerID="eae74267bbff7388ad43e0bcb0a8a1a5c6694e5d3fab6387145bf64deb29417d" Dec 05 12:34:27.962641 master-0 kubenswrapper[8731]: I1205 12:34:27.962564 8731 scope.go:117] "RemoveContainer" containerID="0caaca757a34c0215195111520c95615b587485cd660ccd63c3b233f466666bb" Dec 05 12:34:27.967151 master-0 kubenswrapper[8731]: I1205 12:34:27.965683 8731 scope.go:117] "RemoveContainer" containerID="4f8a59bfccc80caaa9ccb9172563888264ac2bfba8642d650c783edb02a956b7" Dec 05 12:34:27.967711 master-0 kubenswrapper[8731]: I1205 12:34:27.967330 8731 scope.go:117] "RemoveContainer" containerID="f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0" Dec 05 12:34:27.967711 master-0 kubenswrapper[8731]: I1205 12:34:27.967586 8731 scope.go:117] "RemoveContainer" containerID="b351d2f70dc6ca77a15619a3104c4ce47b9bc5e14772befd2755648b695c45dd" Dec 05 12:34:27.967969 master-0 kubenswrapper[8731]: I1205 12:34:27.967893 8731 scope.go:117] "RemoveContainer" containerID="47752e8beaf9f853c41667ca645eb6d00a5917c9b6cb4206f48e1b5596bdcc79" Dec 05 12:34:27.968284 master-0 kubenswrapper[8731]: I1205 12:34:27.968248 8731 scope.go:117] "RemoveContainer" containerID="a4430062c5adda1c62354e9a698c163c97a33327be32fd67d0fc627123050dbf" Dec 05 12:34:27.968910 master-0 kubenswrapper[8731]: I1205 12:34:27.968799 8731 scope.go:117] "RemoveContainer" containerID="41718b57d6d2e36d2cb94e43774b239e600e6619dc10d3c14a0345e610d821c2" Dec 05 12:34:27.981957 master-0 kubenswrapper[8731]: I1205 12:34:27.981783 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 05 12:34:28.329656 master-0 kubenswrapper[8731]: I1205 12:34:28.329119 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-xwx26_b8233dad-bd19-4842-a4d5-cfa84f1feb83/approver/0.log" Dec 05 12:34:28.533845 master-0 kubenswrapper[8731]: I1205 12:34:28.533458 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_d627fcf3-2a80-4739-add9-e21ad4efc6eb/installer/0.log" Dec 05 12:34:28.534007 master-0 kubenswrapper[8731]: I1205 12:34:28.533889 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 05 12:34:28.545575 master-0 kubenswrapper[8731]: I1205 12:34:28.545544 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_076dafdf-a5d2-4e2d-9c38-6932910f7327/installer/0.log" Dec 05 12:34:28.545685 master-0 kubenswrapper[8731]: I1205 12:34:28.545613 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 05 12:34:28.709570 master-0 kubenswrapper[8731]: I1205 12:34:28.709475 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/076dafdf-a5d2-4e2d-9c38-6932910f7327-kubelet-dir\") pod \"076dafdf-a5d2-4e2d-9c38-6932910f7327\" (UID: \"076dafdf-a5d2-4e2d-9c38-6932910f7327\") " Dec 05 12:34:28.709947 master-0 kubenswrapper[8731]: I1205 12:34:28.709585 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/076dafdf-a5d2-4e2d-9c38-6932910f7327-var-lock\") pod \"076dafdf-a5d2-4e2d-9c38-6932910f7327\" (UID: \"076dafdf-a5d2-4e2d-9c38-6932910f7327\") " Dec 05 12:34:28.709947 master-0 kubenswrapper[8731]: I1205 12:34:28.709664 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/076dafdf-a5d2-4e2d-9c38-6932910f7327-kube-api-access\") pod \"076dafdf-a5d2-4e2d-9c38-6932910f7327\" (UID: \"076dafdf-a5d2-4e2d-9c38-6932910f7327\") " Dec 05 12:34:28.709947 master-0 kubenswrapper[8731]: I1205 12:34:28.709714 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d627fcf3-2a80-4739-add9-e21ad4efc6eb-kube-api-access\") pod \"d627fcf3-2a80-4739-add9-e21ad4efc6eb\" (UID: \"d627fcf3-2a80-4739-add9-e21ad4efc6eb\") " Dec 05 12:34:28.709947 master-0 kubenswrapper[8731]: I1205 12:34:28.709739 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/076dafdf-a5d2-4e2d-9c38-6932910f7327-var-lock" (OuterVolumeSpecName: "var-lock") pod "076dafdf-a5d2-4e2d-9c38-6932910f7327" (UID: "076dafdf-a5d2-4e2d-9c38-6932910f7327"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:34:28.709947 master-0 kubenswrapper[8731]: I1205 12:34:28.709744 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/076dafdf-a5d2-4e2d-9c38-6932910f7327-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "076dafdf-a5d2-4e2d-9c38-6932910f7327" (UID: "076dafdf-a5d2-4e2d-9c38-6932910f7327"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:34:28.709947 master-0 kubenswrapper[8731]: I1205 12:34:28.709772 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d627fcf3-2a80-4739-add9-e21ad4efc6eb-kubelet-dir\") pod \"d627fcf3-2a80-4739-add9-e21ad4efc6eb\" (UID: \"d627fcf3-2a80-4739-add9-e21ad4efc6eb\") " Dec 05 12:34:28.709947 master-0 kubenswrapper[8731]: I1205 12:34:28.709912 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d627fcf3-2a80-4739-add9-e21ad4efc6eb-var-lock\") pod \"d627fcf3-2a80-4739-add9-e21ad4efc6eb\" (UID: \"d627fcf3-2a80-4739-add9-e21ad4efc6eb\") " Dec 05 12:34:28.709947 master-0 kubenswrapper[8731]: I1205 12:34:28.709820 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d627fcf3-2a80-4739-add9-e21ad4efc6eb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d627fcf3-2a80-4739-add9-e21ad4efc6eb" (UID: "d627fcf3-2a80-4739-add9-e21ad4efc6eb"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:34:28.710379 master-0 kubenswrapper[8731]: I1205 12:34:28.710076 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d627fcf3-2a80-4739-add9-e21ad4efc6eb-var-lock" (OuterVolumeSpecName: "var-lock") pod "d627fcf3-2a80-4739-add9-e21ad4efc6eb" (UID: "d627fcf3-2a80-4739-add9-e21ad4efc6eb"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:34:28.710379 master-0 kubenswrapper[8731]: I1205 12:34:28.710362 8731 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/076dafdf-a5d2-4e2d-9c38-6932910f7327-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:34:28.710464 master-0 kubenswrapper[8731]: I1205 12:34:28.710397 8731 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/076dafdf-a5d2-4e2d-9c38-6932910f7327-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:34:28.710464 master-0 kubenswrapper[8731]: I1205 12:34:28.710428 8731 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d627fcf3-2a80-4739-add9-e21ad4efc6eb-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:34:28.710464 master-0 kubenswrapper[8731]: I1205 12:34:28.710449 8731 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d627fcf3-2a80-4739-add9-e21ad4efc6eb-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:34:28.717063 master-0 kubenswrapper[8731]: I1205 12:34:28.717002 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d627fcf3-2a80-4739-add9-e21ad4efc6eb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d627fcf3-2a80-4739-add9-e21ad4efc6eb" (UID: "d627fcf3-2a80-4739-add9-e21ad4efc6eb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:34:28.717382 master-0 kubenswrapper[8731]: I1205 12:34:28.717340 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/076dafdf-a5d2-4e2d-9c38-6932910f7327-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "076dafdf-a5d2-4e2d-9c38-6932910f7327" (UID: "076dafdf-a5d2-4e2d-9c38-6932910f7327"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:34:28.812949 master-0 kubenswrapper[8731]: I1205 12:34:28.812534 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/076dafdf-a5d2-4e2d-9c38-6932910f7327-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:34:28.812949 master-0 kubenswrapper[8731]: I1205 12:34:28.812595 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d627fcf3-2a80-4739-add9-e21ad4efc6eb-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:34:29.349167 master-0 kubenswrapper[8731]: I1205 12:34:29.349096 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_076dafdf-a5d2-4e2d-9c38-6932910f7327/installer/0.log" Dec 05 12:34:29.351099 master-0 kubenswrapper[8731]: I1205 12:34:29.349357 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 05 12:34:29.358642 master-0 kubenswrapper[8731]: I1205 12:34:29.358600 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-79767b7ff9-h8qkj_5efad170-c154-42ec-a7c0-b36a98d2bfcc/network-operator/0.log" Dec 05 12:34:29.361575 master-0 kubenswrapper[8731]: I1205 12:34:29.361512 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_d627fcf3-2a80-4739-add9-e21ad4efc6eb/installer/0.log" Dec 05 12:34:29.361960 master-0 kubenswrapper[8731]: I1205 12:34:29.361725 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 05 12:34:29.644695 master-0 kubenswrapper[8731]: E1205 12:34:29.644407 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Dec 05 12:34:30.237847 master-0 kubenswrapper[8731]: E1205 12:34:30.237656 8731 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{installer-1-master-0.187e51bb2acc06ae openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:installer-1-master-0,UID:565d5ef6-b0e7-4f04-9460-61f1d3903d37,APIVersion:v1,ResourceVersion:6929,FieldPath:spec.containers{installer},},Reason:Started,Message:Started container installer,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:32:48.294561454 +0000 UTC m=+66.598545621,LastTimestamp:2025-12-05 12:32:48.294561454 +0000 UTC m=+66.598545621,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:34:30.539293 master-0 kubenswrapper[8731]: I1205 12:34:30.539022 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:34:30.539293 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:34:30.539293 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:34:30.539293 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:34:30.539293 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:34:30.539293 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:34:30.539293 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:34:30.539293 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:34:30.539293 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:34:30.539293 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:34:30.539293 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:34:30.539293 master-0 kubenswrapper[8731]: I1205 12:34:30.539133 8731 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:34:31.817927 master-0 kubenswrapper[8731]: E1205 12:34:31.817822 8731 projected.go:194] Error preparing data for projected volume kube-api-access-bll66 for pod openshift-controller-manager/controller-manager-675db9579f-4dcg8: failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Dec 05 12:34:31.817927 master-0 kubenswrapper[8731]: E1205 12:34:31.817940 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66 podName:7e562fda-e695-4218-a9cf-4179b8d456db nodeName:}" failed. No retries permitted until 2025-12-05 12:34:33.817912071 +0000 UTC m=+172.121896238 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-bll66" (UniqueName: "kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66") pod "controller-manager-675db9579f-4dcg8" (UID: "7e562fda-e695-4218-a9cf-4179b8d456db") : failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Dec 05 12:34:33.877175 master-0 kubenswrapper[8731]: I1205 12:34:33.876665 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bll66\" (UniqueName: \"kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:34:34.024570 master-0 kubenswrapper[8731]: E1205 12:34:34.024448 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:34:39.546867 master-0 kubenswrapper[8731]: I1205 12:34:39.546711 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:34:39.546867 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:34:39.546867 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:34:39.546867 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:34:39.546867 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:34:39.546867 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:34:39.546867 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:34:39.546867 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:34:39.546867 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:34:39.546867 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:34:39.546867 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:34:39.548582 master-0 kubenswrapper[8731]: I1205 12:34:39.546890 8731 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:34:41.247407 master-0 kubenswrapper[8731]: E1205 12:34:41.246515 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 05 12:34:44.026054 master-0 kubenswrapper[8731]: E1205 12:34:44.025766 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:34:44.026054 master-0 kubenswrapper[8731]: E1205 12:34:44.025907 8731 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 12:34:48.484684 master-0 kubenswrapper[8731]: I1205 12:34:48.484227 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-546vz_d53a4886-db25-43a1-825a-66a9a9a58590/openshift-controller-manager-operator/1.log" Dec 05 12:34:48.486689 master-0 kubenswrapper[8731]: I1205 12:34:48.486632 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-546vz_d53a4886-db25-43a1-825a-66a9a9a58590/openshift-controller-manager-operator/0.log" Dec 05 12:34:48.486755 master-0 kubenswrapper[8731]: I1205 12:34:48.486719 8731 generic.go:334] "Generic (PLEG): container finished" podID="d53a4886-db25-43a1-825a-66a9a9a58590" containerID="0ca651057443d22827f48087f13a7a3218451ee691e2f2aee7a07437d8b2d6ee" exitCode=255 Dec 05 12:34:48.555946 master-0 kubenswrapper[8731]: I1205 12:34:48.555855 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:34:48.555946 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:34:48.555946 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:34:48.555946 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:34:48.555946 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:34:48.555946 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:34:48.555946 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:34:48.555946 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:34:48.555946 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:34:48.555946 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:34:48.555946 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:34:48.556699 master-0 kubenswrapper[8731]: I1205 12:34:48.555959 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:34:48.612806 master-0 kubenswrapper[8731]: I1205 12:34:48.612687 8731 status_manager.go:851] "Failed to get status for pod" podUID="565d5ef6-b0e7-4f04-9460-61f1d3903d37" pod="openshift-kube-controller-manager/installer-1-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-1-master-0)" Dec 05 12:34:49.493396 master-0 kubenswrapper[8731]: I1205 12:34:49.493310 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_565d5ef6-b0e7-4f04-9460-61f1d3903d37/installer/0.log" Dec 05 12:34:49.493396 master-0 kubenswrapper[8731]: I1205 12:34:49.493373 8731 generic.go:334] "Generic (PLEG): container finished" podID="565d5ef6-b0e7-4f04-9460-61f1d3903d37" containerID="1cb443e02b64a65178050b34e99e50f308c86d2ef5b4e7e730bfa0faf58cc53e" exitCode=1 Dec 05 12:34:51.020967 master-0 kubenswrapper[8731]: E1205 12:34:51.020841 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-bll66], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" podUID="7e562fda-e695-4218-a9cf-4179b8d456db" Dec 05 12:34:51.505670 master-0 kubenswrapper[8731]: I1205 12:34:51.505529 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:34:54.449101 master-0 kubenswrapper[8731]: E1205 12:34:54.448952 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Dec 05 12:34:55.534831 master-0 kubenswrapper[8731]: I1205 12:34:55.534703 8731 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5" exitCode=1 Dec 05 12:34:56.543905 master-0 kubenswrapper[8731]: I1205 12:34:56.543840 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-7cbd59c7f8-d9g7k_153fec1f-a10b-4c6c-a997-60fa80c13a86/manager/0.log" Dec 05 12:34:56.543905 master-0 kubenswrapper[8731]: I1205 12:34:56.543901 8731 generic.go:334] "Generic (PLEG): container finished" podID="153fec1f-a10b-4c6c-a997-60fa80c13a86" containerID="b02b74337c561023bb77d95397661e10a1ee5fc12d28b2fd7ee9556bbaba81e5" exitCode=1 Dec 05 12:34:56.546561 master-0 kubenswrapper[8731]: I1205 12:34:56.546513 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/0.log" Dec 05 12:34:56.546646 master-0 kubenswrapper[8731]: I1205 12:34:56.546580 8731 generic.go:334] "Generic (PLEG): container finished" podID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" containerID="5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57" exitCode=1 Dec 05 12:34:57.561404 master-0 kubenswrapper[8731]: I1205 12:34:57.561318 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:34:57.561404 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:34:57.561404 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:34:57.561404 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:34:57.561404 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:34:57.561404 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:34:57.561404 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:34:57.561404 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:34:57.561404 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:34:57.561404 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:34:57.561404 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:34:57.562458 master-0 kubenswrapper[8731]: I1205 12:34:57.561419 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:34:58.736289 master-0 kubenswrapper[8731]: I1205 12:34:58.736150 8731 patch_prober.go:28] interesting pod/catalogd-controller-manager-7cc89f4c4c-n28z2 container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.30:8081/readyz\": dial tcp 10.128.0.30:8081: connect: connection refused" start-of-body= Dec 05 12:34:58.736854 master-0 kubenswrapper[8731]: I1205 12:34:58.736282 8731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" podUID="3b741029-0eb5-409b-b7f1-95e8385dc400" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.30:8081/readyz\": dial tcp 10.128.0.30:8081: connect: connection refused" Dec 05 12:34:58.736854 master-0 kubenswrapper[8731]: I1205 12:34:58.736366 8731 patch_prober.go:28] interesting pod/catalogd-controller-manager-7cc89f4c4c-n28z2 container/manager namespace/openshift-catalogd: Liveness probe status=failure output="Get \"http://10.128.0.30:8081/healthz\": dial tcp 10.128.0.30:8081: connect: connection refused" start-of-body= Dec 05 12:34:58.736854 master-0 kubenswrapper[8731]: I1205 12:34:58.736477 8731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" podUID="3b741029-0eb5-409b-b7f1-95e8385dc400" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.30:8081/healthz\": dial tcp 10.128.0.30:8081: connect: connection refused" Dec 05 12:34:59.566245 master-0 kubenswrapper[8731]: I1205 12:34:59.566120 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-n28z2_3b741029-0eb5-409b-b7f1-95e8385dc400/manager/0.log" Dec 05 12:34:59.567105 master-0 kubenswrapper[8731]: I1205 12:34:59.567024 8731 generic.go:334] "Generic (PLEG): container finished" podID="3b741029-0eb5-409b-b7f1-95e8385dc400" containerID="73f6bfa12151c71020cd1cc8c48ebdf6c4c24dbf1a05b4873ce05f073bdcce94" exitCode=1 Dec 05 12:35:01.986040 master-0 kubenswrapper[8731]: E1205 12:35:01.985895 8731 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not 
complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:35:01.986851 master-0 kubenswrapper[8731]: E1205 12:35:01.986146 8731 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.027s" Dec 05 12:35:01.986851 master-0 kubenswrapper[8731]: I1205 12:35:01.986220 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" event={"ID":"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9","Type":"ContainerDied","Data":"eae74267bbff7388ad43e0bcb0a8a1a5c6694e5d3fab6387145bf64deb29417d"} Dec 05 12:35:01.996880 master-0 kubenswrapper[8731]: I1205 12:35:01.996794 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 05 12:35:02.592242 master-0 kubenswrapper[8731]: I1205 12:35:02.591673 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-7ff994598c-lgc7z_58187662-b502-4d90-95ce-2aa91a81d256/cluster-monitoring-operator/0.log" Dec 05 12:35:02.592242 master-0 kubenswrapper[8731]: I1205 12:35:02.592080 8731 generic.go:334] "Generic (PLEG): container finished" podID="58187662-b502-4d90-95ce-2aa91a81d256" containerID="d0c256d51be6b67ce11d61e05ebacd6a747bab028d852541d977f5d77734ba1a" exitCode=1 Dec 05 12:35:02.594998 master-0 kubenswrapper[8731]: I1205 12:35:02.594945 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/0.log" Dec 05 12:35:02.594998 master-0 kubenswrapper[8731]: I1205 12:35:02.594980 8731 generic.go:334] "Generic (PLEG): container finished" podID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" containerID="25a1113bac1425c0d6b5254d5067b012732c090d8f467edda97019523a2d47be" exitCode=1 Dec 05 12:35:03.605343 master-0 kubenswrapper[8731]: I1205 12:35:03.605069 8731 generic.go:334] "Generic (PLEG): container finished" podID="ce3d73c1-f4bd-4c91-936a-086dfa5e3460" containerID="e661aee8169481bd45ddc453eab7e9b725569fcef2029fd7e4e16d66fbcedf39" exitCode=0 Dec 05 12:35:04.229641 master-0 kubenswrapper[8731]: E1205 12:35:04.229412 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:34:54Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:34:54Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:34:54Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:34:54Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e3e65409fc2b27ad0aaeb500a39e264663d2980821f099b830b551785ce4ce8b\\\"],\\\"sizeBytes\\\":1631758507},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9014f384de5f9a0b7418d5869ad349abb9588d16bd09ed650a163c045315dbff\\\"],\\\"sizeBytes\\\":1232140918},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b472823604757237c2d16bd6f6221f4cf562aa3b05942c7f602e1e8b2e55a7c6\\\"],\\\"sizeBytes\\\":983705650},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\\\"],\\\"sizeBytes\\\":938303566},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:631a3798b749fecc041a99929eb946618df723e15055e805ff752a1a1273481c\\\"],\\\"sizeBytes\\\":870567329},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f1ca78c423f43f89a0411e40393642f64e4f8df9e5f61c25e31047c4cce170f9\\\"],\\\"sizeBytes\\\":857069957},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b12f830c3316aa4dc061c2d00c74126282b3e2bcccc301eab00d57fff3c4c7c\\\"],\\\"sizeBytes\\\":767284906},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cb3ec61f9a932a9ad13bdeb44bcf9477a8d5f728151d7f19ed3ef7d4b02b3a82\\\"],\\\"sizeBytes\\\":682371258},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:916566bb9d0143352324233d460ad94697719c11c8c9158e3aea8f475941751f\\\"],\\\"sizeBytes\\\":677523572},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5451aa441e5b8d8689c032405d410c8049a849ef2edf77e5b6a5ce2838c6569b\\\"],\\\"sizeBytes\\\":672407260},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9724d2036305cbd729e1f484c5bad89971de977fff8a6723fef1873858dd1123\\\"],\\\"sizeBytes\\\":616108962},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:df606f3b71d4376d1a2108c09f0d3dab455fc30bcb67c60e91590c105e9025bf\\\"],\\\"sizeBytes\\\":583836304},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:79f99fd6cce984287932edf0d009660bb488d663081f3d62ec3b23bc8bfbf6c2\\\"],\\\"sizeBytes\\\":576619763},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eddedae7578d79b5a3f748000ae5c00b9f14a04710f9f9ec7b52fc569be5dfb8\\\"],\\\"sizeBytes\\\":552673986},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa24edce3d740f84c40018e94cdbf2bc7375268d13d57c2d664e43a46ccea3fc\\\"],\\\"sizeBytes\\\":543227406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:188637a52cafee61ec461e92fb0c605e28be325b9ac1f2ac8a37d68e97654718\\\"],\\\"sizeBytes\\\":532719167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cfde59e48cd5dee3721f34d249cb119cc3259fd857965d34f9c7ed83b0c363a1\\\"],\\\"sizeBytes\\\":532402162},{\\\"names\\\":[\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f4d4282cb53325e737ad68abbfcb70687ae04fb50353f4f0ba0ba5703b15009a\\\"],\\\"sizeBytes\\\":512838054},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:8c885ea0b3c5124989f0a9b93eba98eb9fca6bbd0262772d85d90bf713a4d572\\\"],\\\"sizeBytes\\\":512452153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0f43c31aa3359159d4557dad3cfaf812d8ce44db9cb9ae970e06d3479070b660\\\"],\\\"sizeBytes\\\":509437356},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e85850a4ae1a1e3ec2c590a4936d640882b6550124da22031c85b526afbf52df\\\"],\\\"sizeBytes\\\":507687221},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8375671da86aa527ee7e291d86971b0baa823ffc7663b5a983084456e76c0f59\\\"],\\\"sizeBytes\\\":506741476},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:831f30660844091d6154e2674d3a9da6f34271bf8a2c40b56f7416066318742b\\\"],\\\"sizeBytes\\\":505649178},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:86af77350cfe6fd69280157e4162aa0147873d9431c641ae4ad3e881ff768a73\\\"],\\\"sizeBytes\\\":505628211},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a824e468cf8dd61d347e35b2ee5bc2f815666957647098e21a1bb56ff613e5b9\\\"],\\\"sizeBytes\\\":503340749},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8139ed65c0a0a4b0f253b715c11cc52be027efe8a4774da9ccce35c78ef439da\\\"],\\\"sizeBytes\\\":503011144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8eabac819f289e29d75c7ab172d8124554849a47f0b00770928c3eb19a5a31c4\\\"],\\\"sizeBytes\\\":502436444},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10e57ca7611f79710f05777dc6a8f31c7e04eb09da4d8d793a5acfbf0e4692d7\\\"],\\\"sizeBytes\\\":500943492},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f042fa25014f3d37f3ea967d21f361d2a11833ae18f2c750318101b25d2497ce\\\"],\\\"sizeBytes\\\":500848684},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91af633e585621630c40d14f188e37d36b44678d0a59e582d850bf8d593d3a0c\\\"],\\\"sizeBytes\\\":499798563},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d64c13fe7663a0b4ae61d103b1b7598adcf317a01826f296bcb66b1a2de83c96\\\"],\\\"sizeBytes\\\":499705918},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:75d996f6147edb88c09fd1a052099de66638590d7d03a735006244bc9e19f898\\\"],\\\"sizeBytes\\\":499082775},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f952cec1e5332b84bdffa249cd426f39087058d6544ddcec650a414c15a9b68\\\"],\\\"sizeBytes\\\":489528665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c416b201d480bddb5a4960ec42f4740761a1335001cf84ba5ae19ad6857771b1\\\"],\\\"sizeBytes\\\":481559117},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3a77aa4d03b89ea284e3467a268e5989a77a2ef63e685eb1d5c5ea5b3922b7a\\\"],\\\"sizeBytes\\\":478917802},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3c1edf52f70bf9b1d1457e0c4111bc79cdaa1edd659ddbdb9d8176eff8b46956\\\"],\\\"sizeBytes\\\":462727837},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\\\"],\\\"sizeBytes\\\":459552216},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3ce2cbf1032ad0f24f204db73687002fcf302e86ebde394
5801c74351b64576\\\"],\\\"sizeBytes\\\":458169255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7664a2d4cb10e82ed32abbf95799f43fc3d10135d7dd94799730de504a89680a\\\"],\\\"sizeBytes\\\":452589750},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4ecc5bac651ff1942865baee5159582e9602c89b47eeab18400a32abcba8f690\\\"],\\\"sizeBytes\\\":451039520},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2632d7f05d5a992e91038ded81c715898f3fe803420a9b67a0201e9fd8075213\\\"],\\\"sizeBytes\\\":443291941},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f0aa9cd04713acc5c6fea721bd849e1500da8ae945e0b32000887f34d786e0b\\\"],\\\"sizeBytes\\\":442509555},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e438b814f8e16f00b3fc4b69991af80eee79ae111d2a707f34aa64b2ccbb6eb\\\"],\\\"sizeBytes\\\":437737925},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a3d37aa7a22c68afa963ecfb4b43c52cccf152580cd66e4d5382fb69e4037cc\\\"],\\\"sizeBytes\\\":406053031},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9432c13d76bd4ba4eb9197c050cf88c0d701fa2055eeb59257e2e23901f9fdff\\\"],\\\"sizeBytes\\\":401810450},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a70b2a95140d1e90978f36cc9889013ae34bd232662c5424002274385669ed9\\\"],\\\"sizeBytes\\\":390989693}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:35:04.241358 master-0 kubenswrapper[8731]: E1205 12:35:04.241133 8731 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{route-controller-manager-858598fd98-5xkcl.187e51bb3decfde3 openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-858598fd98-5xkcl,UID:bb7dd3e9-5a59-4741-970e-aa41c4e078cc,APIVersion:v1,ResourceVersion:6561,FieldPath:spec.containers{route-controller-manager},},Reason:Killing,Message:Stopping container route-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:32:48.615488995 +0000 UTC m=+66.919473162,LastTimestamp:2025-12-05 12:32:48.615488995 +0000 UTC m=+66.919473162,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:35:04.341120 master-0 kubenswrapper[8731]: I1205 12:35:04.341030 8731 patch_prober.go:28] interesting pod/operator-controller-controller-manager-7cbd59c7f8-d9g7k container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.33:8081/readyz\": dial tcp 10.128.0.33:8081: connect: connection refused" start-of-body= Dec 05 12:35:04.341120 master-0 kubenswrapper[8731]: I1205 12:35:04.341116 8731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" podUID="153fec1f-a10b-4c6c-a997-60fa80c13a86" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.33:8081/readyz\": dial tcp 10.128.0.33:8081: connect: connection refused" Dec 05 12:35:05.620217 master-0 
kubenswrapper[8731]: I1205 12:35:05.620102 8731 generic.go:334] "Generic (PLEG): container finished" podID="f3792522-fec6-4022-90ac-0b8467fcd625" containerID="eb12d89ac382a5bb5bdc3b8dbfd70aaf80443c6890bbd6d374803fc81c9ff457" exitCode=0 Dec 05 12:35:06.567805 master-0 kubenswrapper[8731]: I1205 12:35:06.567740 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:35:06.567805 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:35:06.567805 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:35:06.567805 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:35:06.567805 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:35:06.567805 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:35:06.567805 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:35:06.567805 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:35:06.567805 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:35:06.567805 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:35:06.567805 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:35:06.568556 master-0 kubenswrapper[8731]: I1205 12:35:06.567832 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:35:07.881119 master-0 kubenswrapper[8731]: E1205 12:35:07.881038 8731 projected.go:194] Error preparing data for projected volume kube-api-access-bll66 for pod openshift-controller-manager/controller-manager-675db9579f-4dcg8: failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Dec 05 12:35:07.881817 master-0 kubenswrapper[8731]: E1205 12:35:07.881227 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66 podName:7e562fda-e695-4218-a9cf-4179b8d456db nodeName:}" failed. No retries permitted until 2025-12-05 12:35:11.881175604 +0000 UTC m=+210.185159771 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-bll66" (UniqueName: "kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66") pod "controller-manager-675db9579f-4dcg8" (UID: "7e562fda-e695-4218-a9cf-4179b8d456db") : failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Dec 05 12:35:08.736739 master-0 kubenswrapper[8731]: I1205 12:35:08.736611 8731 patch_prober.go:28] interesting pod/catalogd-controller-manager-7cc89f4c4c-n28z2 container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.30:8081/readyz\": dial tcp 10.128.0.30:8081: connect: connection refused" start-of-body= Dec 05 12:35:08.737363 master-0 kubenswrapper[8731]: I1205 12:35:08.736758 8731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" podUID="3b741029-0eb5-409b-b7f1-95e8385dc400" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.30:8081/readyz\": dial tcp 10.128.0.30:8081: connect: connection refused" Dec 05 12:35:10.849834 master-0 kubenswrapper[8731]: E1205 12:35:10.849726 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 05 12:35:11.945525 master-0 kubenswrapper[8731]: I1205 12:35:11.945389 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bll66\" (UniqueName: \"kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:35:14.230988 master-0 kubenswrapper[8731]: E1205 12:35:14.230862 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:35:14.342004 master-0 kubenswrapper[8731]: I1205 12:35:14.341894 8731 patch_prober.go:28] interesting pod/operator-controller-controller-manager-7cbd59c7f8-d9g7k container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.33:8081/readyz\": dial tcp 10.128.0.33:8081: connect: connection refused" start-of-body= Dec 05 12:35:14.342004 master-0 kubenswrapper[8731]: I1205 12:35:14.341926 8731 patch_prober.go:28] interesting pod/operator-controller-controller-manager-7cbd59c7f8-d9g7k container/manager namespace/openshift-operator-controller: Liveness probe status=failure output="Get \"http://10.128.0.33:8081/healthz\": dial tcp 10.128.0.33:8081: connect: connection refused" start-of-body= Dec 05 12:35:14.342385 master-0 kubenswrapper[8731]: I1205 12:35:14.342039 8731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" podUID="153fec1f-a10b-4c6c-a997-60fa80c13a86" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.33:8081/readyz\": dial tcp 10.128.0.33:8081: connect: connection refused" Dec 05 12:35:14.342385 master-0 kubenswrapper[8731]: I1205 12:35:14.342023 8731 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" podUID="153fec1f-a10b-4c6c-a997-60fa80c13a86" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.33:8081/healthz\": dial tcp 10.128.0.33:8081: connect: connection refused" Dec 05 12:35:15.576157 master-0 kubenswrapper[8731]: I1205 12:35:15.576011 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:35:15.576157 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:35:15.576157 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:35:15.576157 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:35:15.576157 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:35:15.576157 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:35:15.576157 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:35:15.576157 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:35:15.576157 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:35:15.576157 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:35:15.576157 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:35:15.578072 master-0 kubenswrapper[8731]: I1205 12:35:15.576157 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:35:17.698228 master-0 kubenswrapper[8731]: I1205 12:35:17.698136 8731 generic.go:334] "Generic (PLEG): container finished" podID="a757f807-e1bf-4f1e-9787-6b4acc8d09cf" containerID="c63a8034e23c88dd09173f57e05eee7c9bc26e35890cfdd9f1fdc8ef0e16d843" exitCode=0 Dec 05 12:35:18.736054 master-0 kubenswrapper[8731]: I1205 12:35:18.735983 8731 patch_prober.go:28] interesting pod/catalogd-controller-manager-7cc89f4c4c-n28z2 container/manager namespace/openshift-catalogd: Liveness probe status=failure output="Get \"http://10.128.0.30:8081/healthz\": dial tcp 10.128.0.30:8081: connect: connection refused" start-of-body= Dec 05 12:35:18.736610 master-0 kubenswrapper[8731]: I1205 12:35:18.736085 8731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" podUID="3b741029-0eb5-409b-b7f1-95e8385dc400" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.30:8081/healthz\": dial tcp 10.128.0.30:8081: connect: connection refused" Dec 05 12:35:18.736610 master-0 kubenswrapper[8731]: I1205 12:35:18.736137 8731 patch_prober.go:28] interesting pod/catalogd-controller-manager-7cc89f4c4c-n28z2 container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.30:8081/readyz\": dial tcp 10.128.0.30:8081: connect: connection refused" start-of-body= Dec 05 12:35:18.736610 master-0 kubenswrapper[8731]: I1205 12:35:18.736250 8731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" 
podUID="3b741029-0eb5-409b-b7f1-95e8385dc400" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.30:8081/readyz\": dial tcp 10.128.0.30:8081: connect: connection refused" Dec 05 12:35:24.231827 master-0 kubenswrapper[8731]: E1205 12:35:24.231398 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:35:24.341823 master-0 kubenswrapper[8731]: I1205 12:35:24.341651 8731 patch_prober.go:28] interesting pod/operator-controller-controller-manager-7cbd59c7f8-d9g7k container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.33:8081/readyz\": dial tcp 10.128.0.33:8081: connect: connection refused" start-of-body= Dec 05 12:35:24.341823 master-0 kubenswrapper[8731]: I1205 12:35:24.341800 8731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" podUID="153fec1f-a10b-4c6c-a997-60fa80c13a86" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.33:8081/readyz\": dial tcp 10.128.0.33:8081: connect: connection refused" Dec 05 12:35:24.584426 master-0 kubenswrapper[8731]: I1205 12:35:24.584144 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:35:24.584426 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:35:24.584426 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:35:24.584426 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:35:24.584426 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:35:24.584426 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:35:24.584426 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:35:24.584426 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:35:24.584426 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:35:24.584426 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:35:24.584426 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:35:24.584426 master-0 kubenswrapper[8731]: I1205 12:35:24.584273 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:35:27.851754 master-0 kubenswrapper[8731]: E1205 12:35:27.851596 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 05 12:35:28.735856 master-0 kubenswrapper[8731]: I1205 12:35:28.735767 8731 patch_prober.go:28] interesting pod/catalogd-controller-manager-7cc89f4c4c-n28z2 container/manager namespace/openshift-catalogd: Readiness probe 
status=failure output="Get \"http://10.128.0.30:8081/readyz\": dial tcp 10.128.0.30:8081: connect: connection refused" start-of-body= Dec 05 12:35:28.735856 master-0 kubenswrapper[8731]: I1205 12:35:28.735848 8731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" podUID="3b741029-0eb5-409b-b7f1-95e8385dc400" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.30:8081/readyz\": dial tcp 10.128.0.30:8081: connect: connection refused" Dec 05 12:35:33.590220 master-0 kubenswrapper[8731]: I1205 12:35:33.590094 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:35:33.590220 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:35:33.590220 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:35:33.590220 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:35:33.590220 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:35:33.590220 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:35:33.590220 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:35:33.590220 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:35:33.590220 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:35:33.590220 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:35:33.590220 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:35:33.591870 master-0 kubenswrapper[8731]: I1205 12:35:33.590273 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:35:34.232124 master-0 kubenswrapper[8731]: E1205 12:35:34.231998 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": context deadline exceeded" Dec 05 12:35:34.342622 master-0 kubenswrapper[8731]: I1205 12:35:34.342145 8731 patch_prober.go:28] interesting pod/operator-controller-controller-manager-7cbd59c7f8-d9g7k container/manager namespace/openshift-operator-controller: Liveness probe status=failure output="Get \"http://10.128.0.33:8081/healthz\": dial tcp 10.128.0.33:8081: connect: connection refused" start-of-body= Dec 05 12:35:34.342622 master-0 kubenswrapper[8731]: I1205 12:35:34.342590 8731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" podUID="153fec1f-a10b-4c6c-a997-60fa80c13a86" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.33:8081/healthz\": dial tcp 10.128.0.33:8081: connect: connection refused" Dec 05 12:35:34.342622 master-0 kubenswrapper[8731]: I1205 12:35:34.342163 8731 patch_prober.go:28] interesting pod/operator-controller-controller-manager-7cbd59c7f8-d9g7k container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.33:8081/readyz\": dial tcp 
10.128.0.33:8081: connect: connection refused" start-of-body= Dec 05 12:35:34.343153 master-0 kubenswrapper[8731]: I1205 12:35:34.342665 8731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" podUID="153fec1f-a10b-4c6c-a997-60fa80c13a86" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.33:8081/readyz\": dial tcp 10.128.0.33:8081: connect: connection refused" Dec 05 12:35:36.000520 master-0 kubenswrapper[8731]: E1205 12:35:36.000150 8731 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:35:36.000520 master-0 kubenswrapper[8731]: E1205 12:35:36.000360 8731 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.014s" Dec 05 12:35:36.000520 master-0 kubenswrapper[8731]: I1205 12:35:36.000428 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:35:36.000520 master-0 kubenswrapper[8731]: I1205 12:35:36.000460 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" event={"ID":"1871a9d6-6369-4d08-816f-9c6310b61ddf","Type":"ContainerDied","Data":"4f8a59bfccc80caaa9ccb9172563888264ac2bfba8642d650c783edb02a956b7"} Dec 05 12:35:36.001578 master-0 kubenswrapper[8731]: I1205 12:35:36.001354 8731 scope.go:117] "RemoveContainer" containerID="73f6bfa12151c71020cd1cc8c48ebdf6c4c24dbf1a05b4873ce05f073bdcce94" Dec 05 12:35:36.005721 master-0 kubenswrapper[8731]: I1205 12:35:36.005644 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 05 12:35:36.813449 master-0 kubenswrapper[8731]: I1205 12:35:36.812937 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-n28z2_3b741029-0eb5-409b-b7f1-95e8385dc400/manager/0.log" Dec 05 12:35:38.244134 master-0 kubenswrapper[8731]: E1205 12:35:38.243886 8731 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event=< Dec 05 12:35:38.244134 master-0 kubenswrapper[8731]: &Event{ObjectMeta:{route-controller-manager-858598fd98-5xkcl.187e51bbd0611d32 openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-858598fd98-5xkcl,UID:bb7dd3e9-5a59-4741-970e-aa41c4e078cc,APIVersion:v1,ResourceVersion:6561,FieldPath:spec.containers{route-controller-manager},},Reason:ProbeError,Message:Readiness probe error: Get "https://10.128.0.39:8443/healthz": dial tcp 10.128.0.39:8443: connect: connection refused Dec 05 12:35:38.244134 master-0 kubenswrapper[8731]: body: Dec 05 12:35:38.244134 master-0 kubenswrapper[8731]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:32:51.072572722 +0000 UTC m=+69.376556899,LastTimestamp:2025-12-05 12:32:51.072572722 +0000 UTC m=+69.376556899,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Dec 05 12:35:38.244134 master-0 kubenswrapper[8731]: > Dec 05 
12:35:42.597589 master-0 kubenswrapper[8731]: I1205 12:35:42.597516 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:35:42.597589 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:35:42.597589 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:35:42.597589 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:35:42.597589 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:35:42.597589 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:35:42.597589 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:35:42.597589 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:35:42.597589 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:35:42.597589 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:35:42.597589 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:35:42.599282 master-0 kubenswrapper[8731]: I1205 12:35:42.597615 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:35:44.232878 master-0 kubenswrapper[8731]: E1205 12:35:44.232730 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:35:44.232878 master-0 kubenswrapper[8731]: E1205 12:35:44.232802 8731 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 12:35:44.341337 master-0 kubenswrapper[8731]: I1205 12:35:44.341156 8731 patch_prober.go:28] interesting pod/operator-controller-controller-manager-7cbd59c7f8-d9g7k container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.33:8081/readyz\": dial tcp 10.128.0.33:8081: connect: connection refused" start-of-body= Dec 05 12:35:44.341337 master-0 kubenswrapper[8731]: I1205 12:35:44.341292 8731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" podUID="153fec1f-a10b-4c6c-a997-60fa80c13a86" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.33:8081/readyz\": dial tcp 10.128.0.33:8081: connect: connection refused" Dec 05 12:35:44.853704 master-0 kubenswrapper[8731]: E1205 12:35:44.853513 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 05 12:35:45.949024 master-0 kubenswrapper[8731]: E1205 12:35:45.948899 8731 projected.go:194] Error preparing data for projected volume kube-api-access-bll66 for pod openshift-controller-manager/controller-manager-675db9579f-4dcg8: failed to fetch 
token: Timeout: request did not complete within requested timeout - context deadline exceeded Dec 05 12:35:45.950300 master-0 kubenswrapper[8731]: E1205 12:35:45.949049 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66 podName:7e562fda-e695-4218-a9cf-4179b8d456db nodeName:}" failed. No retries permitted until 2025-12-05 12:35:53.949010659 +0000 UTC m=+252.252994866 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-bll66" (UniqueName: "kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66") pod "controller-manager-675db9579f-4dcg8" (UID: "7e562fda-e695-4218-a9cf-4179b8d456db") : failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Dec 05 12:35:48.615558 master-0 kubenswrapper[8731]: I1205 12:35:48.615448 8731 status_manager.go:851] "Failed to get status for pod" podUID="8b47694fcc32464ab24d09c23d6efb57" pod="kube-system/bootstrap-kube-controller-manager-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods bootstrap-kube-controller-manager-master-0)" Dec 05 12:35:51.606113 master-0 kubenswrapper[8731]: I1205 12:35:51.605974 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:35:51.606113 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:35:51.606113 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:35:51.606113 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:35:51.606113 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:35:51.606113 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:35:51.606113 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:35:51.606113 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:35:51.606113 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:35:51.606113 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:35:51.606113 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:35:51.606113 master-0 kubenswrapper[8731]: I1205 12:35:51.606103 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:35:53.975016 master-0 kubenswrapper[8731]: I1205 12:35:53.974834 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bll66\" (UniqueName: \"kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:35:54.341647 master-0 kubenswrapper[8731]: I1205 12:35:54.341445 8731 patch_prober.go:28] interesting pod/operator-controller-controller-manager-7cbd59c7f8-d9g7k container/manager namespace/openshift-operator-controller: Readiness 
probe status=failure output="Get \"http://10.128.0.33:8081/readyz\": dial tcp 10.128.0.33:8081: connect: connection refused" start-of-body= Dec 05 12:35:54.341647 master-0 kubenswrapper[8731]: I1205 12:35:54.341555 8731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" podUID="153fec1f-a10b-4c6c-a997-60fa80c13a86" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.33:8081/readyz\": dial tcp 10.128.0.33:8081: connect: connection refused" Dec 05 12:35:54.341945 master-0 kubenswrapper[8731]: I1205 12:35:54.341647 8731 patch_prober.go:28] interesting pod/operator-controller-controller-manager-7cbd59c7f8-d9g7k container/manager namespace/openshift-operator-controller: Liveness probe status=failure output="Get \"http://10.128.0.33:8081/healthz\": dial tcp 10.128.0.33:8081: connect: connection refused" start-of-body= Dec 05 12:35:54.341945 master-0 kubenswrapper[8731]: I1205 12:35:54.341755 8731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" podUID="153fec1f-a10b-4c6c-a997-60fa80c13a86" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.33:8081/healthz\": dial tcp 10.128.0.33:8081: connect: connection refused" Dec 05 12:35:59.955375 master-0 kubenswrapper[8731]: I1205 12:35:59.955259 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-848f645654-g6nj5_594aaded-5615-4bed-87ee-6173059a73be/kube-controller-manager-operator/1.log" Dec 05 12:35:59.957099 master-0 kubenswrapper[8731]: I1205 12:35:59.955975 8731 generic.go:334] "Generic (PLEG): container finished" podID="594aaded-5615-4bed-87ee-6173059a73be" containerID="1b060274049216c69ff594edc9d2a695d110f82d29ed8c698a35ad2511d80237" exitCode=255 Dec 05 12:35:59.958772 master-0 kubenswrapper[8731]: I1205 12:35:59.958724 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-xxmfp_ba095394-1873-4793-969d-3be979fa0771/authentication-operator/1.log" Dec 05 12:35:59.959518 master-0 kubenswrapper[8731]: I1205 12:35:59.959453 8731 generic.go:334] "Generic (PLEG): container finished" podID="ba095394-1873-4793-969d-3be979fa0771" containerID="2292fdf5304b8a28a27399e75d5a9964b3c2748ef25e388360a2e0b43dad6994" exitCode=255 Dec 05 12:35:59.961665 master-0 kubenswrapper[8731]: I1205 12:35:59.961616 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-79767b7ff9-h8qkj_5efad170-c154-42ec-a7c0-b36a98d2bfcc/network-operator/1.log" Dec 05 12:35:59.962621 master-0 kubenswrapper[8731]: I1205 12:35:59.962582 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-79767b7ff9-h8qkj_5efad170-c154-42ec-a7c0-b36a98d2bfcc/network-operator/0.log" Dec 05 12:35:59.962735 master-0 kubenswrapper[8731]: I1205 12:35:59.962623 8731 generic.go:334] "Generic (PLEG): container finished" podID="5efad170-c154-42ec-a7c0-b36a98d2bfcc" containerID="2353edada22e30fc07f29e9d6b8499dfe371b7dc7d2795c8973bfc870b0b89fa" exitCode=255 Dec 05 12:35:59.965106 master-0 kubenswrapper[8731]: I1205 12:35:59.965060 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5bf4d88c6f-dxd24_f119ffe4-16bd-49eb-916d-b18ba0d79b54/etcd-operator/1.log" Dec 05 
12:35:59.965621 master-0 kubenswrapper[8731]: I1205 12:35:59.965576 8731 generic.go:334] "Generic (PLEG): container finished" podID="f119ffe4-16bd-49eb-916d-b18ba0d79b54" containerID="a782e082d4327637a0dc3ae6b2947858a31ed6dd6d18a60f26c1d1533bc0ed77" exitCode=255 Dec 05 12:35:59.968003 master-0 kubenswrapper[8731]: I1205 12:35:59.967955 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f85974995-4vsjv_1871a9d6-6369-4d08-816f-9c6310b61ddf/kube-scheduler-operator-container/1.log" Dec 05 12:35:59.968746 master-0 kubenswrapper[8731]: I1205 12:35:59.968692 8731 generic.go:334] "Generic (PLEG): container finished" podID="1871a9d6-6369-4d08-816f-9c6310b61ddf" containerID="6017aa71f5b63d7cdd32da77565edb00ec8b8b5e5059d49e4246bd4f05d6b50b" exitCode=255 Dec 05 12:35:59.972026 master-0 kubenswrapper[8731]: I1205 12:35:59.971954 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-765d9ff747-rw57t_807d9093-aa67-4840-b5be-7f3abcc1beed/kube-apiserver-operator/1.log" Dec 05 12:35:59.972962 master-0 kubenswrapper[8731]: I1205 12:35:59.972887 8731 generic.go:334] "Generic (PLEG): container finished" podID="807d9093-aa67-4840-b5be-7f3abcc1beed" containerID="1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae" exitCode=255 Dec 05 12:35:59.975825 master-0 kubenswrapper[8731]: I1205 12:35:59.975749 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-b9c5dfc78-2n8gt_7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/kube-storage-version-migrator-operator/1.log" Dec 05 12:35:59.976622 master-0 kubenswrapper[8731]: I1205 12:35:59.976565 8731 generic.go:334] "Generic (PLEG): container finished" podID="7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9" containerID="e10ca0df40eb2db4bf995662f42f3b8f09e91dda57cb740df7f1668ef9a54497" exitCode=255 Dec 05 12:36:00.613657 master-0 kubenswrapper[8731]: I1205 12:36:00.613526 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:36:00.613657 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:36:00.613657 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:36:00.613657 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:36:00.613657 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:36:00.613657 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:36:00.613657 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:36:00.613657 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:36:00.613657 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:36:00.613657 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:36:00.613657 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:36:00.613657 master-0 kubenswrapper[8731]: I1205 12:36:00.613644 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Dec 05 12:36:01.855021 master-0 kubenswrapper[8731]: E1205 12:36:01.854886 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 05 12:36:04.342095 master-0 kubenswrapper[8731]: I1205 12:36:04.342004 8731 patch_prober.go:28] interesting pod/operator-controller-controller-manager-7cbd59c7f8-d9g7k container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.33:8081/readyz\": dial tcp 10.128.0.33:8081: connect: connection refused" start-of-body= Dec 05 12:36:04.343511 master-0 kubenswrapper[8731]: I1205 12:36:04.343120 8731 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" podUID="153fec1f-a10b-4c6c-a997-60fa80c13a86" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.33:8081/readyz\": dial tcp 10.128.0.33:8081: connect: connection refused" Dec 05 12:36:04.431328 master-0 kubenswrapper[8731]: E1205 12:36:04.430981 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:35:54Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:35:54Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:35:54Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:35:54Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e3e65409fc2b27ad0aaeb500a39e264663d2980821f099b830b551785ce4ce8b\\\"],\\\"sizeBytes\\\":1631758507},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9014f384de5f9a0b7418d5869ad349abb9588d16bd09ed650a163c045315dbff\\\"],\\\"sizeBytes\\\":1232140918},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b472823604757237c2d16bd6f6221f4cf562aa3b05942c7f602e1e8b2e55a7c6\\\"],\\\"sizeBytes\\\":983705650},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\\\"],\\\"sizeBytes\\\":938303566},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:631a3798b749fecc041a99929eb946618df723e15055e805ff752a1a1273481c\\\"],\\\"sizeBytes\\\":870567329},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f1ca78c423f43f89a0411e40393642f64e4f8df9e5f61c25e31047c4cce170f9\\\"],\\\"sizeBytes\\\":857069957},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b12f830c3316aa4dc061c2d00c74126282b3e2bcccc301eab00d57fff3c4c7c\\\"],\\\"sizeBytes\\\":767284906},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cb3ec61f9a932a9ad13bdeb44bcf9477a8d5f728151d7f19ed3ef7d4b02b3a82\\\"],\\\"sizeBytes\\\":682371258},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:916566bb9d0143352324233d460ad94697719c11c8c9158e3aea8f475941751f\\\"],\\\"sizeBytes\\\":677523572},{\\\"names\\\":[\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5451aa441e5b8d8689c032405d410c8049a849ef2edf77e5b6a5ce2838c6569b\\\"],\\\"sizeBytes\\\":672407260},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9724d2036305cbd729e1f484c5bad89971de977fff8a6723fef1873858dd1123\\\"],\\\"sizeBytes\\\":616108962},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:df606f3b71d4376d1a2108c09f0d3dab455fc30bcb67c60e91590c105e9025bf\\\"],\\\"sizeBytes\\\":583836304},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:79f99fd6cce984287932edf0d009660bb488d663081f3d62ec3b23bc8bfbf6c2\\\"],\\\"sizeBytes\\\":576619763},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eddedae7578d79b5a3f748000ae5c00b9f14a04710f9f9ec7b52fc569be5dfb8\\\"],\\\"sizeBytes\\\":552673986},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa24edce3d740f84c40018e94cdbf2bc7375268d13d57c2d664e43a46ccea3fc\\\"],\\\"sizeBytes\\\":543227406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:188637a52cafee61ec461e92fb0c605e28be325b9ac1f2ac8a37d68e97654718\\\"],\\\"sizeBytes\\\":532719167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cfde59e48cd5dee3721f34d249cb119cc3259fd857965d34f9c7ed83b0c363a1\\\"],\\\"sizeBytes\\\":532402162},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f4d4282cb53325e737ad68abbfcb70687ae04fb50353f4f0ba0ba5703b15009a\\\"],\\\"sizeBytes\\\":512838054},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:8c885ea0b3c5124989f0a9b93eba98eb9fca6bbd0262772d85d90bf713a4d572\\\"],\\\"sizeBytes\\\":512452153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0f43c31aa3359159d4557dad3cfaf812d8ce44db9cb9ae970e06d3479070b660\\\"],\\\"sizeBytes\\\":509437356},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e85850a4ae1a1e3ec2c590a4936d640882b6550124da22031c85b526afbf52df\\\"],\\\"sizeBytes\\\":507687221},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8375671da86aa527ee7e291d86971b0baa823ffc7663b5a983084456e76c0f59\\\"],\\\"sizeBytes\\\":506741476},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:831f30660844091d6154e2674d3a9da6f34271bf8a2c40b56f7416066318742b\\\"],\\\"sizeBytes\\\":505649178},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:86af77350cfe6fd69280157e4162aa0147873d9431c641ae4ad3e881ff768a73\\\"],\\\"sizeBytes\\\":505628211},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a824e468cf8dd61d347e35b2ee5bc2f815666957647098e21a1bb56ff613e5b9\\\"],\\\"sizeBytes\\\":503340749},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8139ed65c0a0a4b0f253b715c11cc52be027efe8a4774da9ccce35c78ef439da\\\"],\\\"sizeBytes\\\":503011144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8eabac819f289e29d75c7ab172d8124554849a47f0b00770928c3eb19a5a31c4\\\"],\\\"sizeBytes\\\":502436444},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10e57ca7611f79710f05777dc6a8f31c7e04eb09da4d8d793a5acfbf0e4692d7\\\"],\\\"sizeBytes\\\":500943492},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f042fa25014f3d37f3ea967d21f361d2a11833ae18f2c750318101b25d2497ce\\\"],\\\"sizeBytes\\\":500848684},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91af633e585621630c40d14f188e37d36b44678d0a59e582d
850bf8d593d3a0c\\\"],\\\"sizeBytes\\\":499798563},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d64c13fe7663a0b4ae61d103b1b7598adcf317a01826f296bcb66b1a2de83c96\\\"],\\\"sizeBytes\\\":499705918},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:75d996f6147edb88c09fd1a052099de66638590d7d03a735006244bc9e19f898\\\"],\\\"sizeBytes\\\":499082775},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f952cec1e5332b84bdffa249cd426f39087058d6544ddcec650a414c15a9b68\\\"],\\\"sizeBytes\\\":489528665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c416b201d480bddb5a4960ec42f4740761a1335001cf84ba5ae19ad6857771b1\\\"],\\\"sizeBytes\\\":481559117},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3a77aa4d03b89ea284e3467a268e5989a77a2ef63e685eb1d5c5ea5b3922b7a\\\"],\\\"sizeBytes\\\":478917802},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3c1edf52f70bf9b1d1457e0c4111bc79cdaa1edd659ddbdb9d8176eff8b46956\\\"],\\\"sizeBytes\\\":462727837},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\\\"],\\\"sizeBytes\\\":459552216},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3ce2cbf1032ad0f24f204db73687002fcf302e86ebde3945801c74351b64576\\\"],\\\"sizeBytes\\\":458169255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7664a2d4cb10e82ed32abbf95799f43fc3d10135d7dd94799730de504a89680a\\\"],\\\"sizeBytes\\\":452589750},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4ecc5bac651ff1942865baee5159582e9602c89b47eeab18400a32abcba8f690\\\"],\\\"sizeBytes\\\":451039520},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2632d7f05d5a992e91038ded81c715898f3fe803420a9b67a0201e9fd8075213\\\"],\\\"sizeBytes\\\":443291941},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f0aa9cd04713acc5c6fea721bd849e1500da8ae945e0b32000887f34d786e0b\\\"],\\\"sizeBytes\\\":442509555},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e438b814f8e16f00b3fc4b69991af80eee79ae111d2a707f34aa64b2ccbb6eb\\\"],\\\"sizeBytes\\\":437737925},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a3d37aa7a22c68afa963ecfb4b43c52cccf152580cd66e4d5382fb69e4037cc\\\"],\\\"sizeBytes\\\":406053031},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9432c13d76bd4ba4eb9197c050cf88c0d701fa2055eeb59257e2e23901f9fdff\\\"],\\\"sizeBytes\\\":401810450},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a70b2a95140d1e90978f36cc9889013ae34bd232662c5424002274385669ed9\\\"],\\\"sizeBytes\\\":390989693}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:36:09.621986 master-0 kubenswrapper[8731]: I1205 12:36:09.621875 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:36:09.621986 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:36:09.621986 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:36:09.621986 master-0 kubenswrapper[8731]: 
[+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:36:09.621986 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:36:09.621986 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:36:09.621986 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:36:09.621986 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:36:09.621986 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:36:09.621986 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:36:09.621986 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:36:09.621986 master-0 kubenswrapper[8731]: I1205 12:36:09.621956 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:36:10.008165 master-0 kubenswrapper[8731]: E1205 12:36:10.008070 8731 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:36:10.008710 master-0 kubenswrapper[8731]: E1205 12:36:10.008281 8731 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.008s" Dec 05 12:36:10.008710 master-0 kubenswrapper[8731]: I1205 12:36:10.008306 8731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:36:10.008710 master-0 kubenswrapper[8731]: I1205 12:36:10.008331 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerDied","Data":"49ca67aa7902f9104b46e18f411e1fcfcd3bd696757b09b6ab811180664a0848"} Dec 05 12:36:10.008710 master-0 kubenswrapper[8731]: I1205 12:36:10.008362 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"d627fcf3-2a80-4739-add9-e21ad4efc6eb","Type":"ContainerDied","Data":"8654b600b7307ea1bcd3fe84275fb56084c5722cbe5ccf524025cea2bfa3d8cd"} Dec 05 12:36:10.008710 master-0 kubenswrapper[8731]: I1205 12:36:10.008410 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:36:10.008710 master-0 kubenswrapper[8731]: I1205 12:36:10.008558 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" event={"ID":"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e","Type":"ContainerDied","Data":"401643c70c405d6156a16a3ab17611e0b06471ba9931da499a2092a2a6caa1f3"} Dec 05 12:36:10.008710 master-0 kubenswrapper[8731]: I1205 12:36:10.008575 8731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:36:10.009120 master-0 kubenswrapper[8731]: I1205 12:36:10.008832 8731 scope.go:117] "RemoveContainer" containerID="a782e082d4327637a0dc3ae6b2947858a31ed6dd6d18a60f26c1d1533bc0ed77" Dec 05 12:36:10.009120 master-0 kubenswrapper[8731]: I1205 12:36:10.008878 8731 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:36:10.009120 master-0 kubenswrapper[8731]: I1205 12:36:10.008984 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" event={"ID":"f119ffe4-16bd-49eb-916d-b18ba0d79b54","Type":"ContainerDied","Data":"47752e8beaf9f853c41667ca645eb6d00a5917c9b6cb4206f48e1b5596bdcc79"} Dec 05 12:36:10.009345 master-0 kubenswrapper[8731]: I1205 12:36:10.009166 8731 scope.go:117] "RemoveContainer" containerID="47752e8beaf9f853c41667ca645eb6d00a5917c9b6cb4206f48e1b5596bdcc79" Dec 05 12:36:10.013682 master-0 kubenswrapper[8731]: I1205 12:36:10.011303 8731 scope.go:117] "RemoveContainer" containerID="e661aee8169481bd45ddc453eab7e9b725569fcef2029fd7e4e16d66fbcedf39" Dec 05 12:36:10.013682 master-0 kubenswrapper[8731]: I1205 12:36:10.011750 8731 scope.go:117] "RemoveContainer" containerID="5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57" Dec 05 12:36:10.013682 master-0 kubenswrapper[8731]: I1205 12:36:10.011798 8731 scope.go:117] "RemoveContainer" containerID="eb12d89ac382a5bb5bdc3b8dbfd70aaf80443c6890bbd6d374803fc81c9ff457" Dec 05 12:36:10.013682 master-0 kubenswrapper[8731]: I1205 12:36:10.011921 8731 scope.go:117] "RemoveContainer" containerID="e10ca0df40eb2db4bf995662f42f3b8f09e91dda57cb740df7f1668ef9a54497" Dec 05 12:36:10.013682 master-0 kubenswrapper[8731]: I1205 12:36:10.012395 8731 scope.go:117] "RemoveContainer" containerID="2292fdf5304b8a28a27399e75d5a9964b3c2748ef25e388360a2e0b43dad6994" Dec 05 12:36:10.013682 master-0 kubenswrapper[8731]: I1205 12:36:10.012782 8731 scope.go:117] "RemoveContainer" containerID="25a1113bac1425c0d6b5254d5067b012732c090d8f467edda97019523a2d47be" Dec 05 12:36:10.013682 master-0 kubenswrapper[8731]: I1205 12:36:10.013140 8731 scope.go:117] "RemoveContainer" containerID="401643c70c405d6156a16a3ab17611e0b06471ba9931da499a2092a2a6caa1f3" Dec 05 12:36:10.013682 master-0 kubenswrapper[8731]: I1205 12:36:10.013222 8731 scope.go:117] "RemoveContainer" containerID="880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5" Dec 05 12:36:10.014511 master-0 kubenswrapper[8731]: I1205 12:36:10.013900 8731 scope.go:117] "RemoveContainer" containerID="b02b74337c561023bb77d95397661e10a1ee5fc12d28b2fd7ee9556bbaba81e5" Dec 05 12:36:10.014511 master-0 kubenswrapper[8731]: I1205 12:36:10.014003 8731 scope.go:117] "RemoveContainer" containerID="2353edada22e30fc07f29e9d6b8499dfe371b7dc7d2795c8973bfc870b0b89fa" Dec 05 12:36:10.014848 master-0 kubenswrapper[8731]: I1205 12:36:10.014582 8731 scope.go:117] "RemoveContainer" containerID="c63a8034e23c88dd09173f57e05eee7c9bc26e35890cfdd9f1fdc8ef0e16d843" Dec 05 12:36:10.014848 master-0 kubenswrapper[8731]: I1205 12:36:10.014659 8731 scope.go:117] "RemoveContainer" containerID="1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae" Dec 05 12:36:10.015296 master-0 kubenswrapper[8731]: I1205 12:36:10.015263 8731 scope.go:117] "RemoveContainer" containerID="1b060274049216c69ff594edc9d2a695d110f82d29ed8c698a35ad2511d80237" Dec 05 12:36:10.016609 master-0 kubenswrapper[8731]: I1205 12:36:10.015748 8731 scope.go:117] "RemoveContainer" containerID="0ca651057443d22827f48087f13a7a3218451ee691e2f2aee7a07437d8b2d6ee" Dec 05 12:36:10.016609 master-0 kubenswrapper[8731]: I1205 12:36:10.016261 8731 scope.go:117] "RemoveContainer" containerID="d0c256d51be6b67ce11d61e05ebacd6a747bab028d852541d977f5d77734ba1a" Dec 05 12:36:10.024918 
master-0 kubenswrapper[8731]: I1205 12:36:10.024858 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 05 12:36:10.350623 master-0 kubenswrapper[8731]: I1205 12:36:10.349857 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_565d5ef6-b0e7-4f04-9460-61f1d3903d37/installer/0.log" Dec 05 12:36:10.350853 master-0 kubenswrapper[8731]: I1205 12:36:10.350692 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Dec 05 12:36:10.394574 master-0 kubenswrapper[8731]: I1205 12:36:10.394141 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/565d5ef6-b0e7-4f04-9460-61f1d3903d37-var-lock\") pod \"565d5ef6-b0e7-4f04-9460-61f1d3903d37\" (UID: \"565d5ef6-b0e7-4f04-9460-61f1d3903d37\") " Dec 05 12:36:10.394574 master-0 kubenswrapper[8731]: I1205 12:36:10.394207 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/565d5ef6-b0e7-4f04-9460-61f1d3903d37-kubelet-dir\") pod \"565d5ef6-b0e7-4f04-9460-61f1d3903d37\" (UID: \"565d5ef6-b0e7-4f04-9460-61f1d3903d37\") " Dec 05 12:36:10.394574 master-0 kubenswrapper[8731]: I1205 12:36:10.394294 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/565d5ef6-b0e7-4f04-9460-61f1d3903d37-kube-api-access\") pod \"565d5ef6-b0e7-4f04-9460-61f1d3903d37\" (UID: \"565d5ef6-b0e7-4f04-9460-61f1d3903d37\") " Dec 05 12:36:10.394574 master-0 kubenswrapper[8731]: I1205 12:36:10.394453 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/565d5ef6-b0e7-4f04-9460-61f1d3903d37-var-lock" (OuterVolumeSpecName: "var-lock") pod "565d5ef6-b0e7-4f04-9460-61f1d3903d37" (UID: "565d5ef6-b0e7-4f04-9460-61f1d3903d37"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:36:10.394574 master-0 kubenswrapper[8731]: I1205 12:36:10.394465 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/565d5ef6-b0e7-4f04-9460-61f1d3903d37-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "565d5ef6-b0e7-4f04-9460-61f1d3903d37" (UID: "565d5ef6-b0e7-4f04-9460-61f1d3903d37"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:36:10.416509 master-0 kubenswrapper[8731]: I1205 12:36:10.416442 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/565d5ef6-b0e7-4f04-9460-61f1d3903d37-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "565d5ef6-b0e7-4f04-9460-61f1d3903d37" (UID: "565d5ef6-b0e7-4f04-9460-61f1d3903d37"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:36:10.495274 master-0 kubenswrapper[8731]: I1205 12:36:10.495227 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/565d5ef6-b0e7-4f04-9460-61f1d3903d37-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:36:10.495274 master-0 kubenswrapper[8731]: I1205 12:36:10.495267 8731 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/565d5ef6-b0e7-4f04-9460-61f1d3903d37-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:36:10.495481 master-0 kubenswrapper[8731]: I1205 12:36:10.495286 8731 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/565d5ef6-b0e7-4f04-9460-61f1d3903d37-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:36:11.066050 master-0 kubenswrapper[8731]: I1205 12:36:11.065574 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-765d9ff747-rw57t_807d9093-aa67-4840-b5be-7f3abcc1beed/kube-apiserver-operator/1.log" Dec 05 12:36:11.079203 master-0 kubenswrapper[8731]: I1205 12:36:11.079113 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-79767b7ff9-h8qkj_5efad170-c154-42ec-a7c0-b36a98d2bfcc/network-operator/1.log" Dec 05 12:36:11.079898 master-0 kubenswrapper[8731]: I1205 12:36:11.079841 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-79767b7ff9-h8qkj_5efad170-c154-42ec-a7c0-b36a98d2bfcc/network-operator/0.log" Dec 05 12:36:11.086638 master-0 kubenswrapper[8731]: I1205 12:36:11.086567 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-7cbd59c7f8-d9g7k_153fec1f-a10b-4c6c-a997-60fa80c13a86/manager/0.log" Dec 05 12:36:11.089792 master-0 kubenswrapper[8731]: I1205 12:36:11.089734 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-7ff994598c-lgc7z_58187662-b502-4d90-95ce-2aa91a81d256/cluster-monitoring-operator/0.log" Dec 05 12:36:11.092149 master-0 kubenswrapper[8731]: I1205 12:36:11.092103 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-b9c5dfc78-2n8gt_7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/kube-storage-version-migrator-operator/1.log" Dec 05 12:36:11.099481 master-0 kubenswrapper[8731]: I1205 12:36:11.099422 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-xxmfp_ba095394-1873-4793-969d-3be979fa0771/authentication-operator/1.log" Dec 05 12:36:11.102325 master-0 kubenswrapper[8731]: I1205 12:36:11.102282 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_565d5ef6-b0e7-4f04-9460-61f1d3903d37/installer/0.log" Dec 05 12:36:11.102498 master-0 kubenswrapper[8731]: I1205 12:36:11.102449 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Dec 05 12:36:11.105253 master-0 kubenswrapper[8731]: I1205 12:36:11.105156 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5bf4d88c6f-dxd24_f119ffe4-16bd-49eb-916d-b18ba0d79b54/etcd-operator/1.log" Dec 05 12:36:11.107804 master-0 kubenswrapper[8731]: I1205 12:36:11.107752 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/0.log" Dec 05 12:36:11.110086 master-0 kubenswrapper[8731]: I1205 12:36:11.110046 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-848f645654-g6nj5_594aaded-5615-4bed-87ee-6173059a73be/kube-controller-manager-operator/1.log" Dec 05 12:36:11.112848 master-0 kubenswrapper[8731]: I1205 12:36:11.112813 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-546vz_d53a4886-db25-43a1-825a-66a9a9a58590/openshift-controller-manager-operator/1.log" Dec 05 12:36:11.113777 master-0 kubenswrapper[8731]: I1205 12:36:11.113733 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-546vz_d53a4886-db25-43a1-825a-66a9a9a58590/openshift-controller-manager-operator/0.log" Dec 05 12:36:11.119329 master-0 kubenswrapper[8731]: I1205 12:36:11.119288 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/0.log" Dec 05 12:36:12.247700 master-0 kubenswrapper[8731]: E1205 12:36:12.246903 8731 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{route-controller-manager-858598fd98-5xkcl.187e51bbd06223b5 openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-858598fd98-5xkcl,UID:bb7dd3e9-5a59-4741-970e-aa41c4e078cc,APIVersion:v1,ResourceVersion:6561,FieldPath:spec.containers{route-controller-manager},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://10.128.0.39:8443/healthz\": dial tcp 10.128.0.39:8443: connect: connection refused,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:32:51.072639925 +0000 UTC m=+69.376624092,LastTimestamp:2025-12-05 12:32:51.072639925 +0000 UTC m=+69.376624092,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:36:14.432608 master-0 kubenswrapper[8731]: E1205 12:36:14.432486 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:36:18.631015 master-0 kubenswrapper[8731]: I1205 12:36:18.630141 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:36:18.631015 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:36:18.631015 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:36:18.631015 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:36:18.631015 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:36:18.631015 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:36:18.631015 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:36:18.631015 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:36:18.631015 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:36:18.631015 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:36:18.631015 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:36:18.631015 master-0 kubenswrapper[8731]: I1205 12:36:18.630881 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:36:18.856459 master-0 kubenswrapper[8731]: E1205 12:36:18.856343 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 05 12:36:23.018090 master-0 kubenswrapper[8731]: E1205 12:36:23.018020 8731 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Dec 05 12:36:24.433829 master-0 kubenswrapper[8731]: E1205 12:36:24.433329 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:36:27.636946 master-0 kubenswrapper[8731]: I1205 12:36:27.636405 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:36:27.636946 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:36:27.636946 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:36:27.636946 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:36:27.636946 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:36:27.636946 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:36:27.636946 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:36:27.636946 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:36:27.636946 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:36:27.636946 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 
12:36:27.636946 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:36:27.638353 master-0 kubenswrapper[8731]: I1205 12:36:27.636962 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:36:27.979220 master-0 kubenswrapper[8731]: E1205 12:36:27.979109 8731 projected.go:194] Error preparing data for projected volume kube-api-access-bll66 for pod openshift-controller-manager/controller-manager-675db9579f-4dcg8: failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Dec 05 12:36:27.979754 master-0 kubenswrapper[8731]: E1205 12:36:27.979322 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66 podName:7e562fda-e695-4218-a9cf-4179b8d456db nodeName:}" failed. No retries permitted until 2025-12-05 12:36:43.979274744 +0000 UTC m=+302.283258941 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-bll66" (UniqueName: "kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66") pod "controller-manager-675db9579f-4dcg8" (UID: "7e562fda-e695-4218-a9cf-4179b8d456db") : failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Dec 05 12:36:34.434665 master-0 kubenswrapper[8731]: E1205 12:36:34.434570 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:36:35.858415 master-0 kubenswrapper[8731]: E1205 12:36:35.858343 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 05 12:36:36.645223 master-0 kubenswrapper[8731]: I1205 12:36:36.645093 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:36:36.645223 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:36:36.645223 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:36:36.645223 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:36:36.645223 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:36:36.645223 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:36:36.645223 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:36:36.645223 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:36:36.645223 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:36:36.645223 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:36:36.645223 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:36:36.645223 master-0 kubenswrapper[8731]: I1205 
12:36:36.645217 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:36:41.351755 master-0 kubenswrapper[8731]: I1205 12:36:41.351245 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/1.log" Dec 05 12:36:41.353309 master-0 kubenswrapper[8731]: I1205 12:36:41.352750 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/0.log" Dec 05 12:36:41.353309 master-0 kubenswrapper[8731]: I1205 12:36:41.352826 8731 generic.go:334] "Generic (PLEG): container finished" podID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" containerID="98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad" exitCode=1 Dec 05 12:36:41.993109 master-0 kubenswrapper[8731]: I1205 12:36:41.993019 8731 scope.go:117] "RemoveContainer" containerID="87e2f0751f7349d9f2700480abbb17089facf86a7329bd4aecf04d7f2bed205a" Dec 05 12:36:43.989525 master-0 kubenswrapper[8731]: I1205 12:36:43.989380 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bll66\" (UniqueName: \"kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:36:44.029444 master-0 kubenswrapper[8731]: E1205 12:36:44.029319 8731 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:36:44.029807 master-0 kubenswrapper[8731]: E1205 12:36:44.029577 8731 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.02s" Dec 05 12:36:44.029807 master-0 kubenswrapper[8731]: I1205 12:36:44.029623 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:36:44.039093 master-0 kubenswrapper[8731]: I1205 12:36:44.039017 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 05 12:36:44.435975 master-0 kubenswrapper[8731]: E1205 12:36:44.435766 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:36:44.435975 master-0 kubenswrapper[8731]: E1205 12:36:44.435823 8731 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 12:36:45.654707 master-0 kubenswrapper[8731]: I1205 12:36:45.654120 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:36:45.654707 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:36:45.654707 master-0 
kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:36:45.654707 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:36:45.654707 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:36:45.654707 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:36:45.654707 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:36:45.654707 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:36:45.654707 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:36:45.654707 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:36:45.654707 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:36:45.656042 master-0 kubenswrapper[8731]: I1205 12:36:45.654737 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:36:46.251518 master-0 kubenswrapper[8731]: E1205 12:36:46.251114 8731 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{network-metrics-daemon-99djw.187e51bdc62eff46 openshift-multus 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-multus,Name:network-metrics-daemon-99djw,UID:fb7003a6-4341-49eb-bec3-76ba8610fa12,APIVersion:v1,ResourceVersion:3343,FieldPath:spec.containers{network-metrics-daemon},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2632d7f05d5a992e91038ded81c715898f3fe803420a9b67a0201e9fd8075213\" in 12.031s (12.031s including waiting). 
Image size: 443291941 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:32:59.491450694 +0000 UTC m=+77.795434881,LastTimestamp:2025-12-05 12:32:59.491450694 +0000 UTC m=+77.795434881,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:36:48.618571 master-0 kubenswrapper[8731]: I1205 12:36:48.617984 8731 status_manager.go:851] "Failed to get status for pod" podUID="a2acba71-b9dc-4b85-be35-c995b8be2f19" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods cluster-node-tuning-operator-85cff47f46-p9xtc)" Dec 05 12:36:51.676817 master-0 kubenswrapper[8731]: I1205 12:36:51.676750 8731 patch_prober.go:28] interesting pod/etcd-operator-5bf4d88c6f-dxd24 container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" start-of-body= Dec 05 12:36:51.678827 master-0 kubenswrapper[8731]: I1205 12:36:51.678763 8731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" podUID="f119ffe4-16bd-49eb-916d-b18ba0d79b54" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" Dec 05 12:36:52.860991 master-0 kubenswrapper[8731]: E1205 12:36:52.860817 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 05 12:36:53.516533 master-0 kubenswrapper[8731]: I1205 12:36:53.516368 8731 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 12:36:54.507294 master-0 kubenswrapper[8731]: E1205 12:36:54.507131 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-bll66], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" podUID="7e562fda-e695-4218-a9cf-4179b8d456db" Dec 05 12:36:54.663492 master-0 kubenswrapper[8731]: I1205 12:36:54.663422 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:36:54.663492 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:36:54.663492 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:36:54.663492 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:36:54.663492 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:36:54.663492 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:36:54.663492 
master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:36:54.663492 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:36:54.663492 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:36:54.663492 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:36:54.663492 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:36:54.663492 master-0 kubenswrapper[8731]: I1205 12:36:54.663493 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:36:55.442406 master-0 kubenswrapper[8731]: I1205 12:36:55.441848 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:37:03.517018 master-0 kubenswrapper[8731]: I1205 12:37:03.516894 8731 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 12:37:03.668747 master-0 kubenswrapper[8731]: I1205 12:37:03.668664 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:37:03.668747 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:37:03.668747 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:37:03.668747 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:37:03.668747 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:37:03.668747 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:37:03.668747 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:37:03.668747 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:37:03.668747 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:37:03.668747 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:37:03.668747 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:37:03.668747 master-0 kubenswrapper[8731]: I1205 12:37:03.668749 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:37:04.655272 master-0 kubenswrapper[8731]: E1205 12:37:04.654996 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:36:54Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:36:54Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:36:54Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:36:54Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e3e65409fc2b27ad0aaeb500a39e264663d2980821f099b830b551785ce4ce8b\\\"],\\\"sizeBytes\\\":1631758507},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9014f384de5f9a0b7418d5869ad349abb9588d16bd09ed650a163c045315dbff\\\"],\\\"sizeBytes\\\":1232140918},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b472823604757237c2d16bd6f6221f4cf562aa3b05942c7f602e1e8b2e55a7c6\\\"],\\\"sizeBytes\\\":983705650},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\\\"],\\\"sizeBytes\\\":938303566},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:631a3798b749fecc041a99929eb946618df723e15055e805ff752a1a1273481c\\\"],\\\"sizeBytes\\\":870567329},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f1ca78c423f43f89a0411e40393642f64e4f8df9e5f61c25e31047c4cce170f9\\\"],\\\"sizeBytes\\\":857069957},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b12f830c3316aa4dc061c2d00c74126282b3e2bcccc301eab00d57fff3c4c7c\\\"],\\\"sizeBytes\\\":767284906},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cb3ec61f9a932a9ad13bdeb44bcf9477a8d5f728151d7f19ed3ef7d4b02b3a82\\\"],\\\"sizeBytes\\\":682371258},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:916566bb9d0143352324233d460ad94697719c11c8c9158e3aea8f475941751f\\\"],\\\"sizeBytes\\\":677523572},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5451aa441e5b8d8689c032405d410c8049a849ef2edf77e5b6a5ce2838c6569b\\\"],\\\"sizeBytes\\\":672407260},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9724d2036305cbd729e1f484c5bad89971de977fff8a6723fef1873858dd1123\\\"],\\\"sizeBytes\\\":616108962},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:df606f3b71d4376d1a2108c09f0d3dab455fc30bcb67c60e91590c105e9025bf\\\"],\\\"sizeBytes\\\":583836304},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:79f99fd6cce984287932edf0d009660bb488d663081f3d62ec3b23bc8bfbf6c2\\\"],\\\"sizeBytes\\\":576619763},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eddedae7578d79b5a3f748000ae5c00b9f14a04710f9f9ec7b52fc569be5dfb8\\\"],\\\"sizeBytes\\\":552673986},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:aa24edce3d740f84c40018e94cdbf2bc7375268d13d57c2d664e43a46ccea3fc\\\"],\\\"sizeBytes\\\":543227406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:188637a52cafee61ec461e92fb0c605e28be325b9ac1f2ac8a37d68e97654718\\\"],\\\"sizeBytes\\\":532719167},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cfde59e48cd5dee3721f34d249cb119cc3259fd857965d34f9c7ed83b0c363a1\\\"],\\\"sizeBytes\\\":532402162},{\\\"names\\\":[\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f4d4282cb53325e737ad68abbfcb70687ae04fb50353f4f0ba0ba5703b15009a\\\"],\\\"sizeBytes\\\":512838054},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:8c885ea0b3c5124989f0a9b93eba98eb9fca6bbd0262772d85d90bf713a4d572\\\"],\\\"sizeBytes\\\":512452153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0f43c31aa3359159d4557dad3cfaf812d8ce44db9cb9ae970e06d3479070b660\\\"],\\\"sizeBytes\\\":509437356},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e85850a4ae1a1e3ec2c590a4936d640882b6550124da22031c85b526afbf52df\\\"],\\\"sizeBytes\\\":507687221},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8375671da86aa527ee7e291d86971b0baa823ffc7663b5a983084456e76c0f59\\\"],\\\"sizeBytes\\\":506741476},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:831f30660844091d6154e2674d3a9da6f34271bf8a2c40b56f7416066318742b\\\"],\\\"sizeBytes\\\":505649178},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:86af77350cfe6fd69280157e4162aa0147873d9431c641ae4ad3e881ff768a73\\\"],\\\"sizeBytes\\\":505628211},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a824e468cf8dd61d347e35b2ee5bc2f815666957647098e21a1bb56ff613e5b9\\\"],\\\"sizeBytes\\\":503340749},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8139ed65c0a0a4b0f253b715c11cc52be027efe8a4774da9ccce35c78ef439da\\\"],\\\"sizeBytes\\\":503011144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8eabac819f289e29d75c7ab172d8124554849a47f0b00770928c3eb19a5a31c4\\\"],\\\"sizeBytes\\\":502436444},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10e57ca7611f79710f05777dc6a8f31c7e04eb09da4d8d793a5acfbf0e4692d7\\\"],\\\"sizeBytes\\\":500943492},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f042fa25014f3d37f3ea967d21f361d2a11833ae18f2c750318101b25d2497ce\\\"],\\\"sizeBytes\\\":500848684},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91af633e585621630c40d14f188e37d36b44678d0a59e582d850bf8d593d3a0c\\\"],\\\"sizeBytes\\\":499798563},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d64c13fe7663a0b4ae61d103b1b7598adcf317a01826f296bcb66b1a2de83c96\\\"],\\\"sizeBytes\\\":499705918},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:75d996f6147edb88c09fd1a052099de66638590d7d03a735006244bc9e19f898\\\"],\\\"sizeBytes\\\":499082775},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1f952cec1e5332b84bdffa249cd426f39087058d6544ddcec650a414c15a9b68\\\"],\\\"sizeBytes\\\":489528665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c416b201d480bddb5a4960ec42f4740761a1335001cf84ba5ae19ad6857771b1\\\"],\\\"sizeBytes\\\":481559117},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3a77aa4d03b89ea284e3467a268e5989a77a2ef63e685eb1d5c5ea5b3922b7a\\\"],\\\"sizeBytes\\\":478917802},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3c1edf52f70bf9b1d1457e0c4111bc79cdaa1edd659ddbdb9d8176eff8b46956\\\"],\\\"sizeBytes\\\":462727837},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69ffd8f8dcceedc2d6eb306cea33f8beabc1be1308cd5f4ee8b9a8e3eab9843\\\"],\\\"sizeBytes\\\":459552216},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d3ce2cbf1032ad0f24f204db73687002fcf302e86ebde394
5801c74351b64576\\\"],\\\"sizeBytes\\\":458169255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7664a2d4cb10e82ed32abbf95799f43fc3d10135d7dd94799730de504a89680a\\\"],\\\"sizeBytes\\\":452589750},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4ecc5bac651ff1942865baee5159582e9602c89b47eeab18400a32abcba8f690\\\"],\\\"sizeBytes\\\":451039520},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2632d7f05d5a992e91038ded81c715898f3fe803420a9b67a0201e9fd8075213\\\"],\\\"sizeBytes\\\":443291941},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f0aa9cd04713acc5c6fea721bd849e1500da8ae945e0b32000887f34d786e0b\\\"],\\\"sizeBytes\\\":442509555},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e438b814f8e16f00b3fc4b69991af80eee79ae111d2a707f34aa64b2ccbb6eb\\\"],\\\"sizeBytes\\\":437737925},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a3d37aa7a22c68afa963ecfb4b43c52cccf152580cd66e4d5382fb69e4037cc\\\"],\\\"sizeBytes\\\":406053031},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9432c13d76bd4ba4eb9197c050cf88c0d701fa2055eeb59257e2e23901f9fdff\\\"],\\\"sizeBytes\\\":401810450},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a70b2a95140d1e90978f36cc9889013ae34bd232662c5424002274385669ed9\\\"],\\\"sizeBytes\\\":390989693}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:37:09.863238 master-0 kubenswrapper[8731]: E1205 12:37:09.862525 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 05 12:37:11.162316 master-0 kubenswrapper[8731]: I1205 12:37:11.162259 8731 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": read tcp 192.168.32.10:45604->192.168.32.10:10257: read: connection reset by peer" Dec 05 12:37:11.549506 master-0 kubenswrapper[8731]: I1205 12:37:11.549351 8731 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="ac54524887aecec21958b7b4fb65da11e780a16d7a6537965df0e9b00dd407c3" exitCode=1 Dec 05 12:37:12.675872 master-0 kubenswrapper[8731]: I1205 12:37:12.675719 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:37:12.675872 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:37:12.675872 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:37:12.675872 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:37:12.675872 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:37:12.675872 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:37:12.675872 master-0 kubenswrapper[8731]: 
[+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:37:12.675872 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:37:12.675872 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:37:12.675872 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:37:12.675872 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:37:12.675872 master-0 kubenswrapper[8731]: I1205 12:37:12.675804 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:37:14.656513 master-0 kubenswrapper[8731]: E1205 12:37:14.656342 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:37:17.994019 master-0 kubenswrapper[8731]: E1205 12:37:17.993524 8731 projected.go:194] Error preparing data for projected volume kube-api-access-bll66 for pod openshift-controller-manager/controller-manager-675db9579f-4dcg8: failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Dec 05 12:37:17.996111 master-0 kubenswrapper[8731]: E1205 12:37:17.994063 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66 podName:7e562fda-e695-4218-a9cf-4179b8d456db nodeName:}" failed. No retries permitted until 2025-12-05 12:37:49.994024978 +0000 UTC m=+368.298009185 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-bll66" (UniqueName: "kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66") pod "controller-manager-675db9579f-4dcg8" (UID: "7e562fda-e695-4218-a9cf-4179b8d456db") : failed to fetch token: Timeout: request did not complete within requested timeout - context deadline exceeded Dec 05 12:37:18.043423 master-0 kubenswrapper[8731]: E1205 12:37:18.043342 8731 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 05 12:37:18.043746 master-0 kubenswrapper[8731]: E1205 12:37:18.043582 8731 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.014s" Dec 05 12:37:18.043746 master-0 kubenswrapper[8731]: I1205 12:37:18.043610 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" event={"ID":"ba095394-1873-4793-969d-3be979fa0771","Type":"ContainerDied","Data":"a4430062c5adda1c62354e9a698c163c97a33327be32fd67d0fc627123050dbf"} Dec 05 12:37:18.043746 master-0 kubenswrapper[8731]: I1205 12:37:18.043736 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" event={"ID":"5efad170-c154-42ec-a7c0-b36a98d2bfcc","Type":"ContainerDied","Data":"0caaca757a34c0215195111520c95615b587485cd660ccd63c3b233f466666bb"} Dec 05 12:37:18.043847 master-0 kubenswrapper[8731]: I1205 12:37:18.043761 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:37:18.043847 master-0 kubenswrapper[8731]: I1205 12:37:18.043782 8731 status_manager.go:317] "Container readiness changed for unknown container" pod="kube-system/bootstrap-kube-controller-manager-master-0" containerID="cri-o://880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5" Dec 05 12:37:18.043847 master-0 kubenswrapper[8731]: I1205 12:37:18.043795 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:37:18.043847 master-0 kubenswrapper[8731]: I1205 12:37:18.043819 8731 status_manager.go:379] "Container startup changed for unknown container" pod="kube-system/bootstrap-kube-controller-manager-master-0" containerID="cri-o://880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5" Dec 05 12:37:18.043847 master-0 kubenswrapper[8731]: I1205 12:37:18.043834 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:37:18.044028 master-0 kubenswrapper[8731]: I1205 12:37:18.043863 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:37:18.044028 master-0 kubenswrapper[8731]: I1205 12:37:18.043878 8731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:37:18.044028 master-0 kubenswrapper[8731]: I1205 12:37:18.043892 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" 
event={"ID":"594aaded-5615-4bed-87ee-6173059a73be","Type":"ContainerDied","Data":"b351d2f70dc6ca77a15619a3104c4ce47b9bc5e14772befd2755648b695c45dd"} Dec 05 12:37:18.044028 master-0 kubenswrapper[8731]: I1205 12:37:18.043979 8731 scope.go:117] "RemoveContainer" containerID="a4430062c5adda1c62354e9a698c163c97a33327be32fd67d0fc627123050dbf" Dec 05 12:37:18.046074 master-0 kubenswrapper[8731]: I1205 12:37:18.045997 8731 scope.go:117] "RemoveContainer" containerID="98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad" Dec 05 12:37:18.046155 master-0 kubenswrapper[8731]: I1205 12:37:18.046090 8731 scope.go:117] "RemoveContainer" containerID="ac54524887aecec21958b7b4fb65da11e780a16d7a6537965df0e9b00dd407c3" Dec 05 12:37:18.046362 master-0 kubenswrapper[8731]: I1205 12:37:18.046266 8731 scope.go:117] "RemoveContainer" containerID="6017aa71f5b63d7cdd32da77565edb00ec8b8b5e5059d49e4246bd4f05d6b50b" Dec 05 12:37:18.046362 master-0 kubenswrapper[8731]: E1205 12:37:18.046337 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:37:18.051774 master-0 kubenswrapper[8731]: I1205 12:37:18.051669 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 05 12:37:18.088437 master-0 kubenswrapper[8731]: I1205 12:37:18.088384 8731 scope.go:117] "RemoveContainer" containerID="0caaca757a34c0215195111520c95615b587485cd660ccd63c3b233f466666bb" Dec 05 12:37:18.352838 master-0 kubenswrapper[8731]: I1205 12:37:18.352782 8731 scope.go:117] "RemoveContainer" containerID="b351d2f70dc6ca77a15619a3104c4ce47b9bc5e14772befd2755648b695c45dd" Dec 05 12:37:18.597279 master-0 kubenswrapper[8731]: I1205 12:37:18.597034 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f85974995-4vsjv_1871a9d6-6369-4d08-816f-9c6310b61ddf/kube-scheduler-operator-container/1.log" Dec 05 12:37:18.600283 master-0 kubenswrapper[8731]: I1205 12:37:18.600159 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/1.log" Dec 05 12:37:18.601115 master-0 kubenswrapper[8731]: I1205 12:37:18.601064 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/0.log" Dec 05 12:37:18.604084 master-0 kubenswrapper[8731]: I1205 12:37:18.604023 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-848f645654-g6nj5_594aaded-5615-4bed-87ee-6173059a73be/kube-controller-manager-operator/1.log" Dec 05 12:37:18.607355 master-0 kubenswrapper[8731]: I1205 12:37:18.607300 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-xxmfp_ba095394-1873-4793-969d-3be979fa0771/authentication-operator/1.log" Dec 05 12:37:18.610969 master-0 kubenswrapper[8731]: I1205 12:37:18.610810 8731 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-operator_network-operator-79767b7ff9-h8qkj_5efad170-c154-42ec-a7c0-b36a98d2bfcc/network-operator/1.log" Dec 05 12:37:20.255825 master-0 kubenswrapper[8731]: E1205 12:37:20.255086 8731 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event=< Dec 05 12:37:20.255825 master-0 kubenswrapper[8731]: &Event{ObjectMeta:{apiserver-5bdfbf6949-2bhqv.187e51be004b087f openshift-oauth-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-oauth-apiserver,Name:apiserver-5bdfbf6949-2bhqv,UID:d72b2b71-27b2-4aff-bf69-7054a9556318,APIVersion:v1,ResourceVersion:6623,FieldPath:spec.containers{oauth-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Dec 05 12:37:20.255825 master-0 kubenswrapper[8731]: body: [+]ping ok Dec 05 12:37:20.255825 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:37:20.255825 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:37:20.255825 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:37:20.255825 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:37:20.255825 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:37:20.255825 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:37:20.255825 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:37:20.255825 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:37:20.255825 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:37:20.255825 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:37:20.255825 master-0 kubenswrapper[8731]: Dec 05 12:37:20.255825 master-0 kubenswrapper[8731]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:33:00.466366591 +0000 UTC m=+78.770350758,LastTimestamp:2025-12-05 12:33:00.466366591 +0000 UTC m=+78.770350758,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Dec 05 12:37:20.255825 master-0 kubenswrapper[8731]: > Dec 05 12:37:21.683410 master-0 kubenswrapper[8731]: I1205 12:37:21.683307 8731 patch_prober.go:28] interesting pod/apiserver-5bdfbf6949-2bhqv container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 12:37:21.683410 master-0 kubenswrapper[8731]: [+]log ok Dec 05 12:37:21.683410 master-0 kubenswrapper[8731]: [-]etcd failed: reason withheld Dec 05 12:37:21.683410 master-0 kubenswrapper[8731]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 12:37:21.683410 master-0 kubenswrapper[8731]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 12:37:21.683410 master-0 kubenswrapper[8731]: [+]poststarthook/max-in-flight-filter ok Dec 05 12:37:21.683410 master-0 kubenswrapper[8731]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 12:37:21.683410 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartUserInformer ok Dec 05 12:37:21.683410 master-0 kubenswrapper[8731]: [+]poststarthook/openshift.io-StartOAuthInformer ok Dec 05 12:37:21.683410 master-0 
kubenswrapper[8731]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Dec 05 12:37:21.683410 master-0 kubenswrapper[8731]: livez check failed Dec 05 12:37:21.683410 master-0 kubenswrapper[8731]: I1205 12:37:21.683407 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:37:22.065704 master-0 kubenswrapper[8731]: I1205 12:37:22.065611 8731 patch_prober.go:28] interesting pod/authentication-operator-6c968fdfdf-xxmfp container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 12:37:22.065975 master-0 kubenswrapper[8731]: I1205 12:37:22.065718 8731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" podUID="ba095394-1873-4793-969d-3be979fa0771" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 12:37:24.656705 master-0 kubenswrapper[8731]: E1205 12:37:24.656580 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": context deadline exceeded" Dec 05 12:37:26.865001 master-0 kubenswrapper[8731]: E1205 12:37:26.864915 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 05 12:37:32.065367 master-0 kubenswrapper[8731]: I1205 12:37:32.065230 8731 patch_prober.go:28] interesting pod/authentication-operator-6c968fdfdf-xxmfp container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 12:37:32.065367 master-0 kubenswrapper[8731]: I1205 12:37:32.065356 8731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" podUID="ba095394-1873-4793-969d-3be979fa0771" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 12:37:34.657524 master-0 kubenswrapper[8731]: E1205 12:37:34.657392 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:37:40.749860 master-0 kubenswrapper[8731]: I1205 12:37:40.749681 8731 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-b9c5dfc78-2n8gt_7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/kube-storage-version-migrator-operator/2.log" Dec 05 12:37:40.751514 master-0 kubenswrapper[8731]: I1205 12:37:40.751139 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-b9c5dfc78-2n8gt_7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/kube-storage-version-migrator-operator/1.log" Dec 05 12:37:40.752172 master-0 kubenswrapper[8731]: I1205 12:37:40.752099 8731 generic.go:334] "Generic (PLEG): container finished" podID="7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9" containerID="49b30bfe4873642e053b957d836ec7eddcac24e11ada8325b8fc8b72bfefafe8" exitCode=255 Dec 05 12:37:41.065550 master-0 kubenswrapper[8731]: I1205 12:37:41.065479 8731 patch_prober.go:28] interesting pod/authentication-operator-6c968fdfdf-xxmfp container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Dec 05 12:37:41.065550 master-0 kubenswrapper[8731]: I1205 12:37:41.065549 8731 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" podUID="ba095394-1873-4793-969d-3be979fa0771" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Dec 05 12:37:41.760648 master-0 kubenswrapper[8731]: I1205 12:37:41.760589 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-765d9ff747-rw57t_807d9093-aa67-4840-b5be-7f3abcc1beed/kube-apiserver-operator/2.log" Dec 05 12:37:41.762305 master-0 kubenswrapper[8731]: I1205 12:37:41.761076 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-765d9ff747-rw57t_807d9093-aa67-4840-b5be-7f3abcc1beed/kube-apiserver-operator/1.log" Dec 05 12:37:41.762305 master-0 kubenswrapper[8731]: I1205 12:37:41.761446 8731 generic.go:334] "Generic (PLEG): container finished" podID="807d9093-aa67-4840-b5be-7f3abcc1beed" containerID="57b070ec273f7f37c3345984984d73c010928b9bb1f5746c1d4e18feee8dc2dc" exitCode=255 Dec 05 12:37:41.763376 master-0 kubenswrapper[8731]: I1205 12:37:41.763345 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-848f645654-g6nj5_594aaded-5615-4bed-87ee-6173059a73be/kube-controller-manager-operator/2.log" Dec 05 12:37:41.763844 master-0 kubenswrapper[8731]: I1205 12:37:41.763792 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-848f645654-g6nj5_594aaded-5615-4bed-87ee-6173059a73be/kube-controller-manager-operator/1.log" Dec 05 12:37:41.763844 master-0 kubenswrapper[8731]: I1205 12:37:41.763824 8731 generic.go:334] "Generic (PLEG): container finished" podID="594aaded-5615-4bed-87ee-6173059a73be" containerID="7a5d71f74727c7976f8b5ae99bcc9e973922d3b63fcffbddb6ae9dd46ae8aa22" exitCode=255 Dec 05 12:37:41.765961 master-0 kubenswrapper[8731]: I1205 12:37:41.765921 8731 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-56fcb6cc5f-q9njf_ce3d73c1-f4bd-4c91-936a-086dfa5e3460/cluster-olm-operator/1.log" Dec 05 12:37:41.766954 master-0 kubenswrapper[8731]: I1205 12:37:41.766912 8731 generic.go:334] "Generic (PLEG): container finished" podID="ce3d73c1-f4bd-4c91-936a-086dfa5e3460" containerID="551cae19e34d48fca769f493289bed8d81be6209860af5e4e43fa9850482cf12" exitCode=255 Dec 05 12:37:41.768811 master-0 kubenswrapper[8731]: I1205 12:37:41.768767 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-546vz_d53a4886-db25-43a1-825a-66a9a9a58590/openshift-controller-manager-operator/2.log" Dec 05 12:37:41.769702 master-0 kubenswrapper[8731]: I1205 12:37:41.769633 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-546vz_d53a4886-db25-43a1-825a-66a9a9a58590/openshift-controller-manager-operator/1.log" Dec 05 12:37:41.770588 master-0 kubenswrapper[8731]: I1205 12:37:41.770544 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-546vz_d53a4886-db25-43a1-825a-66a9a9a58590/openshift-controller-manager-operator/0.log" Dec 05 12:37:41.770658 master-0 kubenswrapper[8731]: I1205 12:37:41.770587 8731 generic.go:334] "Generic (PLEG): container finished" podID="d53a4886-db25-43a1-825a-66a9a9a58590" containerID="d299754c006efdd8044d5689f796048069701078c77f429aafcfe9eafc6522ec" exitCode=255 Dec 05 12:37:41.773587 master-0 kubenswrapper[8731]: I1205 12:37:41.773491 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-79767b7ff9-h8qkj_5efad170-c154-42ec-a7c0-b36a98d2bfcc/network-operator/2.log" Dec 05 12:37:41.774104 master-0 kubenswrapper[8731]: I1205 12:37:41.774071 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-79767b7ff9-h8qkj_5efad170-c154-42ec-a7c0-b36a98d2bfcc/network-operator/1.log" Dec 05 12:37:41.774211 master-0 kubenswrapper[8731]: I1205 12:37:41.774124 8731 generic.go:334] "Generic (PLEG): container finished" podID="5efad170-c154-42ec-a7c0-b36a98d2bfcc" containerID="3507046fb798191fc9af19cde11e6d29feed57ad8ee65fd82dade1b688773700" exitCode=255 Dec 05 12:37:41.777198 master-0 kubenswrapper[8731]: I1205 12:37:41.777134 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5bf4d88c6f-dxd24_f119ffe4-16bd-49eb-916d-b18ba0d79b54/etcd-operator/2.log" Dec 05 12:37:41.777756 master-0 kubenswrapper[8731]: I1205 12:37:41.777726 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5bf4d88c6f-dxd24_f119ffe4-16bd-49eb-916d-b18ba0d79b54/etcd-operator/1.log" Dec 05 12:37:41.777799 master-0 kubenswrapper[8731]: I1205 12:37:41.777759 8731 generic.go:334] "Generic (PLEG): container finished" podID="f119ffe4-16bd-49eb-916d-b18ba0d79b54" containerID="1313a091281e52fd967237a2bce92f6479da56820a8f33315dc25623f650fa7b" exitCode=255 Dec 05 12:37:41.780816 master-0 kubenswrapper[8731]: I1205 12:37:41.780766 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-7bf7f6b755-b2pxs_4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/openshift-apiserver-operator/1.log" Dec 05 12:37:41.781369 master-0 kubenswrapper[8731]: I1205 
12:37:41.781326 8731 generic.go:334] "Generic (PLEG): container finished" podID="4b7f0d8d-a2bf-4550-b6e6-1c56adae827e" containerID="45f66b524a7ae3f72102f1c1f147264a8f0120c9900c09db2778e03215486e8a" exitCode=255 Dec 05 12:37:41.784167 master-0 kubenswrapper[8731]: I1205 12:37:41.784128 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-77758bc754-hfqsp_f3792522-fec6-4022-90ac-0b8467fcd625/service-ca-operator/1.log" Dec 05 12:37:41.785070 master-0 kubenswrapper[8731]: I1205 12:37:41.784938 8731 generic.go:334] "Generic (PLEG): container finished" podID="f3792522-fec6-4022-90ac-0b8467fcd625" containerID="e23a95f64400e0dbc7fc95b5f04dddc2d0290c35a1bcdf186ddbd3cd8314a14f" exitCode=255 Dec 05 12:37:41.786937 master-0 kubenswrapper[8731]: I1205 12:37:41.786885 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-xxmfp_ba095394-1873-4793-969d-3be979fa0771/authentication-operator/2.log" Dec 05 12:37:41.787532 master-0 kubenswrapper[8731]: I1205 12:37:41.787466 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-xxmfp_ba095394-1873-4793-969d-3be979fa0771/authentication-operator/1.log" Dec 05 12:37:41.787532 master-0 kubenswrapper[8731]: I1205 12:37:41.787522 8731 generic.go:334] "Generic (PLEG): container finished" podID="ba095394-1873-4793-969d-3be979fa0771" containerID="23c3457474b764927ea9138fc09fe8b0080b3d5dcfc7a8c9d9bd33c7ad79d98a" exitCode=255 Dec 05 12:37:43.801557 master-0 kubenswrapper[8731]: I1205 12:37:43.801456 8731 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="878914476f342bbe09935d11750836541a3cd256e73418d2dbee280993c5f191" exitCode=0 Dec 05 12:37:43.803375 master-0 kubenswrapper[8731]: I1205 12:37:43.803273 8731 generic.go:334] "Generic (PLEG): container finished" podID="cf8247a1-703a-46b3-9a33-25a73b27ab99" containerID="48561b92390271bf5bcb9ad8430184be011980a59e84d647901af23e4a1dea25" exitCode=0 Dec 05 12:37:44.658957 master-0 kubenswrapper[8731]: E1205 12:37:44.658852 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:37:44.658957 master-0 kubenswrapper[8731]: E1205 12:37:44.658913 8731 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 12:37:45.551090 master-0 kubenswrapper[8731]: I1205 12:37:45.550965 8731 prober.go:107] "Probe failed" probeType="Liveness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 05 12:37:46.487346 master-0 kubenswrapper[8731]: I1205 12:37:46.487234 8731 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Dec 05 12:37:48.620875 master-0 kubenswrapper[8731]: I1205 12:37:48.620764 8731 status_manager.go:851] "Failed 
to get status for pod" podUID="d53a4886-db25-43a1-825a-66a9a9a58590" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods openshift-controller-manager-operator-6c8676f99d-546vz)" Dec 05 12:37:48.835959 master-0 kubenswrapper[8731]: I1205 12:37:48.835853 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/2.log" Dec 05 12:37:48.836432 master-0 kubenswrapper[8731]: I1205 12:37:48.836400 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/1.log" Dec 05 12:37:48.836937 master-0 kubenswrapper[8731]: I1205 12:37:48.836840 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/0.log" Dec 05 12:37:48.836937 master-0 kubenswrapper[8731]: I1205 12:37:48.836897 8731 generic.go:334] "Generic (PLEG): container finished" podID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" containerID="142cac7db86d510a3cb1fe121b732aea43f370fa9eb0fe98a9655b028e584160" exitCode=1 Dec 05 12:37:49.996328 master-0 kubenswrapper[8731]: I1205 12:37:49.996229 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bll66\" (UniqueName: \"kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:37:50.372205 master-0 kubenswrapper[8731]: I1205 12:37:50.371980 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bll66\" (UniqueName: \"kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66\") pod \"controller-manager-675db9579f-4dcg8\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:37:50.586642 master-0 kubenswrapper[8731]: E1205 12:37:50.586548 8731 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="32.543s" Dec 05 12:37:50.597359 master-0 kubenswrapper[8731]: I1205 12:37:50.597277 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608042 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608133 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608224 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608255 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Dec 05 
12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608279 8731 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="5d796e63-a21d-4437-b230-46a1d99f072b" Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608313 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" event={"ID":"807d9093-aa67-4840-b5be-7f3abcc1beed","Type":"ContainerDied","Data":"f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0"} Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608371 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608404 8731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608431 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608456 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-xwx26" event={"ID":"b8233dad-bd19-4842-a4d5-cfa84f1feb83","Type":"ContainerDied","Data":"41718b57d6d2e36d2cb94e43774b239e600e6619dc10d3c14a0345e610d821c2"} Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608491 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608671 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608696 8731 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="5d796e63-a21d-4437-b230-46a1d99f072b" Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608719 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerDied","Data":"dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8"} Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608754 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5"} Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608784 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"076dafdf-a5d2-4e2d-9c38-6932910f7327","Type":"ContainerDied","Data":"f8dc47e77bee6411ef3a450c0123b8279b91a4729700211ae01112ac79fa1d1e"} Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608819 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" 
event={"ID":"ba095394-1873-4793-969d-3be979fa0771","Type":"ContainerStarted","Data":"2292fdf5304b8a28a27399e75d5a9964b3c2748ef25e388360a2e0b43dad6994"} Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608846 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-xwx26" event={"ID":"b8233dad-bd19-4842-a4d5-cfa84f1feb83","Type":"ContainerStarted","Data":"efd0af11329fc9886861d20bcf790f4afa476fb62b8a37aabf75eec470dca7ba"} Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608873 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" event={"ID":"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9","Type":"ContainerStarted","Data":"e10ca0df40eb2db4bf995662f42f3b8f09e91dda57cb740df7f1668ef9a54497"} Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608899 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" event={"ID":"f119ffe4-16bd-49eb-916d-b18ba0d79b54","Type":"ContainerStarted","Data":"a782e082d4327637a0dc3ae6b2947858a31ed6dd6d18a60f26c1d1533bc0ed77"} Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608925 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" event={"ID":"1871a9d6-6369-4d08-816f-9c6310b61ddf","Type":"ContainerStarted","Data":"6017aa71f5b63d7cdd32da77565edb00ec8b8b5e5059d49e4246bd4f05d6b50b"} Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608951 8731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.608974 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"076dafdf-a5d2-4e2d-9c38-6932910f7327","Type":"ContainerDied","Data":"26722ad2bd6e7ca8bda35211d0d46cd57e0c0ba5a29870576dae6f8264697434"} Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.609011 8731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26722ad2bd6e7ca8bda35211d0d46cd57e0c0ba5a29870576dae6f8264697434" Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.609036 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" event={"ID":"807d9093-aa67-4840-b5be-7f3abcc1beed","Type":"ContainerStarted","Data":"1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae"} Dec 05 12:37:50.608973 master-0 kubenswrapper[8731]: I1205 12:37:50.609061 8731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609084 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" event={"ID":"594aaded-5615-4bed-87ee-6173059a73be","Type":"ContainerStarted","Data":"1b060274049216c69ff594edc9d2a695d110f82d29ed8c698a35ad2511d80237"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609112 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" event={"ID":"5efad170-c154-42ec-a7c0-b36a98d2bfcc","Type":"ContainerStarted","Data":"2353edada22e30fc07f29e9d6b8499dfe371b7dc7d2795c8973bfc870b0b89fa"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609137 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"d627fcf3-2a80-4739-add9-e21ad4efc6eb","Type":"ContainerDied","Data":"f05e889af7263b1ba07fc81648bcfe8b4d672794681d2558bab4fed4dcbd28ca"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609163 8731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f05e889af7263b1ba07fc81648bcfe8b4d672794681d2558bab4fed4dcbd28ca" Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609283 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" event={"ID":"d53a4886-db25-43a1-825a-66a9a9a58590","Type":"ContainerDied","Data":"0ca651057443d22827f48087f13a7a3218451ee691e2f2aee7a07437d8b2d6ee"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609321 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"565d5ef6-b0e7-4f04-9460-61f1d3903d37","Type":"ContainerDied","Data":"1cb443e02b64a65178050b34e99e50f308c86d2ef5b4e7e730bfa0faf58cc53e"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609352 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerDied","Data":"880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609379 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" event={"ID":"153fec1f-a10b-4c6c-a997-60fa80c13a86","Type":"ContainerDied","Data":"b02b74337c561023bb77d95397661e10a1ee5fc12d28b2fd7ee9556bbaba81e5"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609408 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" event={"ID":"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8","Type":"ContainerDied","Data":"5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609444 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" event={"ID":"3b741029-0eb5-409b-b7f1-95e8385dc400","Type":"ContainerDied","Data":"73f6bfa12151c71020cd1cc8c48ebdf6c4c24dbf1a05b4873ce05f073bdcce94"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609471 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" event={"ID":"58187662-b502-4d90-95ce-2aa91a81d256","Type":"ContainerDied","Data":"d0c256d51be6b67ce11d61e05ebacd6a747bab028d852541d977f5d77734ba1a"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609497 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" 
event={"ID":"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7","Type":"ContainerDied","Data":"25a1113bac1425c0d6b5254d5067b012732c090d8f467edda97019523a2d47be"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609530 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" event={"ID":"ce3d73c1-f4bd-4c91-936a-086dfa5e3460","Type":"ContainerDied","Data":"e661aee8169481bd45ddc453eab7e9b725569fcef2029fd7e4e16d66fbcedf39"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609557 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" event={"ID":"f3792522-fec6-4022-90ac-0b8467fcd625","Type":"ContainerDied","Data":"eb12d89ac382a5bb5bdc3b8dbfd70aaf80443c6890bbd6d374803fc81c9ff457"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609585 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" event={"ID":"a757f807-e1bf-4f1e-9787-6b4acc8d09cf","Type":"ContainerDied","Data":"c63a8034e23c88dd09173f57e05eee7c9bc26e35890cfdd9f1fdc8ef0e16d843"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609611 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" event={"ID":"3b741029-0eb5-409b-b7f1-95e8385dc400","Type":"ContainerStarted","Data":"712a042e7fad33cd815c939ca364362f7220e1f1ce6096f34de7bd5630509fb8"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609641 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" event={"ID":"594aaded-5615-4bed-87ee-6173059a73be","Type":"ContainerDied","Data":"1b060274049216c69ff594edc9d2a695d110f82d29ed8c698a35ad2511d80237"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609668 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" event={"ID":"ba095394-1873-4793-969d-3be979fa0771","Type":"ContainerDied","Data":"2292fdf5304b8a28a27399e75d5a9964b3c2748ef25e388360a2e0b43dad6994"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609696 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" event={"ID":"5efad170-c154-42ec-a7c0-b36a98d2bfcc","Type":"ContainerDied","Data":"2353edada22e30fc07f29e9d6b8499dfe371b7dc7d2795c8973bfc870b0b89fa"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609723 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" event={"ID":"f119ffe4-16bd-49eb-916d-b18ba0d79b54","Type":"ContainerDied","Data":"a782e082d4327637a0dc3ae6b2947858a31ed6dd6d18a60f26c1d1533bc0ed77"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609750 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" event={"ID":"1871a9d6-6369-4d08-816f-9c6310b61ddf","Type":"ContainerDied","Data":"6017aa71f5b63d7cdd32da77565edb00ec8b8b5e5059d49e4246bd4f05d6b50b"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609778 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" event={"ID":"807d9093-aa67-4840-b5be-7f3abcc1beed","Type":"ContainerDied","Data":"1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609803 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" event={"ID":"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9","Type":"ContainerDied","Data":"e10ca0df40eb2db4bf995662f42f3b8f09e91dda57cb740df7f1668ef9a54497"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609831 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" event={"ID":"807d9093-aa67-4840-b5be-7f3abcc1beed","Type":"ContainerStarted","Data":"57b070ec273f7f37c3345984984d73c010928b9bb1f5746c1d4e18feee8dc2dc"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609859 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"ac54524887aecec21958b7b4fb65da11e780a16d7a6537965df0e9b00dd407c3"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609884 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" event={"ID":"5efad170-c154-42ec-a7c0-b36a98d2bfcc","Type":"ContainerStarted","Data":"3507046fb798191fc9af19cde11e6d29feed57ad8ee65fd82dade1b688773700"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609907 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" event={"ID":"a757f807-e1bf-4f1e-9787-6b4acc8d09cf","Type":"ContainerStarted","Data":"11b2f7447856caa8c6cb51432e0d7392e86f64482a9fae5b398a57d71719f20e"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609929 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" event={"ID":"153fec1f-a10b-4c6c-a997-60fa80c13a86","Type":"ContainerStarted","Data":"7b0e1392f4706a31c5e08db223b1244b230bf09a2ede6f19588e74a4a3860cf4"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609953 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" event={"ID":"58187662-b502-4d90-95ce-2aa91a81d256","Type":"ContainerStarted","Data":"7c715e090bdbc7252a3de31126638fe765c309cab209969215dce8cf6f422ab7"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609975 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" event={"ID":"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9","Type":"ContainerStarted","Data":"49b30bfe4873642e053b957d836ec7eddcac24e11ada8325b8fc8b72bfefafe8"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.609999 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" event={"ID":"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e","Type":"ContainerStarted","Data":"45f66b524a7ae3f72102f1c1f147264a8f0120c9900c09db2778e03215486e8a"} Dec 05 12:37:50.610729 master-0 
kubenswrapper[8731]: I1205 12:37:50.610024 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" event={"ID":"f3792522-fec6-4022-90ac-0b8467fcd625","Type":"ContainerStarted","Data":"e23a95f64400e0dbc7fc95b5f04dddc2d0290c35a1bcdf186ddbd3cd8314a14f"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610047 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" event={"ID":"ba095394-1873-4793-969d-3be979fa0771","Type":"ContainerStarted","Data":"23c3457474b764927ea9138fc09fe8b0080b3d5dcfc7a8c9d9bd33c7ad79d98a"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610069 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"565d5ef6-b0e7-4f04-9460-61f1d3903d37","Type":"ContainerDied","Data":"ffedcf7b097d85236cfda3f347741e8721f2f2b5597465d279b812038c00b460"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610094 8731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffedcf7b097d85236cfda3f347741e8721f2f2b5597465d279b812038c00b460" Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610118 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" event={"ID":"f119ffe4-16bd-49eb-916d-b18ba0d79b54","Type":"ContainerStarted","Data":"1313a091281e52fd967237a2bce92f6479da56820a8f33315dc25623f650fa7b"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610139 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" event={"ID":"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8","Type":"ContainerStarted","Data":"98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610161 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" event={"ID":"594aaded-5615-4bed-87ee-6173059a73be","Type":"ContainerStarted","Data":"7a5d71f74727c7976f8b5ae99bcc9e973922d3b63fcffbddb6ae9dd46ae8aa22"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610240 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" event={"ID":"d53a4886-db25-43a1-825a-66a9a9a58590","Type":"ContainerStarted","Data":"d299754c006efdd8044d5689f796048069701078c77f429aafcfe9eafc6522ec"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610266 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" event={"ID":"ce3d73c1-f4bd-4c91-936a-086dfa5e3460","Type":"ContainerStarted","Data":"551cae19e34d48fca769f493289bed8d81be6209860af5e4e43fa9850482cf12"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610287 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" event={"ID":"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7","Type":"ContainerStarted","Data":"35aadb9bbeac01f2246017a0a2cf81423a3d53a1924e285f6675485164555604"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610315 8731 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerStarted","Data":"073eeb295461d6cfe17793b727c36b1a0795b59c33714c128e08740e09c87106"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610337 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerStarted","Data":"00f11e6defd30a3258a136b83ab00d656bb56a377cbe07aa4f6425fb339a65fe"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610359 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerStarted","Data":"b6aa6b1922706ed7b2ddfb61ba9e6938912e45e804a0e3f6e5253251e33b6f4e"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610380 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerStarted","Data":"dd2d6c7cdd5eab77e600768a9929fb4a53e0d7ace9dc3035b564a4d26b57a2ca"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610401 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"c24e01603234fe8003f8aae8171b0065","Type":"ContainerStarted","Data":"765a08d8e028edbc45c6c2083dfc23ad6392f98fa33616533c24f46e8e8af646"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610423 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" event={"ID":"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8","Type":"ContainerDied","Data":"98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610455 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerDied","Data":"ac54524887aecec21958b7b4fb65da11e780a16d7a6537965df0e9b00dd407c3"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610480 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" event={"ID":"1871a9d6-6369-4d08-816f-9c6310b61ddf","Type":"ContainerStarted","Data":"e133f763b4090ab8deffe912e58f36acd0db95abe046782abfe8e5f290664368"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610505 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" event={"ID":"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8","Type":"ContainerStarted","Data":"142cac7db86d510a3cb1fe121b732aea43f370fa9eb0fe98a9655b028e584160"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610537 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" event={"ID":"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9","Type":"ContainerDied","Data":"49b30bfe4873642e053b957d836ec7eddcac24e11ada8325b8fc8b72bfefafe8"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610563 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" 
event={"ID":"807d9093-aa67-4840-b5be-7f3abcc1beed","Type":"ContainerDied","Data":"57b070ec273f7f37c3345984984d73c010928b9bb1f5746c1d4e18feee8dc2dc"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610591 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" event={"ID":"594aaded-5615-4bed-87ee-6173059a73be","Type":"ContainerDied","Data":"7a5d71f74727c7976f8b5ae99bcc9e973922d3b63fcffbddb6ae9dd46ae8aa22"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610616 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" event={"ID":"ce3d73c1-f4bd-4c91-936a-086dfa5e3460","Type":"ContainerDied","Data":"551cae19e34d48fca769f493289bed8d81be6209860af5e4e43fa9850482cf12"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610645 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" event={"ID":"d53a4886-db25-43a1-825a-66a9a9a58590","Type":"ContainerDied","Data":"d299754c006efdd8044d5689f796048069701078c77f429aafcfe9eafc6522ec"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610672 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" event={"ID":"5efad170-c154-42ec-a7c0-b36a98d2bfcc","Type":"ContainerDied","Data":"3507046fb798191fc9af19cde11e6d29feed57ad8ee65fd82dade1b688773700"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610696 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" event={"ID":"f119ffe4-16bd-49eb-916d-b18ba0d79b54","Type":"ContainerDied","Data":"1313a091281e52fd967237a2bce92f6479da56820a8f33315dc25623f650fa7b"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610721 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" event={"ID":"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e","Type":"ContainerDied","Data":"45f66b524a7ae3f72102f1c1f147264a8f0120c9900c09db2778e03215486e8a"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610746 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" event={"ID":"f3792522-fec6-4022-90ac-0b8467fcd625","Type":"ContainerDied","Data":"e23a95f64400e0dbc7fc95b5f04dddc2d0290c35a1bcdf186ddbd3cd8314a14f"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610807 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" event={"ID":"ba095394-1873-4793-969d-3be979fa0771","Type":"ContainerDied","Data":"23c3457474b764927ea9138fc09fe8b0080b3d5dcfc7a8c9d9bd33c7ad79d98a"} Dec 05 12:37:50.610729 master-0 kubenswrapper[8731]: I1205 12:37:50.610836 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerDied","Data":"878914476f342bbe09935d11750836541a3cd256e73418d2dbee280993c5f191"} Dec 05 12:37:50.617726 master-0 kubenswrapper[8731]: I1205 12:37:50.610863 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" 
event={"ID":"cf8247a1-703a-46b3-9a33-25a73b27ab99","Type":"ContainerDied","Data":"48561b92390271bf5bcb9ad8430184be011980a59e84d647901af23e4a1dea25"} Dec 05 12:37:50.617726 master-0 kubenswrapper[8731]: I1205 12:37:50.610890 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" event={"ID":"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8","Type":"ContainerDied","Data":"142cac7db86d510a3cb1fe121b732aea43f370fa9eb0fe98a9655b028e584160"} Dec 05 12:37:50.617726 master-0 kubenswrapper[8731]: I1205 12:37:50.612052 8731 scope.go:117] "RemoveContainer" containerID="142cac7db86d510a3cb1fe121b732aea43f370fa9eb0fe98a9655b028e584160" Dec 05 12:37:50.617726 master-0 kubenswrapper[8731]: E1205 12:37:50.612483 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-7r5wv_openshift-cluster-storage-operator(b9623eb8-55d2-4c5c-aa8d-74b6a27274d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" podUID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" Dec 05 12:37:50.617726 master-0 kubenswrapper[8731]: I1205 12:37:50.614000 8731 scope.go:117] "RemoveContainer" containerID="ac54524887aecec21958b7b4fb65da11e780a16d7a6537965df0e9b00dd407c3" Dec 05 12:37:50.617726 master-0 kubenswrapper[8731]: I1205 12:37:50.614038 8731 scope.go:117] "RemoveContainer" containerID="878914476f342bbe09935d11750836541a3cd256e73418d2dbee280993c5f191" Dec 05 12:37:50.617726 master-0 kubenswrapper[8731]: I1205 12:37:50.615183 8731 scope.go:117] "RemoveContainer" containerID="57b070ec273f7f37c3345984984d73c010928b9bb1f5746c1d4e18feee8dc2dc" Dec 05 12:37:50.617726 master-0 kubenswrapper[8731]: E1205 12:37:50.615536 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-operator pod=kube-apiserver-operator-765d9ff747-rw57t_openshift-kube-apiserver-operator(807d9093-aa67-4840-b5be-7f3abcc1beed)\"" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" podUID="807d9093-aa67-4840-b5be-7f3abcc1beed" Dec 05 12:37:50.617726 master-0 kubenswrapper[8731]: I1205 12:37:50.615607 8731 scope.go:117] "RemoveContainer" containerID="1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae" Dec 05 12:37:50.617726 master-0 kubenswrapper[8731]: I1205 12:37:50.616044 8731 scope.go:117] "RemoveContainer" containerID="49b30bfe4873642e053b957d836ec7eddcac24e11ada8325b8fc8b72bfefafe8" Dec 05 12:37:50.617726 master-0 kubenswrapper[8731]: E1205 12:37:50.616391 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-b9c5dfc78-2n8gt_openshift-kube-storage-version-migrator-operator(7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" podUID="7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9" Dec 05 12:37:50.619411 master-0 kubenswrapper[8731]: I1205 12:37:50.618650 8731 scope.go:117] "RemoveContainer" containerID="7a5d71f74727c7976f8b5ae99bcc9e973922d3b63fcffbddb6ae9dd46ae8aa22" Dec 05 12:37:50.619411 
master-0 kubenswrapper[8731]: E1205 12:37:50.618797 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-848f645654-g6nj5_openshift-kube-controller-manager-operator(594aaded-5615-4bed-87ee-6173059a73be)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" podUID="594aaded-5615-4bed-87ee-6173059a73be" Dec 05 12:37:50.619411 master-0 kubenswrapper[8731]: I1205 12:37:50.619276 8731 scope.go:117] "RemoveContainer" containerID="45f66b524a7ae3f72102f1c1f147264a8f0120c9900c09db2778e03215486e8a" Dec 05 12:37:50.619664 master-0 kubenswrapper[8731]: E1205 12:37:50.619580 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=openshift-apiserver-operator pod=openshift-apiserver-operator-7bf7f6b755-b2pxs_openshift-apiserver-operator(4b7f0d8d-a2bf-4550-b6e6-1c56adae827e)\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" podUID="4b7f0d8d-a2bf-4550-b6e6-1c56adae827e" Dec 05 12:37:50.620385 master-0 kubenswrapper[8731]: I1205 12:37:50.620260 8731 scope.go:117] "RemoveContainer" containerID="551cae19e34d48fca769f493289bed8d81be6209860af5e4e43fa9850482cf12" Dec 05 12:37:50.625959 master-0 kubenswrapper[8731]: E1205 12:37:50.620570 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-olm-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-olm-operator pod=cluster-olm-operator-56fcb6cc5f-q9njf_openshift-cluster-olm-operator(ce3d73c1-f4bd-4c91-936a-086dfa5e3460)\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" podUID="ce3d73c1-f4bd-4c91-936a-086dfa5e3460" Dec 05 12:37:50.625959 master-0 kubenswrapper[8731]: I1205 12:37:50.623764 8731 scope.go:117] "RemoveContainer" containerID="1313a091281e52fd967237a2bce92f6479da56820a8f33315dc25623f650fa7b" Dec 05 12:37:50.625959 master-0 kubenswrapper[8731]: E1205 12:37:50.624082 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=etcd-operator pod=etcd-operator-5bf4d88c6f-dxd24_openshift-etcd-operator(f119ffe4-16bd-49eb-916d-b18ba0d79b54)\"" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" podUID="f119ffe4-16bd-49eb-916d-b18ba0d79b54" Dec 05 12:37:50.625959 master-0 kubenswrapper[8731]: I1205 12:37:50.624801 8731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="oauth-apiserver" containerStatusID={"Type":"cri-o","ID":"bbc65050e19c8e05efbd98764627a92089e068c4fefa760a423cea0a25acab48"} pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" containerMessage="Container oauth-apiserver failed startup probe, will be restarted" Dec 05 12:37:50.625959 master-0 kubenswrapper[8731]: I1205 12:37:50.624867 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" podUID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerName="oauth-apiserver" containerID="cri-o://bbc65050e19c8e05efbd98764627a92089e068c4fefa760a423cea0a25acab48" gracePeriod=120 Dec 05 12:37:50.625959 master-0 kubenswrapper[8731]: I1205 
12:37:50.625252 8731 scope.go:117] "RemoveContainer" containerID="d299754c006efdd8044d5689f796048069701078c77f429aafcfe9eafc6522ec" Dec 05 12:37:50.625959 master-0 kubenswrapper[8731]: E1205 12:37:50.625597 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-6c8676f99d-546vz_openshift-controller-manager-operator(d53a4886-db25-43a1-825a-66a9a9a58590)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" podUID="d53a4886-db25-43a1-825a-66a9a9a58590" Dec 05 12:37:50.630531 master-0 kubenswrapper[8731]: I1205 12:37:50.629620 8731 scope.go:117] "RemoveContainer" containerID="48561b92390271bf5bcb9ad8430184be011980a59e84d647901af23e4a1dea25" Dec 05 12:37:50.640657 master-0 kubenswrapper[8731]: I1205 12:37:50.640067 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 05 12:37:50.640657 master-0 kubenswrapper[8731]: I1205 12:37:50.640587 8731 scope.go:117] "RemoveContainer" containerID="3507046fb798191fc9af19cde11e6d29feed57ad8ee65fd82dade1b688773700" Dec 05 12:37:50.641011 master-0 kubenswrapper[8731]: E1205 12:37:50.640829 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=network-operator pod=network-operator-79767b7ff9-h8qkj_openshift-network-operator(5efad170-c154-42ec-a7c0-b36a98d2bfcc)\"" pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" podUID="5efad170-c154-42ec-a7c0-b36a98d2bfcc" Dec 05 12:37:50.641734 master-0 kubenswrapper[8731]: I1205 12:37:50.641666 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 05 12:37:50.642680 master-0 kubenswrapper[8731]: I1205 12:37:50.642667 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:37:50.643488 master-0 kubenswrapper[8731]: I1205 12:37:50.643449 8731 scope.go:117] "RemoveContainer" containerID="23c3457474b764927ea9138fc09fe8b0080b3d5dcfc7a8c9d9bd33c7ad79d98a" Dec 05 12:37:50.643705 master-0 kubenswrapper[8731]: E1205 12:37:50.643668 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=authentication-operator pod=authentication-operator-6c968fdfdf-xxmfp_openshift-authentication-operator(ba095394-1873-4793-969d-3be979fa0771)\"" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" podUID="ba095394-1873-4793-969d-3be979fa0771" Dec 05 12:37:50.644281 master-0 kubenswrapper[8731]: I1205 12:37:50.644219 8731 scope.go:117] "RemoveContainer" containerID="e23a95f64400e0dbc7fc95b5f04dddc2d0290c35a1bcdf186ddbd3cd8314a14f" Dec 05 12:37:50.644421 master-0 kubenswrapper[8731]: E1205 12:37:50.644387 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=service-ca-operator pod=service-ca-operator-77758bc754-hfqsp_openshift-service-ca-operator(f3792522-fec6-4022-90ac-0b8467fcd625)\"" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" podUID="f3792522-fec6-4022-90ac-0b8467fcd625" Dec 05 12:37:50.659529 master-0 kubenswrapper[8731]: I1205 12:37:50.659415 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Dec 05 12:37:50.677583 master-0 kubenswrapper[8731]: I1205 12:37:50.677468 8731 scope.go:117] "RemoveContainer" containerID="f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0" Dec 05 12:37:50.738227 master-0 kubenswrapper[8731]: I1205 12:37:50.738167 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw"] Dec 05 12:37:50.742047 master-0 kubenswrapper[8731]: I1205 12:37:50.741472 8731 scope.go:117] "RemoveContainer" containerID="880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5" Dec 05 12:37:50.742837 master-0 kubenswrapper[8731]: I1205 12:37:50.742161 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5c4497cd6c-rg6xw"] Dec 05 12:37:50.805573 master-0 kubenswrapper[8731]: I1205 12:37:50.805504 8731 scope.go:117] "RemoveContainer" containerID="dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8" Dec 05 12:37:50.831344 master-0 kubenswrapper[8731]: I1205 12:37:50.831297 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl"] Dec 05 12:37:50.836095 master-0 kubenswrapper[8731]: I1205 12:37:50.835834 8731 scope.go:117] "RemoveContainer" containerID="f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18" Dec 05 12:37:50.836204 master-0 kubenswrapper[8731]: I1205 12:37:50.836102 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-858598fd98-5xkcl"] Dec 05 12:37:50.853773 master-0 kubenswrapper[8731]: I1205 12:37:50.853726 8731 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-765d9ff747-rw57t_807d9093-aa67-4840-b5be-7f3abcc1beed/kube-apiserver-operator/2.log" Dec 05 12:37:50.866209 master-0 kubenswrapper[8731]: I1205 12:37:50.866151 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" event={"ID":"cf8247a1-703a-46b3-9a33-25a73b27ab99","Type":"ContainerStarted","Data":"65805ce826a6880e17ce2c571cd39f060976d0a8a6ae89fcede7232cc66bff52"} Dec 05 12:37:50.873882 master-0 kubenswrapper[8731]: I1205 12:37:50.873834 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-675db9579f-4dcg8"] Dec 05 12:37:50.873882 master-0 kubenswrapper[8731]: I1205 12:37:50.875224 8731 scope.go:117] "RemoveContainer" containerID="0ca651057443d22827f48087f13a7a3218451ee691e2f2aee7a07437d8b2d6ee" Dec 05 12:37:50.884668 master-0 kubenswrapper[8731]: W1205 12:37:50.884633 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e562fda_e695_4218_a9cf_4179b8d456db.slice/crio-46c71a14a0f9590da88fc8567ffce1570ccabc57f819c41e45925415e66120f4 WatchSource:0}: Error finding container 46c71a14a0f9590da88fc8567ffce1570ccabc57f819c41e45925415e66120f4: Status 404 returned error can't find the container with id 46c71a14a0f9590da88fc8567ffce1570ccabc57f819c41e45925415e66120f4 Dec 05 12:37:50.901806 master-0 kubenswrapper[8731]: E1205 12:37:50.901762 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:37:50.904266 master-0 kubenswrapper[8731]: I1205 12:37:50.904177 8731 scope.go:117] "RemoveContainer" containerID="e559a82c0b834d05036d8b7d7e391db63a90fb95bbf21aef7a0e62a675b47072" Dec 05 12:37:50.925385 master-0 kubenswrapper[8731]: I1205 12:37:50.925341 8731 scope.go:117] "RemoveContainer" containerID="880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5" Dec 05 12:37:50.925997 master-0 kubenswrapper[8731]: E1205 12:37:50.925959 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5\": container with ID starting with 880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5 not found: ID does not exist" containerID="880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5" Dec 05 12:37:50.926067 master-0 kubenswrapper[8731]: I1205 12:37:50.926019 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5"} err="failed to get container status \"880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5\": rpc error: code = NotFound desc = could not find container \"880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5\": container with ID starting with 880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5 not found: ID does not exist" Dec 05 12:37:50.926067 master-0 kubenswrapper[8731]: I1205 12:37:50.926051 8731 scope.go:117] "RemoveContainer" 
containerID="dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8" Dec 05 12:37:50.926549 master-0 kubenswrapper[8731]: E1205 12:37:50.926492 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8\": container with ID starting with dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8 not found: ID does not exist" containerID="dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8" Dec 05 12:37:50.926621 master-0 kubenswrapper[8731]: I1205 12:37:50.926563 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8"} err="failed to get container status \"dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8\": rpc error: code = NotFound desc = could not find container \"dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8\": container with ID starting with dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8 not found: ID does not exist" Dec 05 12:37:50.926677 master-0 kubenswrapper[8731]: I1205 12:37:50.926621 8731 scope.go:117] "RemoveContainer" containerID="f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18" Dec 05 12:37:50.927251 master-0 kubenswrapper[8731]: E1205 12:37:50.927223 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18\": container with ID starting with f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18 not found: ID does not exist" containerID="f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18" Dec 05 12:37:50.927321 master-0 kubenswrapper[8731]: I1205 12:37:50.927257 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18"} err="failed to get container status \"f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18\": rpc error: code = NotFound desc = could not find container \"f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18\": container with ID starting with f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18 not found: ID does not exist" Dec 05 12:37:50.927321 master-0 kubenswrapper[8731]: I1205 12:37:50.927279 8731 scope.go:117] "RemoveContainer" containerID="98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad" Dec 05 12:37:50.961121 master-0 kubenswrapper[8731]: I1205 12:37:50.960499 8731 scope.go:117] "RemoveContainer" containerID="5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57" Dec 05 12:37:50.976634 master-0 kubenswrapper[8731]: I1205 12:37:50.976592 8731 scope.go:117] "RemoveContainer" containerID="e661aee8169481bd45ddc453eab7e9b725569fcef2029fd7e4e16d66fbcedf39" Dec 05 12:37:50.998104 master-0 kubenswrapper[8731]: I1205 12:37:50.997919 8731 scope.go:117] "RemoveContainer" containerID="eb12d89ac382a5bb5bdc3b8dbfd70aaf80443c6890bbd6d374803fc81c9ff457" Dec 05 12:37:51.019556 master-0 kubenswrapper[8731]: I1205 12:37:51.019494 8731 scope.go:117] "RemoveContainer" containerID="1b060274049216c69ff594edc9d2a695d110f82d29ed8c698a35ad2511d80237" Dec 05 12:37:51.054526 master-0 kubenswrapper[8731]: I1205 12:37:51.054453 8731 scope.go:117] "RemoveContainer" 
containerID="2292fdf5304b8a28a27399e75d5a9964b3c2748ef25e388360a2e0b43dad6994" Dec 05 12:37:51.092917 master-0 kubenswrapper[8731]: I1205 12:37:51.092859 8731 scope.go:117] "RemoveContainer" containerID="2353edada22e30fc07f29e9d6b8499dfe371b7dc7d2795c8973bfc870b0b89fa" Dec 05 12:37:51.130445 master-0 kubenswrapper[8731]: I1205 12:37:51.130395 8731 scope.go:117] "RemoveContainer" containerID="a782e082d4327637a0dc3ae6b2947858a31ed6dd6d18a60f26c1d1533bc0ed77" Dec 05 12:37:51.161382 master-0 kubenswrapper[8731]: I1205 12:37:51.161284 8731 scope.go:117] "RemoveContainer" containerID="4f8a59bfccc80caaa9ccb9172563888264ac2bfba8642d650c783edb02a956b7" Dec 05 12:37:51.196446 master-0 kubenswrapper[8731]: I1205 12:37:51.195711 8731 scope.go:117] "RemoveContainer" containerID="1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae" Dec 05 12:37:51.199215 master-0 kubenswrapper[8731]: E1205 12:37:51.199162 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae\": container with ID starting with 1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae not found: ID does not exist" containerID="1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae" Dec 05 12:37:51.199306 master-0 kubenswrapper[8731]: I1205 12:37:51.199226 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae"} err="failed to get container status \"1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae\": rpc error: code = NotFound desc = could not find container \"1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae\": container with ID starting with 1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae not found: ID does not exist" Dec 05 12:37:51.199306 master-0 kubenswrapper[8731]: I1205 12:37:51.199251 8731 scope.go:117] "RemoveContainer" containerID="f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0" Dec 05 12:37:51.199906 master-0 kubenswrapper[8731]: E1205 12:37:51.199874 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0\": container with ID starting with f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0 not found: ID does not exist" containerID="f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0" Dec 05 12:37:51.199906 master-0 kubenswrapper[8731]: I1205 12:37:51.199899 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0"} err="failed to get container status \"f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0\": rpc error: code = NotFound desc = could not find container \"f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0\": container with ID starting with f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0 not found: ID does not exist" Dec 05 12:37:51.200009 master-0 kubenswrapper[8731]: I1205 12:37:51.199911 8731 scope.go:117] "RemoveContainer" containerID="e10ca0df40eb2db4bf995662f42f3b8f09e91dda57cb740df7f1668ef9a54497" Dec 05 12:37:51.236692 master-0 kubenswrapper[8731]: I1205 12:37:51.236641 8731 scope.go:117] "RemoveContainer" 
containerID="eae74267bbff7388ad43e0bcb0a8a1a5c6694e5d3fab6387145bf64deb29417d" Dec 05 12:37:51.260854 master-0 kubenswrapper[8731]: I1205 12:37:51.260795 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Dec 05 12:37:51.261306 master-0 kubenswrapper[8731]: I1205 12:37:51.261248 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Dec 05 12:37:51.264231 master-0 kubenswrapper[8731]: I1205 12:37:51.264130 8731 scope.go:117] "RemoveContainer" containerID="98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad" Dec 05 12:37:51.266697 master-0 kubenswrapper[8731]: E1205 12:37:51.266651 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad\": container with ID starting with 98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad not found: ID does not exist" containerID="98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad" Dec 05 12:37:51.266795 master-0 kubenswrapper[8731]: I1205 12:37:51.266714 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad"} err="failed to get container status \"98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad\": rpc error: code = NotFound desc = could not find container \"98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad\": container with ID starting with 98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad not found: ID does not exist" Dec 05 12:37:51.266795 master-0 kubenswrapper[8731]: I1205 12:37:51.266751 8731 scope.go:117] "RemoveContainer" containerID="5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57" Dec 05 12:37:51.267400 master-0 kubenswrapper[8731]: E1205 12:37:51.267339 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57\": container with ID starting with 5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57 not found: ID does not exist" containerID="5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57" Dec 05 12:37:51.267468 master-0 kubenswrapper[8731]: I1205 12:37:51.267414 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57"} err="failed to get container status \"5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57\": rpc error: code = NotFound desc = could not find container \"5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57\": container with ID starting with 5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57 not found: ID does not exist" Dec 05 12:37:51.267468 master-0 kubenswrapper[8731]: I1205 12:37:51.267453 8731 scope.go:117] "RemoveContainer" containerID="880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5" Dec 05 12:37:51.267864 master-0 kubenswrapper[8731]: I1205 12:37:51.267825 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5"} err="failed to get container status \"880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5\": rpc error: code = 
NotFound desc = could not find container \"880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5\": container with ID starting with 880f9379fb38b44819566d3ac34f7d19bcaf915975c17f816a75b5b6efd611c5 not found: ID does not exist" Dec 05 12:37:51.267864 master-0 kubenswrapper[8731]: I1205 12:37:51.267851 8731 scope.go:117] "RemoveContainer" containerID="dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8" Dec 05 12:37:51.268271 master-0 kubenswrapper[8731]: I1205 12:37:51.268196 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8"} err="failed to get container status \"dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8\": rpc error: code = NotFound desc = could not find container \"dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8\": container with ID starting with dd2ebdee1673bad7e38b4e5bee4b512ee1bdd788827b711ca8c177c8bf300cd8 not found: ID does not exist" Dec 05 12:37:51.268339 master-0 kubenswrapper[8731]: I1205 12:37:51.268277 8731 scope.go:117] "RemoveContainer" containerID="f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18" Dec 05 12:37:51.268637 master-0 kubenswrapper[8731]: I1205 12:37:51.268596 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18"} err="failed to get container status \"f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18\": rpc error: code = NotFound desc = could not find container \"f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18\": container with ID starting with f3e35001bbc7b9aacb284d7725a5ca2a58d8402805d0143e9a51bfd49c9afe18 not found: ID does not exist" Dec 05 12:37:51.268637 master-0 kubenswrapper[8731]: I1205 12:37:51.268632 8731 scope.go:117] "RemoveContainer" containerID="e10ca0df40eb2db4bf995662f42f3b8f09e91dda57cb740df7f1668ef9a54497" Dec 05 12:37:51.268910 master-0 kubenswrapper[8731]: E1205 12:37:51.268873 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e10ca0df40eb2db4bf995662f42f3b8f09e91dda57cb740df7f1668ef9a54497\": container with ID starting with e10ca0df40eb2db4bf995662f42f3b8f09e91dda57cb740df7f1668ef9a54497 not found: ID does not exist" containerID="e10ca0df40eb2db4bf995662f42f3b8f09e91dda57cb740df7f1668ef9a54497" Dec 05 12:37:51.268958 master-0 kubenswrapper[8731]: I1205 12:37:51.268903 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e10ca0df40eb2db4bf995662f42f3b8f09e91dda57cb740df7f1668ef9a54497"} err="failed to get container status \"e10ca0df40eb2db4bf995662f42f3b8f09e91dda57cb740df7f1668ef9a54497\": rpc error: code = NotFound desc = could not find container \"e10ca0df40eb2db4bf995662f42f3b8f09e91dda57cb740df7f1668ef9a54497\": container with ID starting with e10ca0df40eb2db4bf995662f42f3b8f09e91dda57cb740df7f1668ef9a54497 not found: ID does not exist" Dec 05 12:37:51.268958 master-0 kubenswrapper[8731]: I1205 12:37:51.268922 8731 scope.go:117] "RemoveContainer" containerID="eae74267bbff7388ad43e0bcb0a8a1a5c6694e5d3fab6387145bf64deb29417d" Dec 05 12:37:51.269246 master-0 kubenswrapper[8731]: E1205 12:37:51.269214 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"eae74267bbff7388ad43e0bcb0a8a1a5c6694e5d3fab6387145bf64deb29417d\": container with ID starting with eae74267bbff7388ad43e0bcb0a8a1a5c6694e5d3fab6387145bf64deb29417d not found: ID does not exist" containerID="eae74267bbff7388ad43e0bcb0a8a1a5c6694e5d3fab6387145bf64deb29417d" Dec 05 12:37:51.269303 master-0 kubenswrapper[8731]: I1205 12:37:51.269247 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eae74267bbff7388ad43e0bcb0a8a1a5c6694e5d3fab6387145bf64deb29417d"} err="failed to get container status \"eae74267bbff7388ad43e0bcb0a8a1a5c6694e5d3fab6387145bf64deb29417d\": rpc error: code = NotFound desc = could not find container \"eae74267bbff7388ad43e0bcb0a8a1a5c6694e5d3fab6387145bf64deb29417d\": container with ID starting with eae74267bbff7388ad43e0bcb0a8a1a5c6694e5d3fab6387145bf64deb29417d not found: ID does not exist" Dec 05 12:37:51.269303 master-0 kubenswrapper[8731]: I1205 12:37:51.269267 8731 scope.go:117] "RemoveContainer" containerID="1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae" Dec 05 12:37:51.269593 master-0 kubenswrapper[8731]: I1205 12:37:51.269540 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae"} err="failed to get container status \"1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae\": rpc error: code = NotFound desc = could not find container \"1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae\": container with ID starting with 1c3eadd6edb97d3b0cc400829d9017e175360214701ca83b476fb9ff2c80b5ae not found: ID does not exist" Dec 05 12:37:51.269593 master-0 kubenswrapper[8731]: I1205 12:37:51.269582 8731 scope.go:117] "RemoveContainer" containerID="f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0" Dec 05 12:37:51.269919 master-0 kubenswrapper[8731]: I1205 12:37:51.269874 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0"} err="failed to get container status \"f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0\": rpc error: code = NotFound desc = could not find container \"f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0\": container with ID starting with f38aa8540a6743f409b0fa2aec5a624b9c7ad352e3847bb54aaf4d1b704f18e0 not found: ID does not exist" Dec 05 12:37:51.269919 master-0 kubenswrapper[8731]: I1205 12:37:51.269909 8731 scope.go:117] "RemoveContainer" containerID="1b060274049216c69ff594edc9d2a695d110f82d29ed8c698a35ad2511d80237" Dec 05 12:37:51.270210 master-0 kubenswrapper[8731]: E1205 12:37:51.270156 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b060274049216c69ff594edc9d2a695d110f82d29ed8c698a35ad2511d80237\": container with ID starting with 1b060274049216c69ff594edc9d2a695d110f82d29ed8c698a35ad2511d80237 not found: ID does not exist" containerID="1b060274049216c69ff594edc9d2a695d110f82d29ed8c698a35ad2511d80237" Dec 05 12:37:51.270210 master-0 kubenswrapper[8731]: I1205 12:37:51.270184 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b060274049216c69ff594edc9d2a695d110f82d29ed8c698a35ad2511d80237"} err="failed to get container status \"1b060274049216c69ff594edc9d2a695d110f82d29ed8c698a35ad2511d80237\": rpc error: code = NotFound desc = could not find container 
\"1b060274049216c69ff594edc9d2a695d110f82d29ed8c698a35ad2511d80237\": container with ID starting with 1b060274049216c69ff594edc9d2a695d110f82d29ed8c698a35ad2511d80237 not found: ID does not exist" Dec 05 12:37:51.270304 master-0 kubenswrapper[8731]: I1205 12:37:51.270217 8731 scope.go:117] "RemoveContainer" containerID="e661aee8169481bd45ddc453eab7e9b725569fcef2029fd7e4e16d66fbcedf39" Dec 05 12:37:51.270511 master-0 kubenswrapper[8731]: E1205 12:37:51.270481 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e661aee8169481bd45ddc453eab7e9b725569fcef2029fd7e4e16d66fbcedf39\": container with ID starting with e661aee8169481bd45ddc453eab7e9b725569fcef2029fd7e4e16d66fbcedf39 not found: ID does not exist" containerID="e661aee8169481bd45ddc453eab7e9b725569fcef2029fd7e4e16d66fbcedf39" Dec 05 12:37:51.270586 master-0 kubenswrapper[8731]: I1205 12:37:51.270513 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e661aee8169481bd45ddc453eab7e9b725569fcef2029fd7e4e16d66fbcedf39"} err="failed to get container status \"e661aee8169481bd45ddc453eab7e9b725569fcef2029fd7e4e16d66fbcedf39\": rpc error: code = NotFound desc = could not find container \"e661aee8169481bd45ddc453eab7e9b725569fcef2029fd7e4e16d66fbcedf39\": container with ID starting with e661aee8169481bd45ddc453eab7e9b725569fcef2029fd7e4e16d66fbcedf39 not found: ID does not exist" Dec 05 12:37:51.270586 master-0 kubenswrapper[8731]: I1205 12:37:51.270539 8731 scope.go:117] "RemoveContainer" containerID="0ca651057443d22827f48087f13a7a3218451ee691e2f2aee7a07437d8b2d6ee" Dec 05 12:37:51.270783 master-0 kubenswrapper[8731]: E1205 12:37:51.270755 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ca651057443d22827f48087f13a7a3218451ee691e2f2aee7a07437d8b2d6ee\": container with ID starting with 0ca651057443d22827f48087f13a7a3218451ee691e2f2aee7a07437d8b2d6ee not found: ID does not exist" containerID="0ca651057443d22827f48087f13a7a3218451ee691e2f2aee7a07437d8b2d6ee" Dec 05 12:37:51.270832 master-0 kubenswrapper[8731]: I1205 12:37:51.270781 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca651057443d22827f48087f13a7a3218451ee691e2f2aee7a07437d8b2d6ee"} err="failed to get container status \"0ca651057443d22827f48087f13a7a3218451ee691e2f2aee7a07437d8b2d6ee\": rpc error: code = NotFound desc = could not find container \"0ca651057443d22827f48087f13a7a3218451ee691e2f2aee7a07437d8b2d6ee\": container with ID starting with 0ca651057443d22827f48087f13a7a3218451ee691e2f2aee7a07437d8b2d6ee not found: ID does not exist" Dec 05 12:37:51.270832 master-0 kubenswrapper[8731]: I1205 12:37:51.270799 8731 scope.go:117] "RemoveContainer" containerID="e559a82c0b834d05036d8b7d7e391db63a90fb95bbf21aef7a0e62a675b47072" Dec 05 12:37:51.271041 master-0 kubenswrapper[8731]: E1205 12:37:51.271015 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e559a82c0b834d05036d8b7d7e391db63a90fb95bbf21aef7a0e62a675b47072\": container with ID starting with e559a82c0b834d05036d8b7d7e391db63a90fb95bbf21aef7a0e62a675b47072 not found: ID does not exist" containerID="e559a82c0b834d05036d8b7d7e391db63a90fb95bbf21aef7a0e62a675b47072" Dec 05 12:37:51.271089 master-0 kubenswrapper[8731]: I1205 12:37:51.271039 8731 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e559a82c0b834d05036d8b7d7e391db63a90fb95bbf21aef7a0e62a675b47072"} err="failed to get container status \"e559a82c0b834d05036d8b7d7e391db63a90fb95bbf21aef7a0e62a675b47072\": rpc error: code = NotFound desc = could not find container \"e559a82c0b834d05036d8b7d7e391db63a90fb95bbf21aef7a0e62a675b47072\": container with ID starting with e559a82c0b834d05036d8b7d7e391db63a90fb95bbf21aef7a0e62a675b47072 not found: ID does not exist" Dec 05 12:37:51.271089 master-0 kubenswrapper[8731]: I1205 12:37:51.271056 8731 scope.go:117] "RemoveContainer" containerID="2353edada22e30fc07f29e9d6b8499dfe371b7dc7d2795c8973bfc870b0b89fa" Dec 05 12:37:51.271318 master-0 kubenswrapper[8731]: E1205 12:37:51.271292 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2353edada22e30fc07f29e9d6b8499dfe371b7dc7d2795c8973bfc870b0b89fa\": container with ID starting with 2353edada22e30fc07f29e9d6b8499dfe371b7dc7d2795c8973bfc870b0b89fa not found: ID does not exist" containerID="2353edada22e30fc07f29e9d6b8499dfe371b7dc7d2795c8973bfc870b0b89fa" Dec 05 12:37:51.271377 master-0 kubenswrapper[8731]: I1205 12:37:51.271338 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2353edada22e30fc07f29e9d6b8499dfe371b7dc7d2795c8973bfc870b0b89fa"} err="failed to get container status \"2353edada22e30fc07f29e9d6b8499dfe371b7dc7d2795c8973bfc870b0b89fa\": rpc error: code = NotFound desc = could not find container \"2353edada22e30fc07f29e9d6b8499dfe371b7dc7d2795c8973bfc870b0b89fa\": container with ID starting with 2353edada22e30fc07f29e9d6b8499dfe371b7dc7d2795c8973bfc870b0b89fa not found: ID does not exist" Dec 05 12:37:51.271377 master-0 kubenswrapper[8731]: I1205 12:37:51.271358 8731 scope.go:117] "RemoveContainer" containerID="a782e082d4327637a0dc3ae6b2947858a31ed6dd6d18a60f26c1d1533bc0ed77" Dec 05 12:37:51.271668 master-0 kubenswrapper[8731]: E1205 12:37:51.271630 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a782e082d4327637a0dc3ae6b2947858a31ed6dd6d18a60f26c1d1533bc0ed77\": container with ID starting with a782e082d4327637a0dc3ae6b2947858a31ed6dd6d18a60f26c1d1533bc0ed77 not found: ID does not exist" containerID="a782e082d4327637a0dc3ae6b2947858a31ed6dd6d18a60f26c1d1533bc0ed77" Dec 05 12:37:51.271668 master-0 kubenswrapper[8731]: I1205 12:37:51.271658 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a782e082d4327637a0dc3ae6b2947858a31ed6dd6d18a60f26c1d1533bc0ed77"} err="failed to get container status \"a782e082d4327637a0dc3ae6b2947858a31ed6dd6d18a60f26c1d1533bc0ed77\": rpc error: code = NotFound desc = could not find container \"a782e082d4327637a0dc3ae6b2947858a31ed6dd6d18a60f26c1d1533bc0ed77\": container with ID starting with a782e082d4327637a0dc3ae6b2947858a31ed6dd6d18a60f26c1d1533bc0ed77 not found: ID does not exist" Dec 05 12:37:51.271757 master-0 kubenswrapper[8731]: I1205 12:37:51.271674 8731 scope.go:117] "RemoveContainer" containerID="401643c70c405d6156a16a3ab17611e0b06471ba9931da499a2092a2a6caa1f3" Dec 05 12:37:51.284994 master-0 kubenswrapper[8731]: I1205 12:37:51.284951 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Dec 05 12:37:51.293228 master-0 kubenswrapper[8731]: I1205 12:37:51.292840 8731 scope.go:117] "RemoveContainer" 
containerID="eb12d89ac382a5bb5bdc3b8dbfd70aaf80443c6890bbd6d374803fc81c9ff457" Dec 05 12:37:51.293515 master-0 kubenswrapper[8731]: E1205 12:37:51.293442 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb12d89ac382a5bb5bdc3b8dbfd70aaf80443c6890bbd6d374803fc81c9ff457\": container with ID starting with eb12d89ac382a5bb5bdc3b8dbfd70aaf80443c6890bbd6d374803fc81c9ff457 not found: ID does not exist" containerID="eb12d89ac382a5bb5bdc3b8dbfd70aaf80443c6890bbd6d374803fc81c9ff457" Dec 05 12:37:51.293623 master-0 kubenswrapper[8731]: I1205 12:37:51.293509 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb12d89ac382a5bb5bdc3b8dbfd70aaf80443c6890bbd6d374803fc81c9ff457"} err="failed to get container status \"eb12d89ac382a5bb5bdc3b8dbfd70aaf80443c6890bbd6d374803fc81c9ff457\": rpc error: code = NotFound desc = could not find container \"eb12d89ac382a5bb5bdc3b8dbfd70aaf80443c6890bbd6d374803fc81c9ff457\": container with ID starting with eb12d89ac382a5bb5bdc3b8dbfd70aaf80443c6890bbd6d374803fc81c9ff457 not found: ID does not exist" Dec 05 12:37:51.293623 master-0 kubenswrapper[8731]: I1205 12:37:51.293555 8731 scope.go:117] "RemoveContainer" containerID="2292fdf5304b8a28a27399e75d5a9964b3c2748ef25e388360a2e0b43dad6994" Dec 05 12:37:51.293943 master-0 kubenswrapper[8731]: E1205 12:37:51.293897 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2292fdf5304b8a28a27399e75d5a9964b3c2748ef25e388360a2e0b43dad6994\": container with ID starting with 2292fdf5304b8a28a27399e75d5a9964b3c2748ef25e388360a2e0b43dad6994 not found: ID does not exist" containerID="2292fdf5304b8a28a27399e75d5a9964b3c2748ef25e388360a2e0b43dad6994" Dec 05 12:37:51.293943 master-0 kubenswrapper[8731]: I1205 12:37:51.293931 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2292fdf5304b8a28a27399e75d5a9964b3c2748ef25e388360a2e0b43dad6994"} err="failed to get container status \"2292fdf5304b8a28a27399e75d5a9964b3c2748ef25e388360a2e0b43dad6994\": rpc error: code = NotFound desc = could not find container \"2292fdf5304b8a28a27399e75d5a9964b3c2748ef25e388360a2e0b43dad6994\": container with ID starting with 2292fdf5304b8a28a27399e75d5a9964b3c2748ef25e388360a2e0b43dad6994 not found: ID does not exist" Dec 05 12:37:51.294075 master-0 kubenswrapper[8731]: I1205 12:37:51.293948 8731 scope.go:117] "RemoveContainer" containerID="98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad" Dec 05 12:37:51.294356 master-0 kubenswrapper[8731]: I1205 12:37:51.294182 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad"} err="failed to get container status \"98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad\": rpc error: code = NotFound desc = could not find container \"98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad\": container with ID starting with 98354129621c0afcf019ad1d009c07f40541a4f653f993603bc2165f390f6cad not found: ID does not exist" Dec 05 12:37:51.294356 master-0 kubenswrapper[8731]: I1205 12:37:51.294220 8731 scope.go:117] "RemoveContainer" containerID="5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57" Dec 05 12:37:51.294529 master-0 kubenswrapper[8731]: I1205 12:37:51.294433 8731 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57"} err="failed to get container status \"5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57\": rpc error: code = NotFound desc = could not find container \"5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57\": container with ID starting with 5ce1c8c66afab6c062939524a52e4f0b259f2d0f4ce987835a61aefda3e81e57 not found: ID does not exist" Dec 05 12:37:51.674529 master-0 kubenswrapper[8731]: I1205 12:37:51.674285 8731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:37:51.675029 master-0 kubenswrapper[8731]: I1205 12:37:51.674984 8731 scope.go:117] "RemoveContainer" containerID="1313a091281e52fd967237a2bce92f6479da56820a8f33315dc25623f650fa7b" Dec 05 12:37:51.675594 master-0 kubenswrapper[8731]: E1205 12:37:51.675483 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=etcd-operator pod=etcd-operator-5bf4d88c6f-dxd24_openshift-etcd-operator(f119ffe4-16bd-49eb-916d-b18ba0d79b54)\"" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" podUID="f119ffe4-16bd-49eb-916d-b18ba0d79b54" Dec 05 12:37:51.875590 master-0 kubenswrapper[8731]: I1205 12:37:51.875496 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-56fcb6cc5f-q9njf_ce3d73c1-f4bd-4c91-936a-086dfa5e3460/cluster-olm-operator/1.log" Dec 05 12:37:51.880601 master-0 kubenswrapper[8731]: I1205 12:37:51.880517 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-546vz_d53a4886-db25-43a1-825a-66a9a9a58590/openshift-controller-manager-operator/2.log" Dec 05 12:37:51.883744 master-0 kubenswrapper[8731]: I1205 12:37:51.883611 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" event={"ID":"7e562fda-e695-4218-a9cf-4179b8d456db","Type":"ContainerStarted","Data":"226d693739ba7f1f0405d228dca51a2e2771f758fde843b579c82652f63d7ed6"} Dec 05 12:37:51.883744 master-0 kubenswrapper[8731]: I1205 12:37:51.883664 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" event={"ID":"7e562fda-e695-4218-a9cf-4179b8d456db","Type":"ContainerStarted","Data":"46c71a14a0f9590da88fc8567ffce1570ccabc57f819c41e45925415e66120f4"} Dec 05 12:37:51.884253 master-0 kubenswrapper[8731]: I1205 12:37:51.884220 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:37:51.886686 master-0 kubenswrapper[8731]: I1205 12:37:51.886522 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f85974995-4vsjv_1871a9d6-6369-4d08-816f-9c6310b61ddf/kube-scheduler-operator-container/1.log" Dec 05 12:37:51.889210 master-0 kubenswrapper[8731]: I1205 12:37:51.889128 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-7bf7f6b755-b2pxs_4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/openshift-apiserver-operator/1.log" Dec 05 12:37:51.891045 master-0 kubenswrapper[8731]: I1205 12:37:51.890997 8731 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/2.log" Dec 05 12:37:51.892398 master-0 kubenswrapper[8731]: I1205 12:37:51.892358 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:37:51.893685 master-0 kubenswrapper[8731]: I1205 12:37:51.893623 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"8fbf247ef3f15fe005ee46e673fbe0b71698dcc9f2759966a03a8cd2730f623b"} Dec 05 12:37:51.894222 master-0 kubenswrapper[8731]: I1205 12:37:51.894161 8731 scope.go:117] "RemoveContainer" containerID="ac54524887aecec21958b7b4fb65da11e780a16d7a6537965df0e9b00dd407c3" Dec 05 12:37:51.896476 master-0 kubenswrapper[8731]: I1205 12:37:51.895709 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-848f645654-g6nj5_594aaded-5615-4bed-87ee-6173059a73be/kube-controller-manager-operator/2.log" Dec 05 12:37:51.897272 master-0 kubenswrapper[8731]: I1205 12:37:51.897233 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-77758bc754-hfqsp_f3792522-fec6-4022-90ac-0b8467fcd625/service-ca-operator/1.log" Dec 05 12:37:51.898738 master-0 kubenswrapper[8731]: I1205 12:37:51.898701 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-xxmfp_ba095394-1873-4793-969d-3be979fa0771/authentication-operator/2.log" Dec 05 12:37:51.899951 master-0 kubenswrapper[8731]: I1205 12:37:51.899916 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-79767b7ff9-h8qkj_5efad170-c154-42ec-a7c0-b36a98d2bfcc/network-operator/2.log" Dec 05 12:37:51.901812 master-0 kubenswrapper[8731]: I1205 12:37:51.901785 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5bf4d88c6f-dxd24_f119ffe4-16bd-49eb-916d-b18ba0d79b54/etcd-operator/2.log" Dec 05 12:37:51.903367 master-0 kubenswrapper[8731]: I1205 12:37:51.903336 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-b9c5dfc78-2n8gt_7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/kube-storage-version-migrator-operator/2.log" Dec 05 12:37:51.946613 master-0 kubenswrapper[8731]: I1205 12:37:51.946517 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8" path="/var/lib/kubelet/pods/1ede7946-e35c-4f7d-bb9f-9e6cc518eaa8/volumes" Dec 05 12:37:51.947697 master-0 kubenswrapper[8731]: I1205 12:37:51.947635 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb7dd3e9-5a59-4741-970e-aa41c4e078cc" path="/var/lib/kubelet/pods/bb7dd3e9-5a59-4741-970e-aa41c4e078cc/volumes" Dec 05 12:37:51.948936 master-0 kubenswrapper[8731]: I1205 12:37:51.948875 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa1512be-895a-47e0-abf5-0155c71500e3" path="/var/lib/kubelet/pods/fa1512be-895a-47e0-abf5-0155c71500e3/volumes" Dec 05 12:37:52.746270 master-0 kubenswrapper[8731]: I1205 12:37:52.745755 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-etcd/etcd-master-0" Dec 05 12:37:52.914010 master-0 kubenswrapper[8731]: I1205 12:37:52.913925 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"cc5f5346417c97786dab5fab35e6b2b1d263681d0252f5454c34842f718cd60f"} Dec 05 12:37:52.976280 master-0 kubenswrapper[8731]: I1205 12:37:52.976200 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=2.976162015 podStartE2EDuration="2.976162015s" podCreationTimestamp="2025-12-05 12:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:37:52.973327526 +0000 UTC m=+371.277311693" watchObservedRunningTime="2025-12-05 12:37:52.976162015 +0000 UTC m=+371.280146182" Dec 05 12:37:53.125282 master-0 kubenswrapper[8731]: I1205 12:37:53.125074 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" podStartSLOduration=307.125045706 podStartE2EDuration="5m7.125045706s" podCreationTimestamp="2025-12-05 12:32:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:37:53.122637928 +0000 UTC m=+371.426622105" watchObservedRunningTime="2025-12-05 12:37:53.125045706 +0000 UTC m=+371.429029883" Dec 05 12:37:54.258953 master-0 kubenswrapper[8731]: E1205 12:37:54.258790 8731 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{apiserver-5bdfbf6949-2bhqv.187e51be004bb3ca openshift-oauth-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-oauth-apiserver,Name:apiserver-5bdfbf6949-2bhqv,UID:d72b2b71-27b2-4aff-bf69-7054a9556318,APIVersion:v1,ResourceVersion:6623,FieldPath:spec.containers{oauth-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:33:00.466410442 +0000 UTC m=+78.770394609,LastTimestamp:2025-12-05 12:33:00.466410442 +0000 UTC m=+78.770394609,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:37:55.243791 master-0 kubenswrapper[8731]: I1205 12:37:55.243673 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:37:56.486735 master-0 kubenswrapper[8731]: I1205 12:37:56.485874 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:37:57.963269 master-0 kubenswrapper[8731]: I1205 12:37:57.963156 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:37:57.973110 master-0 kubenswrapper[8731]: I1205 12:37:57.972968 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:37:59.763871 master-0 kubenswrapper[8731]: I1205 12:37:59.763798 8731 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq"] Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: E1205 12:37:59.764016 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076dafdf-a5d2-4e2d-9c38-6932910f7327" containerName="installer" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: I1205 12:37:59.764034 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="076dafdf-a5d2-4e2d-9c38-6932910f7327" containerName="installer" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: E1205 12:37:59.764055 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="565d5ef6-b0e7-4f04-9460-61f1d3903d37" containerName="installer" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: I1205 12:37:59.764063 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="565d5ef6-b0e7-4f04-9460-61f1d3903d37" containerName="installer" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: E1205 12:37:59.764077 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1512be-895a-47e0-abf5-0155c71500e3" containerName="installer" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: I1205 12:37:59.764089 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1512be-895a-47e0-abf5-0155c71500e3" containerName="installer" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: E1205 12:37:59.764104 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96fa3513-5467-4b0f-a03d-9279d36317bd" containerName="installer" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: I1205 12:37:59.764113 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="96fa3513-5467-4b0f-a03d-9279d36317bd" containerName="installer" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: E1205 12:37:59.764128 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d627fcf3-2a80-4739-add9-e21ad4efc6eb" containerName="installer" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: I1205 12:37:59.764141 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="d627fcf3-2a80-4739-add9-e21ad4efc6eb" containerName="installer" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: E1205 12:37:59.764160 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb7dd3e9-5a59-4741-970e-aa41c4e078cc" containerName="route-controller-manager" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: I1205 12:37:59.764171 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7dd3e9-5a59-4741-970e-aa41c4e078cc" containerName="route-controller-manager" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: I1205 12:37:59.764293 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="076dafdf-a5d2-4e2d-9c38-6932910f7327" containerName="installer" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: I1205 12:37:59.764306 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="565d5ef6-b0e7-4f04-9460-61f1d3903d37" containerName="installer" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: I1205 12:37:59.764316 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="96fa3513-5467-4b0f-a03d-9279d36317bd" containerName="installer" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: I1205 12:37:59.764328 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb7dd3e9-5a59-4741-970e-aa41c4e078cc" containerName="route-controller-manager" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: I1205 12:37:59.764349 8731 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d627fcf3-2a80-4739-add9-e21ad4efc6eb" containerName="installer" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: I1205 12:37:59.764358 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1512be-895a-47e0-abf5-0155c71500e3" containerName="installer" Dec 05 12:37:59.765076 master-0 kubenswrapper[8731]: I1205 12:37:59.764749 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:37:59.770339 master-0 kubenswrapper[8731]: I1205 12:37:59.769405 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 12:37:59.770339 master-0 kubenswrapper[8731]: I1205 12:37:59.770169 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 12:37:59.770626 master-0 kubenswrapper[8731]: I1205 12:37:59.770426 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 12:37:59.770626 master-0 kubenswrapper[8731]: I1205 12:37:59.770463 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 12:37:59.773150 master-0 kubenswrapper[8731]: I1205 12:37:59.773102 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 12:37:59.828975 master-0 kubenswrapper[8731]: I1205 12:37:59.828882 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c753373-e1f9-457c-a134-721fce3b1575-config\") pod \"route-controller-manager-b48f6bd98-4npsq\" (UID: \"3c753373-e1f9-457c-a134-721fce3b1575\") " pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:37:59.828975 master-0 kubenswrapper[8731]: I1205 12:37:59.828964 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vvmn\" (UniqueName: \"kubernetes.io/projected/3c753373-e1f9-457c-a134-721fce3b1575-kube-api-access-9vvmn\") pod \"route-controller-manager-b48f6bd98-4npsq\" (UID: \"3c753373-e1f9-457c-a134-721fce3b1575\") " pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:37:59.829426 master-0 kubenswrapper[8731]: I1205 12:37:59.829002 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c753373-e1f9-457c-a134-721fce3b1575-client-ca\") pod \"route-controller-manager-b48f6bd98-4npsq\" (UID: \"3c753373-e1f9-457c-a134-721fce3b1575\") " pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:37:59.829426 master-0 kubenswrapper[8731]: I1205 12:37:59.829064 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c753373-e1f9-457c-a134-721fce3b1575-serving-cert\") pod \"route-controller-manager-b48f6bd98-4npsq\" (UID: \"3c753373-e1f9-457c-a134-721fce3b1575\") " pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:37:59.877422 master-0 kubenswrapper[8731]: I1205 12:37:59.876873 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq"] Dec 05 12:37:59.930782 master-0 kubenswrapper[8731]: I1205 12:37:59.930639 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vvmn\" (UniqueName: \"kubernetes.io/projected/3c753373-e1f9-457c-a134-721fce3b1575-kube-api-access-9vvmn\") pod \"route-controller-manager-b48f6bd98-4npsq\" (UID: \"3c753373-e1f9-457c-a134-721fce3b1575\") " pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:37:59.930782 master-0 kubenswrapper[8731]: I1205 12:37:59.930798 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c753373-e1f9-457c-a134-721fce3b1575-client-ca\") pod \"route-controller-manager-b48f6bd98-4npsq\" (UID: \"3c753373-e1f9-457c-a134-721fce3b1575\") " pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:37:59.931252 master-0 kubenswrapper[8731]: I1205 12:37:59.930882 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c753373-e1f9-457c-a134-721fce3b1575-serving-cert\") pod \"route-controller-manager-b48f6bd98-4npsq\" (UID: \"3c753373-e1f9-457c-a134-721fce3b1575\") " pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:37:59.931252 master-0 kubenswrapper[8731]: I1205 12:37:59.930914 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c753373-e1f9-457c-a134-721fce3b1575-config\") pod \"route-controller-manager-b48f6bd98-4npsq\" (UID: \"3c753373-e1f9-457c-a134-721fce3b1575\") " pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:37:59.932813 master-0 kubenswrapper[8731]: I1205 12:37:59.932738 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c753373-e1f9-457c-a134-721fce3b1575-client-ca\") pod \"route-controller-manager-b48f6bd98-4npsq\" (UID: \"3c753373-e1f9-457c-a134-721fce3b1575\") " pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:37:59.933007 master-0 kubenswrapper[8731]: I1205 12:37:59.932953 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c753373-e1f9-457c-a134-721fce3b1575-config\") pod \"route-controller-manager-b48f6bd98-4npsq\" (UID: \"3c753373-e1f9-457c-a134-721fce3b1575\") " pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:37:59.942103 master-0 kubenswrapper[8731]: I1205 12:37:59.942052 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c753373-e1f9-457c-a134-721fce3b1575-serving-cert\") pod \"route-controller-manager-b48f6bd98-4npsq\" (UID: \"3c753373-e1f9-457c-a134-721fce3b1575\") " pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:38:00.024217 master-0 kubenswrapper[8731]: I1205 12:38:00.024042 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vvmn\" (UniqueName: \"kubernetes.io/projected/3c753373-e1f9-457c-a134-721fce3b1575-kube-api-access-9vvmn\") pod \"route-controller-manager-b48f6bd98-4npsq\" (UID: \"3c753373-e1f9-457c-a134-721fce3b1575\") " 
pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:38:00.095013 master-0 kubenswrapper[8731]: I1205 12:38:00.094877 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:38:00.446959 master-0 kubenswrapper[8731]: I1205 12:38:00.446466 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq"] Dec 05 12:38:00.457840 master-0 kubenswrapper[8731]: W1205 12:38:00.457753 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c753373_e1f9_457c_a134_721fce3b1575.slice/crio-78f3bd1c55cef923965fac9726d2f9b634cbb09d4860b2d5a0f0d35bb16ca8fb WatchSource:0}: Error finding container 78f3bd1c55cef923965fac9726d2f9b634cbb09d4860b2d5a0f0d35bb16ca8fb: Status 404 returned error can't find the container with id 78f3bd1c55cef923965fac9726d2f9b634cbb09d4860b2d5a0f0d35bb16ca8fb Dec 05 12:38:00.515933 master-0 kubenswrapper[8731]: I1205 12:38:00.515869 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:38:00.520429 master-0 kubenswrapper[8731]: I1205 12:38:00.520384 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:38:00.520525 master-0 kubenswrapper[8731]: I1205 12:38:00.520449 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:38:00.935804 master-0 kubenswrapper[8731]: I1205 12:38:00.935725 8731 scope.go:117] "RemoveContainer" containerID="45f66b524a7ae3f72102f1c1f147264a8f0120c9900c09db2778e03215486e8a" Dec 05 12:38:00.963564 master-0 kubenswrapper[8731]: I1205 12:38:00.963373 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" event={"ID":"3c753373-e1f9-457c-a134-721fce3b1575","Type":"ContainerStarted","Data":"762ea77408b7f5a306a93d15bedda329d28149d43e08750a9562ca5f23cd1973"} Dec 05 12:38:00.963564 master-0 kubenswrapper[8731]: I1205 12:38:00.963449 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" event={"ID":"3c753373-e1f9-457c-a134-721fce3b1575","Type":"ContainerStarted","Data":"78f3bd1c55cef923965fac9726d2f9b634cbb09d4860b2d5a0f0d35bb16ca8fb"} Dec 05 12:38:00.964008 master-0 kubenswrapper[8731]: I1205 12:38:00.963600 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:38:01.082678 master-0 kubenswrapper[8731]: I1205 12:38:01.078820 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" podStartSLOduration=315.07880077 podStartE2EDuration="5m15.07880077s" podCreationTimestamp="2025-12-05 12:32:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:38:01.078057729 +0000 UTC m=+379.382041926" watchObservedRunningTime="2025-12-05 12:38:01.07880077 +0000 UTC m=+379.382784937" Dec 05 12:38:01.280873 master-0 kubenswrapper[8731]: I1205 12:38:01.280371 8731 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:38:01.936979 master-0 kubenswrapper[8731]: I1205 12:38:01.936808 8731 scope.go:117] "RemoveContainer" containerID="23c3457474b764927ea9138fc09fe8b0080b3d5dcfc7a8c9d9bd33c7ad79d98a" Dec 05 12:38:01.936979 master-0 kubenswrapper[8731]: I1205 12:38:01.936907 8731 scope.go:117] "RemoveContainer" containerID="e23a95f64400e0dbc7fc95b5f04dddc2d0290c35a1bcdf186ddbd3cd8314a14f" Dec 05 12:38:01.937723 master-0 kubenswrapper[8731]: I1205 12:38:01.937017 8731 scope.go:117] "RemoveContainer" containerID="d299754c006efdd8044d5689f796048069701078c77f429aafcfe9eafc6522ec" Dec 05 12:38:01.937723 master-0 kubenswrapper[8731]: I1205 12:38:01.937285 8731 scope.go:117] "RemoveContainer" containerID="7a5d71f74727c7976f8b5ae99bcc9e973922d3b63fcffbddb6ae9dd46ae8aa22" Dec 05 12:38:01.974062 master-0 kubenswrapper[8731]: I1205 12:38:01.974010 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-7bf7f6b755-b2pxs_4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/openshift-apiserver-operator/1.log" Dec 05 12:38:01.974355 master-0 kubenswrapper[8731]: I1205 12:38:01.974316 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" event={"ID":"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e","Type":"ContainerStarted","Data":"95843f9193f5780326180ee5d96855091da4ac76a7b08a6ce8f5f391baac0caf"} Dec 05 12:38:02.985678 master-0 kubenswrapper[8731]: I1205 12:38:02.985256 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-546vz_d53a4886-db25-43a1-825a-66a9a9a58590/openshift-controller-manager-operator/2.log" Dec 05 12:38:02.986723 master-0 kubenswrapper[8731]: I1205 12:38:02.986686 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" event={"ID":"d53a4886-db25-43a1-825a-66a9a9a58590","Type":"ContainerStarted","Data":"7300fdd0ccd012b07cc22015385845a110863d45bf0c343844c7aeba0c0cd40b"} Dec 05 12:38:02.992896 master-0 kubenswrapper[8731]: I1205 12:38:02.992833 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-848f645654-g6nj5_594aaded-5615-4bed-87ee-6173059a73be/kube-controller-manager-operator/2.log" Dec 05 12:38:02.993080 master-0 kubenswrapper[8731]: I1205 12:38:02.992956 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" event={"ID":"594aaded-5615-4bed-87ee-6173059a73be","Type":"ContainerStarted","Data":"fc976579761cd166f544b17e7e21a078085d48a1844a3caee0473f2393e3d972"} Dec 05 12:38:02.996630 master-0 kubenswrapper[8731]: I1205 12:38:02.996563 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-77758bc754-hfqsp_f3792522-fec6-4022-90ac-0b8467fcd625/service-ca-operator/1.log" Dec 05 12:38:02.996754 master-0 kubenswrapper[8731]: I1205 12:38:02.996716 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" 
event={"ID":"f3792522-fec6-4022-90ac-0b8467fcd625","Type":"ContainerStarted","Data":"273fc7466339fc71dbff783d03d786641f9cff2c7e10ab401acbc6c674705b52"} Dec 05 12:38:02.999426 master-0 kubenswrapper[8731]: I1205 12:38:02.999399 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-xxmfp_ba095394-1873-4793-969d-3be979fa0771/authentication-operator/2.log" Dec 05 12:38:02.999500 master-0 kubenswrapper[8731]: I1205 12:38:02.999472 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" event={"ID":"ba095394-1873-4793-969d-3be979fa0771","Type":"ContainerStarted","Data":"400a9419d33e5072253e6a099476c2c681d982530672b0c4be40561f95d01978"} Dec 05 12:38:03.936715 master-0 kubenswrapper[8731]: I1205 12:38:03.936655 8731 scope.go:117] "RemoveContainer" containerID="49b30bfe4873642e053b957d836ec7eddcac24e11ada8325b8fc8b72bfefafe8" Dec 05 12:38:04.935032 master-0 kubenswrapper[8731]: I1205 12:38:04.934675 8731 scope.go:117] "RemoveContainer" containerID="57b070ec273f7f37c3345984984d73c010928b9bb1f5746c1d4e18feee8dc2dc" Dec 05 12:38:04.936428 master-0 kubenswrapper[8731]: I1205 12:38:04.935066 8731 scope.go:117] "RemoveContainer" containerID="1313a091281e52fd967237a2bce92f6479da56820a8f33315dc25623f650fa7b" Dec 05 12:38:04.936428 master-0 kubenswrapper[8731]: I1205 12:38:04.935166 8731 scope.go:117] "RemoveContainer" containerID="3507046fb798191fc9af19cde11e6d29feed57ad8ee65fd82dade1b688773700" Dec 05 12:38:04.936428 master-0 kubenswrapper[8731]: I1205 12:38:04.935453 8731 scope.go:117] "RemoveContainer" containerID="142cac7db86d510a3cb1fe121b732aea43f370fa9eb0fe98a9655b028e584160" Dec 05 12:38:04.936428 master-0 kubenswrapper[8731]: E1205 12:38:04.935808 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-7r5wv_openshift-cluster-storage-operator(b9623eb8-55d2-4c5c-aa8d-74b6a27274d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" podUID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" Dec 05 12:38:05.016463 master-0 kubenswrapper[8731]: I1205 12:38:05.016401 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-b9c5dfc78-2n8gt_7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/kube-storage-version-migrator-operator/2.log" Dec 05 12:38:05.016757 master-0 kubenswrapper[8731]: I1205 12:38:05.016566 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" event={"ID":"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9","Type":"ContainerStarted","Data":"6b78f4686886eb46c40366678ccd87c7785bc499aa4eabf81ddb13759dd9ebc7"} Dec 05 12:38:05.248799 master-0 kubenswrapper[8731]: I1205 12:38:05.248745 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:38:05.934565 master-0 kubenswrapper[8731]: I1205 12:38:05.934475 8731 scope.go:117] "RemoveContainer" containerID="551cae19e34d48fca769f493289bed8d81be6209860af5e4e43fa9850482cf12" Dec 05 12:38:06.026022 master-0 kubenswrapper[8731]: I1205 12:38:06.025564 8731 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-765d9ff747-rw57t_807d9093-aa67-4840-b5be-7f3abcc1beed/kube-apiserver-operator/2.log" Dec 05 12:38:06.031633 master-0 kubenswrapper[8731]: I1205 12:38:06.026083 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" event={"ID":"807d9093-aa67-4840-b5be-7f3abcc1beed","Type":"ContainerStarted","Data":"96432bd98ca024e492fc580cbc73eb38cd510787da2af19671c5dce6d570c07d"} Dec 05 12:38:06.031633 master-0 kubenswrapper[8731]: I1205 12:38:06.029786 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-79767b7ff9-h8qkj_5efad170-c154-42ec-a7c0-b36a98d2bfcc/network-operator/2.log" Dec 05 12:38:06.031633 master-0 kubenswrapper[8731]: I1205 12:38:06.029842 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" event={"ID":"5efad170-c154-42ec-a7c0-b36a98d2bfcc","Type":"ContainerStarted","Data":"0b797ac3c4b54a3959f9e93f6e0af3ca69c035c47e6f5d5a251314015696c012"} Dec 05 12:38:06.032275 master-0 kubenswrapper[8731]: I1205 12:38:06.032251 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5bf4d88c6f-dxd24_f119ffe4-16bd-49eb-916d-b18ba0d79b54/etcd-operator/2.log" Dec 05 12:38:06.032346 master-0 kubenswrapper[8731]: I1205 12:38:06.032308 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" event={"ID":"f119ffe4-16bd-49eb-916d-b18ba0d79b54","Type":"ContainerStarted","Data":"b907a1534b6517c416df2085c7f1d267c7cb079929611943c0ef4097c8c96c8d"} Dec 05 12:38:07.043971 master-0 kubenswrapper[8731]: I1205 12:38:07.043890 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-56fcb6cc5f-q9njf_ce3d73c1-f4bd-4c91-936a-086dfa5e3460/cluster-olm-operator/1.log" Dec 05 12:38:07.045492 master-0 kubenswrapper[8731]: I1205 12:38:07.044960 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" event={"ID":"ce3d73c1-f4bd-4c91-936a-086dfa5e3460","Type":"ContainerStarted","Data":"5911e299ea12124949df6b53fe6e36667af26bd5976d0d79c6027eddac8ef8b5"} Dec 05 12:38:10.461210 master-0 kubenswrapper[8731]: I1205 12:38:10.460590 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rzl84"] Dec 05 12:38:10.465201 master-0 kubenswrapper[8731]: I1205 12:38:10.462600 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rzl84" Dec 05 12:38:10.474201 master-0 kubenswrapper[8731]: I1205 12:38:10.471724 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 12:38:10.474201 master-0 kubenswrapper[8731]: I1205 12:38:10.472441 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 12:38:10.474201 master-0 kubenswrapper[8731]: I1205 12:38:10.472742 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-5tl2j" Dec 05 12:38:10.474201 master-0 kubenswrapper[8731]: I1205 12:38:10.473311 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 12:38:10.474201 master-0 kubenswrapper[8731]: I1205 12:38:10.473501 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 12:38:10.491213 master-0 kubenswrapper[8731]: I1205 12:38:10.490335 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rzl84"] Dec 05 12:38:10.556025 master-0 kubenswrapper[8731]: I1205 12:38:10.553598 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-dcvtr"] Dec 05 12:38:10.556025 master-0 kubenswrapper[8731]: I1205 12:38:10.554369 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.569068 master-0 kubenswrapper[8731]: I1205 12:38:10.560464 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"tuned-dockercfg-5khzw" Dec 05 12:38:10.569385 master-0 kubenswrapper[8731]: I1205 12:38:10.569353 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-422c9\" (UniqueName: \"kubernetes.io/projected/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-kube-api-access-422c9\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:38:10.569893 master-0 kubenswrapper[8731]: I1205 12:38:10.569848 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-config-volume\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:38:10.569969 master-0 kubenswrapper[8731]: I1205 12:38:10.569935 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-metrics-tls\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:38:10.575061 master-0 kubenswrapper[8731]: I1205 12:38:10.574116 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-f6j7m"] Dec 05 12:38:10.575317 master-0 kubenswrapper[8731]: I1205 12:38:10.575118 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-f6j7m" Dec 05 12:38:10.582336 master-0 kubenswrapper[8731]: I1205 12:38:10.581614 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-2vcf7" Dec 05 12:38:10.643070 master-0 kubenswrapper[8731]: I1205 12:38:10.642999 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj"] Dec 05 12:38:10.644606 master-0 kubenswrapper[8731]: I1205 12:38:10.643985 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:38:10.659210 master-0 kubenswrapper[8731]: I1205 12:38:10.650650 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-zknmp" Dec 05 12:38:10.659210 master-0 kubenswrapper[8731]: I1205 12:38:10.650942 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Dec 05 12:38:10.659210 master-0 kubenswrapper[8731]: I1205 12:38:10.651192 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Dec 05 12:38:10.659210 master-0 kubenswrapper[8731]: I1205 12:38:10.651343 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Dec 05 12:38:10.659210 master-0 kubenswrapper[8731]: I1205 12:38:10.651470 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Dec 05 12:38:10.673213 master-0 kubenswrapper[8731]: I1205 12:38:10.664875 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj"] Dec 05 12:38:10.673213 master-0 kubenswrapper[8731]: I1205 12:38:10.673030 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-run\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.673213 master-0 kubenswrapper[8731]: I1205 12:38:10.673077 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-sys\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.673213 master-0 kubenswrapper[8731]: I1205 12:38:10.673109 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-config-volume\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:38:10.673213 master-0 kubenswrapper[8731]: I1205 12:38:10.673157 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-tuned\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 
05 12:38:10.673213 master-0 kubenswrapper[8731]: I1205 12:38:10.673205 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-lib-modules\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.673213 master-0 kubenswrapper[8731]: I1205 12:38:10.673224 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-host\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.673213 master-0 kubenswrapper[8731]: I1205 12:38:10.673243 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-var-lib-kubelet\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.673703 master-0 kubenswrapper[8731]: I1205 12:38:10.673264 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2635f9f-219b-4d03-b5b3-496c0c836fae-tmp\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.673703 master-0 kubenswrapper[8731]: I1205 12:38:10.673360 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysconfig\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.679204 master-0 kubenswrapper[8731]: I1205 12:38:10.674319 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-config-volume\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:38:10.679204 master-0 kubenswrapper[8731]: I1205 12:38:10.675559 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-metrics-tls\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:38:10.679204 master-0 kubenswrapper[8731]: I1205 12:38:10.676834 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-kubernetes\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.679204 master-0 kubenswrapper[8731]: I1205 12:38:10.676865 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-systemd\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " 
pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.679204 master-0 kubenswrapper[8731]: I1205 12:38:10.676902 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-422c9\" (UniqueName: \"kubernetes.io/projected/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-kube-api-access-422c9\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:38:10.679204 master-0 kubenswrapper[8731]: I1205 12:38:10.676930 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysctl-conf\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.679204 master-0 kubenswrapper[8731]: I1205 12:38:10.676964 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwrwm\" (UniqueName: \"kubernetes.io/projected/f2635f9f-219b-4d03-b5b3-496c0c836fae-kube-api-access-fwrwm\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.679204 master-0 kubenswrapper[8731]: I1205 12:38:10.676985 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysctl-d\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.679204 master-0 kubenswrapper[8731]: I1205 12:38:10.677001 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-modprobe-d\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.689987 master-0 kubenswrapper[8731]: I1205 12:38:10.689674 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh"] Dec 05 12:38:10.689987 master-0 kubenswrapper[8731]: I1205 12:38:10.689937 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" podUID="29812c4b-48ac-488c-863c-1d52e39ea2ae" containerName="cluster-version-operator" containerID="cri-o://611db95b41286905ea53cbc63db74b99b42b65b967c7368704d8de37b85458a0" gracePeriod=130 Dec 05 12:38:10.705253 master-0 kubenswrapper[8731]: I1205 12:38:10.699133 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-metrics-tls\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:38:10.720405 master-0 kubenswrapper[8731]: I1205 12:38:10.720261 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-422c9\" (UniqueName: \"kubernetes.io/projected/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-kube-api-access-422c9\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:38:10.737537 master-0 kubenswrapper[8731]: 
I1205 12:38:10.737452 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t"] Dec 05 12:38:10.738608 master-0 kubenswrapper[8731]: I1205 12:38:10.738567 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:10.743114 master-0 kubenswrapper[8731]: I1205 12:38:10.743063 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Dec 05 12:38:10.745942 master-0 kubenswrapper[8731]: I1205 12:38:10.743561 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Dec 05 12:38:10.745942 master-0 kubenswrapper[8731]: I1205 12:38:10.743803 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Dec 05 12:38:10.745942 master-0 kubenswrapper[8731]: I1205 12:38:10.744032 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 05 12:38:10.745942 master-0 kubenswrapper[8731]: I1205 12:38:10.744170 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-m6vhr" Dec 05 12:38:10.745942 master-0 kubenswrapper[8731]: I1205 12:38:10.744321 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 12:38:10.745942 master-0 kubenswrapper[8731]: I1205 12:38:10.745605 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn"] Dec 05 12:38:10.748873 master-0 kubenswrapper[8731]: I1205 12:38:10.746632 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" Dec 05 12:38:10.750659 master-0 kubenswrapper[8731]: I1205 12:38:10.749933 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-fb2xd" Dec 05 12:38:10.750659 master-0 kubenswrapper[8731]: I1205 12:38:10.750000 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 12:38:10.750659 master-0 kubenswrapper[8731]: I1205 12:38:10.750163 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 12:38:10.750659 master-0 kubenswrapper[8731]: I1205 12:38:10.750264 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 12:38:10.750659 master-0 kubenswrapper[8731]: I1205 12:38:10.750406 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 12:38:10.750659 master-0 kubenswrapper[8731]: I1205 12:38:10.750454 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 12:38:10.762230 master-0 kubenswrapper[8731]: I1205 12:38:10.762164 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7"] Dec 05 12:38:10.762877 master-0 kubenswrapper[8731]: I1205 12:38:10.762852 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" Dec 05 12:38:10.765237 master-0 kubenswrapper[8731]: I1205 12:38:10.765151 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m"] Dec 05 12:38:10.766749 master-0 kubenswrapper[8731]: I1205 12:38:10.766728 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" Dec 05 12:38:10.767248 master-0 kubenswrapper[8731]: I1205 12:38:10.767200 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Dec 05 12:38:10.767858 master-0 kubenswrapper[8731]: I1205 12:38:10.767837 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m"] Dec 05 12:38:10.770510 master-0 kubenswrapper[8731]: I1205 12:38:10.770296 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-j8gcn" Dec 05 12:38:10.770777 master-0 kubenswrapper[8731]: I1205 12:38:10.770599 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 12:38:10.770851 master-0 kubenswrapper[8731]: I1205 12:38:10.770824 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-s9ftm" Dec 05 12:38:10.771035 master-0 kubenswrapper[8731]: I1205 12:38:10.771011 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 12:38:10.771159 master-0 kubenswrapper[8731]: I1205 12:38:10.771131 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7"] Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.778986 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-tuned\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.779027 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-lib-modules\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.779051 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-host\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.779095 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/665c4362-e2e5-4f96-92c0-1746c63c7422-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" (UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.779128 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-var-lib-kubelet\") pod 
\"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.779148 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2635f9f-219b-4d03-b5b3-496c0c836fae-tmp\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.779170 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/159e5ddd-ce04-491a-996f-7c7b4bcec546-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-74f484689c-lbt9t\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.779246 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss5kh\" (UniqueName: \"kubernetes.io/projected/3d96c85a-fc88-46af-83d5-6c71ec6e2c23-kube-api-access-ss5kh\") pod \"cluster-storage-operator-dcf7fc84b-vxmv7\" (UID: \"3d96c85a-fc88-46af-83d5-6c71ec6e2c23\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.779408 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-var-lib-kubelet\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.779515 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-host\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.780359 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-lib-modules\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781034 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx4gb\" (UniqueName: \"kubernetes.io/projected/cb32aaee-e2bd-4023-a95e-48786016725b-kube-api-access-vx4gb\") pod \"machine-approver-f797d8546-k5pmn\" (UID: \"cb32aaee-e2bd-4023-a95e-48786016725b\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781094 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6wsq\" (UniqueName: \"kubernetes.io/projected/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f-kube-api-access-b6wsq\") pod \"cluster-samples-operator-797cfd8b47-6v84m\" (UID: \"7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781146 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysconfig\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781256 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-kubernetes\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781298 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f-samples-operator-tls\") pod \"cluster-samples-operator-797cfd8b47-6v84m\" (UID: \"7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781325 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb32aaee-e2bd-4023-a95e-48786016725b-auth-proxy-config\") pod \"machine-approver-f797d8546-k5pmn\" (UID: \"cb32aaee-e2bd-4023-a95e-48786016725b\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781354 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-systemd\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781381 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/159e5ddd-ce04-491a-996f-7c7b4bcec546-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-74f484689c-lbt9t\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781468 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb32aaee-e2bd-4023-a95e-48786016725b-config\") pod \"machine-approver-f797d8546-k5pmn\" (UID: \"cb32aaee-e2bd-4023-a95e-48786016725b\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781494 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/159e5ddd-ce04-491a-996f-7c7b4bcec546-images\") pod \"cluster-cloud-controller-manager-operator-74f484689c-lbt9t\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781515 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27z6k\" (UniqueName: \"kubernetes.io/projected/159e5ddd-ce04-491a-996f-7c7b4bcec546-kube-api-access-27z6k\") pod \"cluster-cloud-controller-manager-operator-74f484689c-lbt9t\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781551 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysctl-conf\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781592 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwrwm\" (UniqueName: \"kubernetes.io/projected/f2635f9f-219b-4d03-b5b3-496c0c836fae-kube-api-access-fwrwm\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781602 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-kubernetes\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781675 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-systemd\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781701 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysconfig\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781614 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lckv7\" (UniqueName: \"kubernetes.io/projected/665c4362-e2e5-4f96-92c0-1746c63c7422-kube-api-access-lckv7\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" (UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781747 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysctl-d\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 
12:38:10.781901 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysctl-conf\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781937 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysctl-d\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.781971 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-modprobe-d\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.782001 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/665c4362-e2e5-4f96-92c0-1746c63c7422-cco-trusted-ca\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" (UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.782108 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-modprobe-d\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.782171 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cb32aaee-e2bd-4023-a95e-48786016725b-machine-approver-tls\") pod \"machine-approver-f797d8546-k5pmn\" (UID: \"cb32aaee-e2bd-4023-a95e-48786016725b\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.782223 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d96c85a-fc88-46af-83d5-6c71ec6e2c23-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-dcf7fc84b-vxmv7\" (UID: \"3d96c85a-fc88-46af-83d5-6c71ec6e2c23\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.782251 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmjn7\" (UniqueName: \"kubernetes.io/projected/bc18a83a-998e-458e-87f0-d5368da52e1b-kube-api-access-bmjn7\") pod \"node-resolver-f6j7m\" (UID: \"bc18a83a-998e-458e-87f0-d5368da52e1b\") " pod="openshift-dns/node-resolver-f6j7m" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.782269 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-run\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.782305 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-sys\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.782320 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc18a83a-998e-458e-87f0-d5368da52e1b-hosts-file\") pod \"node-resolver-f6j7m\" (UID: \"bc18a83a-998e-458e-87f0-d5368da52e1b\") " pod="openshift-dns/node-resolver-f6j7m" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.782338 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/159e5ddd-ce04-491a-996f-7c7b4bcec546-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-74f484689c-lbt9t\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.782441 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-sys\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.782498 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-run\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.783572 master-0 kubenswrapper[8731]: I1205 12:38:10.782653 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2635f9f-219b-4d03-b5b3-496c0c836fae-tmp\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.785925 master-0 kubenswrapper[8731]: I1205 12:38:10.784085 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-tuned\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.785925 master-0 kubenswrapper[8731]: I1205 12:38:10.785403 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 12:38:10.817081 master-0 kubenswrapper[8731]: I1205 12:38:10.817017 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rzl84" Dec 05 12:38:10.834215 master-0 kubenswrapper[8731]: I1205 12:38:10.831096 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwrwm\" (UniqueName: \"kubernetes.io/projected/f2635f9f-219b-4d03-b5b3-496c0c836fae-kube-api-access-fwrwm\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.836553 master-0 kubenswrapper[8731]: I1205 12:38:10.836517 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r"] Dec 05 12:38:10.840208 master-0 kubenswrapper[8731]: I1205 12:38:10.837544 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:38:10.843947 master-0 kubenswrapper[8731]: I1205 12:38:10.842681 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 12:38:10.844682 master-0 kubenswrapper[8731]: I1205 12:38:10.844617 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-6t5rm" Dec 05 12:38:10.844906 master-0 kubenswrapper[8731]: I1205 12:38:10.844853 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Dec 05 12:38:10.845062 master-0 kubenswrapper[8731]: I1205 12:38:10.845038 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 12:38:10.845306 master-0 kubenswrapper[8731]: I1205 12:38:10.845203 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Dec 05 12:38:10.851269 master-0 kubenswrapper[8731]: I1205 12:38:10.851161 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-55965856b6-q9qdg"] Dec 05 12:38:10.859112 master-0 kubenswrapper[8731]: I1205 12:38:10.852224 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:38:10.859112 master-0 kubenswrapper[8731]: I1205 12:38:10.854653 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Dec 05 12:38:10.873510 master-0 kubenswrapper[8731]: I1205 12:38:10.872936 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-8mxmd" Dec 05 12:38:10.873510 master-0 kubenswrapper[8731]: I1205 12:38:10.873364 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Dec 05 12:38:10.873825 master-0 kubenswrapper[8731]: I1205 12:38:10.873647 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Dec 05 12:38:10.873863 master-0 kubenswrapper[8731]: I1205 12:38:10.873844 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Dec 05 12:38:10.874169 master-0 kubenswrapper[8731]: I1205 12:38:10.874071 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Dec 05 12:38:10.879797 master-0 kubenswrapper[8731]: I1205 12:38:10.879723 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj"] Dec 05 12:38:10.887687 master-0 kubenswrapper[8731]: I1205 12:38:10.885886 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lckv7\" (UniqueName: \"kubernetes.io/projected/665c4362-e2e5-4f96-92c0-1746c63c7422-kube-api-access-lckv7\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" (UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:38:10.887687 master-0 kubenswrapper[8731]: I1205 12:38:10.885959 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/665c4362-e2e5-4f96-92c0-1746c63c7422-cco-trusted-ca\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" (UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:38:10.887687 master-0 kubenswrapper[8731]: I1205 12:38:10.886081 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cb32aaee-e2bd-4023-a95e-48786016725b-machine-approver-tls\") pod \"machine-approver-f797d8546-k5pmn\" (UID: \"cb32aaee-e2bd-4023-a95e-48786016725b\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" Dec 05 12:38:10.887687 master-0 kubenswrapper[8731]: I1205 12:38:10.886120 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14df948-1ec4-4785-ad33-28d1e7063959-service-ca-bundle\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:38:10.887687 master-0 kubenswrapper[8731]: I1205 12:38:10.886168 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d96c85a-fc88-46af-83d5-6c71ec6e2c23-cluster-storage-operator-serving-cert\") pod 
\"cluster-storage-operator-dcf7fc84b-vxmv7\" (UID: \"3d96c85a-fc88-46af-83d5-6c71ec6e2c23\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" Dec 05 12:38:10.887687 master-0 kubenswrapper[8731]: I1205 12:38:10.886248 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmjn7\" (UniqueName: \"kubernetes.io/projected/bc18a83a-998e-458e-87f0-d5368da52e1b-kube-api-access-bmjn7\") pod \"node-resolver-f6j7m\" (UID: \"bc18a83a-998e-458e-87f0-d5368da52e1b\") " pod="openshift-dns/node-resolver-f6j7m" Dec 05 12:38:10.887687 master-0 kubenswrapper[8731]: I1205 12:38:10.886330 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a14df948-1ec4-4785-ad33-28d1e7063959-serving-cert\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:38:10.887687 master-0 kubenswrapper[8731]: I1205 12:38:10.886447 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/159e5ddd-ce04-491a-996f-7c7b4bcec546-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-74f484689c-lbt9t\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:10.896096 master-0 kubenswrapper[8731]: I1205 12:38:10.886794 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc18a83a-998e-458e-87f0-d5368da52e1b-hosts-file\") pod \"node-resolver-f6j7m\" (UID: \"bc18a83a-998e-458e-87f0-d5368da52e1b\") " pod="openshift-dns/node-resolver-f6j7m" Dec 05 12:38:10.908639 master-0 kubenswrapper[8731]: I1205 12:38:10.908561 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/665c4362-e2e5-4f96-92c0-1746c63c7422-cco-trusted-ca\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" (UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:38:10.908899 master-0 kubenswrapper[8731]: I1205 12:38:10.908654 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc18a83a-998e-458e-87f0-d5368da52e1b-hosts-file\") pod \"node-resolver-f6j7m\" (UID: \"bc18a83a-998e-458e-87f0-d5368da52e1b\") " pod="openshift-dns/node-resolver-f6j7m" Dec 05 12:38:10.908899 master-0 kubenswrapper[8731]: I1205 12:38:10.908730 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.910403 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/159e5ddd-ce04-491a-996f-7c7b4bcec546-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-74f484689c-lbt9t\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.913541 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.913972 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.914237 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.914506 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a14df948-1ec4-4785-ad33-28d1e7063959-snapshots\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.914738 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14df948-1ec4-4785-ad33-28d1e7063959-trusted-ca-bundle\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.914786 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/665c4362-e2e5-4f96-92c0-1746c63c7422-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" (UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.914827 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/159e5ddd-ce04-491a-996f-7c7b4bcec546-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-74f484689c-lbt9t\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.914971 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss5kh\" (UniqueName: \"kubernetes.io/projected/3d96c85a-fc88-46af-83d5-6c71ec6e2c23-kube-api-access-ss5kh\") pod \"cluster-storage-operator-dcf7fc84b-vxmv7\" (UID: \"3d96c85a-fc88-46af-83d5-6c71ec6e2c23\") " 
pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.914999 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g7n7\" (UniqueName: \"kubernetes.io/projected/a14df948-1ec4-4785-ad33-28d1e7063959-kube-api-access-2g7n7\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.915053 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx4gb\" (UniqueName: \"kubernetes.io/projected/cb32aaee-e2bd-4023-a95e-48786016725b-kube-api-access-vx4gb\") pod \"machine-approver-f797d8546-k5pmn\" (UID: \"cb32aaee-e2bd-4023-a95e-48786016725b\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.915081 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6wsq\" (UniqueName: \"kubernetes.io/projected/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f-kube-api-access-b6wsq\") pod \"cluster-samples-operator-797cfd8b47-6v84m\" (UID: \"7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.915127 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f-samples-operator-tls\") pod \"cluster-samples-operator-797cfd8b47-6v84m\" (UID: \"7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.915153 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb32aaee-e2bd-4023-a95e-48786016725b-auth-proxy-config\") pod \"machine-approver-f797d8546-k5pmn\" (UID: \"cb32aaee-e2bd-4023-a95e-48786016725b\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.915198 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb32aaee-e2bd-4023-a95e-48786016725b-config\") pod \"machine-approver-f797d8546-k5pmn\" (UID: \"cb32aaee-e2bd-4023-a95e-48786016725b\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.915222 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/159e5ddd-ce04-491a-996f-7c7b4bcec546-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-74f484689c-lbt9t\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.915245 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/159e5ddd-ce04-491a-996f-7c7b4bcec546-images\") pod 
\"cluster-cloud-controller-manager-operator-74f484689c-lbt9t\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.915271 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27z6k\" (UniqueName: \"kubernetes.io/projected/159e5ddd-ce04-491a-996f-7c7b4bcec546-kube-api-access-27z6k\") pod \"cluster-cloud-controller-manager-operator-74f484689c-lbt9t\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.915997 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-zpvbv" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.918261 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/159e5ddd-ce04-491a-996f-7c7b4bcec546-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-74f484689c-lbt9t\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.918572 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb32aaee-e2bd-4023-a95e-48786016725b-config\") pod \"machine-approver-f797d8546-k5pmn\" (UID: \"cb32aaee-e2bd-4023-a95e-48786016725b\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.919011 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb32aaee-e2bd-4023-a95e-48786016725b-auth-proxy-config\") pod \"machine-approver-f797d8546-k5pmn\" (UID: \"cb32aaee-e2bd-4023-a95e-48786016725b\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.919161 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/159e5ddd-ce04-491a-996f-7c7b4bcec546-images\") pod \"cluster-cloud-controller-manager-operator-74f484689c-lbt9t\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.919224 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j"] Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.919542 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cb32aaee-e2bd-4023-a95e-48786016725b-machine-approver-tls\") pod \"machine-approver-f797d8546-k5pmn\" (UID: \"cb32aaee-e2bd-4023-a95e-48786016725b\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" Dec 05 12:38:10.920227 master-0 kubenswrapper[8731]: I1205 12:38:10.919786 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:38:10.921323 master-0 kubenswrapper[8731]: I1205 12:38:10.920431 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/665c4362-e2e5-4f96-92c0-1746c63c7422-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" (UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:38:10.928774 master-0 kubenswrapper[8731]: I1205 12:38:10.927022 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" Dec 05 12:38:10.929522 master-0 kubenswrapper[8731]: I1205 12:38:10.929419 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d96c85a-fc88-46af-83d5-6c71ec6e2c23-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-dcf7fc84b-vxmv7\" (UID: \"3d96c85a-fc88-46af-83d5-6c71ec6e2c23\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" Dec 05 12:38:10.929783 master-0 kubenswrapper[8731]: I1205 12:38:10.929733 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/159e5ddd-ce04-491a-996f-7c7b4bcec546-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-74f484689c-lbt9t\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:10.932879 master-0 kubenswrapper[8731]: I1205 12:38:10.931270 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-jbzfz" Dec 05 12:38:10.932879 master-0 kubenswrapper[8731]: I1205 12:38:10.931534 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 12:38:10.932879 master-0 kubenswrapper[8731]: I1205 12:38:10.932585 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k"] Dec 05 12:38:10.933651 master-0 kubenswrapper[8731]: I1205 12:38:10.933628 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:10.948241 master-0 kubenswrapper[8731]: I1205 12:38:10.937034 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Dec 05 12:38:10.948241 master-0 kubenswrapper[8731]: I1205 12:38:10.937262 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Dec 05 12:38:10.948241 master-0 kubenswrapper[8731]: I1205 12:38:10.937422 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Dec 05 12:38:10.948241 master-0 kubenswrapper[8731]: I1205 12:38:10.937473 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Dec 05 12:38:10.948241 master-0 kubenswrapper[8731]: I1205 12:38:10.937626 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-cz7x2" Dec 05 12:38:10.948241 master-0 kubenswrapper[8731]: I1205 12:38:10.942075 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r"] Dec 05 12:38:10.948241 master-0 kubenswrapper[8731]: I1205 12:38:10.942133 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-55965856b6-q9qdg"] Dec 05 12:38:10.948241 master-0 kubenswrapper[8731]: I1205 12:38:10.945820 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj"] Dec 05 12:38:10.958844 master-0 kubenswrapper[8731]: I1205 12:38:10.948946 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f-samples-operator-tls\") pod \"cluster-samples-operator-797cfd8b47-6v84m\" (UID: \"7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" Dec 05 12:38:10.958844 master-0 kubenswrapper[8731]: I1205 12:38:10.952087 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27z6k\" (UniqueName: \"kubernetes.io/projected/159e5ddd-ce04-491a-996f-7c7b4bcec546-kube-api-access-27z6k\") pod \"cluster-cloud-controller-manager-operator-74f484689c-lbt9t\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:10.958844 master-0 kubenswrapper[8731]: I1205 12:38:10.953125 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lckv7\" (UniqueName: \"kubernetes.io/projected/665c4362-e2e5-4f96-92c0-1746c63c7422-kube-api-access-lckv7\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" (UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:38:10.958844 master-0 kubenswrapper[8731]: I1205 12:38:10.954081 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmjn7\" (UniqueName: \"kubernetes.io/projected/bc18a83a-998e-458e-87f0-d5368da52e1b-kube-api-access-bmjn7\") pod \"node-resolver-f6j7m\" (UID: \"bc18a83a-998e-458e-87f0-d5368da52e1b\") " pod="openshift-dns/node-resolver-f6j7m" Dec 05 12:38:10.958844 
master-0 kubenswrapper[8731]: I1205 12:38:10.954664 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx4gb\" (UniqueName: \"kubernetes.io/projected/cb32aaee-e2bd-4023-a95e-48786016725b-kube-api-access-vx4gb\") pod \"machine-approver-f797d8546-k5pmn\" (UID: \"cb32aaee-e2bd-4023-a95e-48786016725b\") " pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" Dec 05 12:38:10.958844 master-0 kubenswrapper[8731]: I1205 12:38:10.957710 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6wsq\" (UniqueName: \"kubernetes.io/projected/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f-kube-api-access-b6wsq\") pod \"cluster-samples-operator-797cfd8b47-6v84m\" (UID: \"7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" Dec 05 12:38:10.959600 master-0 kubenswrapper[8731]: I1205 12:38:10.959212 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k"] Dec 05 12:38:10.974941 master-0 kubenswrapper[8731]: I1205 12:38:10.971552 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss5kh\" (UniqueName: \"kubernetes.io/projected/3d96c85a-fc88-46af-83d5-6c71ec6e2c23-kube-api-access-ss5kh\") pod \"cluster-storage-operator-dcf7fc84b-vxmv7\" (UID: \"3d96c85a-fc88-46af-83d5-6c71ec6e2c23\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" Dec 05 12:38:10.974941 master-0 kubenswrapper[8731]: I1205 12:38:10.971680 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j"] Dec 05 12:38:10.974941 master-0 kubenswrapper[8731]: I1205 12:38:10.974791 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-88d48b57d-x947v"] Dec 05 12:38:10.975323 master-0 kubenswrapper[8731]: I1205 12:38:10.974946 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" Dec 05 12:38:10.983212 master-0 kubenswrapper[8731]: I1205 12:38:10.980196 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:38:10.983212 master-0 kubenswrapper[8731]: I1205 12:38:10.982694 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-88d48b57d-x947v"] Dec 05 12:38:10.993658 master-0 kubenswrapper[8731]: W1205 12:38:10.989336 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2635f9f_219b_4d03_b5b3_496c0c836fae.slice/crio-b3ecec2aa414e2dc966b3af1e3db3667edb0ce30dd8be08c7dc1e26871633e6e WatchSource:0}: Error finding container b3ecec2aa414e2dc966b3af1e3db3667edb0ce30dd8be08c7dc1e26871633e6e: Status 404 returned error can't find the container with id b3ecec2aa414e2dc966b3af1e3db3667edb0ce30dd8be08c7dc1e26871633e6e Dec 05 12:38:10.993658 master-0 kubenswrapper[8731]: I1205 12:38:10.989932 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 12:38:10.993658 master-0 kubenswrapper[8731]: I1205 12:38:10.989954 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 12:38:10.993658 master-0 kubenswrapper[8731]: I1205 12:38:10.990042 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-wrm9q" Dec 05 12:38:10.993658 master-0 kubenswrapper[8731]: I1205 12:38:10.990287 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 12:38:10.995670 master-0 kubenswrapper[8731]: I1205 12:38:10.995595 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" Dec 05 12:38:11.012897 master-0 kubenswrapper[8731]: I1205 12:38:11.008925 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz"] Dec 05 12:38:11.012897 master-0 kubenswrapper[8731]: I1205 12:38:11.010018 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:38:11.013659 master-0 kubenswrapper[8731]: I1205 12:38:11.013518 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x"] Dec 05 12:38:11.013916 master-0 kubenswrapper[8731]: I1205 12:38:11.013866 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:38:11.015260 master-0 kubenswrapper[8731]: I1205 12:38:11.014294 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 12:38:11.015260 master-0 kubenswrapper[8731]: I1205 12:38:11.014294 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-285qn" Dec 05 12:38:11.015260 master-0 kubenswrapper[8731]: I1205 12:38:11.014629 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:38:11.018342 master-0 kubenswrapper[8731]: I1205 12:38:11.015885 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a45f340c-0eca-4460-8961-4ca360467eeb-available-featuregates\") pod \"openshift-config-operator-68758cbcdb-tmzpj\" (UID: \"a45f340c-0eca-4460-8961-4ca360467eeb\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:38:11.018342 master-0 kubenswrapper[8731]: I1205 12:38:11.015917 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb46q\" (UniqueName: \"kubernetes.io/projected/e5dfcb1e-1231-4f07-8c21-748965718729-kube-api-access-pb46q\") pod \"cluster-autoscaler-operator-5f49d774cd-vdb8r\" (UID: \"e5dfcb1e-1231-4f07-8c21-748965718729\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:38:11.018342 master-0 kubenswrapper[8731]: I1205 12:38:11.015961 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14df948-1ec4-4785-ad33-28d1e7063959-service-ca-bundle\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:38:11.018342 master-0 kubenswrapper[8731]: I1205 12:38:11.015979 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5dfcb1e-1231-4f07-8c21-748965718729-cert\") pod \"cluster-autoscaler-operator-5f49d774cd-vdb8r\" (UID: \"e5dfcb1e-1231-4f07-8c21-748965718729\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:38:11.018342 master-0 kubenswrapper[8731]: I1205 12:38:11.016008 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a14df948-1ec4-4785-ad33-28d1e7063959-serving-cert\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:38:11.018342 master-0 kubenswrapper[8731]: I1205 12:38:11.016028 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a14df948-1ec4-4785-ad33-28d1e7063959-snapshots\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:38:11.018342 master-0 kubenswrapper[8731]: I1205 12:38:11.016045 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14df948-1ec4-4785-ad33-28d1e7063959-trusted-ca-bundle\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:38:11.018342 master-0 kubenswrapper[8731]: I1205 12:38:11.016064 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7ftf\" (UniqueName: \"kubernetes.io/projected/a45f340c-0eca-4460-8961-4ca360467eeb-kube-api-access-r7ftf\") pod \"openshift-config-operator-68758cbcdb-tmzpj\" (UID: 
\"a45f340c-0eca-4460-8961-4ca360467eeb\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:38:11.018342 master-0 kubenswrapper[8731]: I1205 12:38:11.016084 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g7n7\" (UniqueName: \"kubernetes.io/projected/a14df948-1ec4-4785-ad33-28d1e7063959-kube-api-access-2g7n7\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:38:11.018342 master-0 kubenswrapper[8731]: I1205 12:38:11.016104 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e5dfcb1e-1231-4f07-8c21-748965718729-auth-proxy-config\") pod \"cluster-autoscaler-operator-5f49d774cd-vdb8r\" (UID: \"e5dfcb1e-1231-4f07-8c21-748965718729\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:38:11.018342 master-0 kubenswrapper[8731]: I1205 12:38:11.016134 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a45f340c-0eca-4460-8961-4ca360467eeb-serving-cert\") pod \"openshift-config-operator-68758cbcdb-tmzpj\" (UID: \"a45f340c-0eca-4460-8961-4ca360467eeb\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:38:11.018342 master-0 kubenswrapper[8731]: I1205 12:38:11.017780 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14df948-1ec4-4785-ad33-28d1e7063959-service-ca-bundle\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:38:11.018342 master-0 kubenswrapper[8731]: I1205 12:38:11.018329 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14df948-1ec4-4785-ad33-28d1e7063959-trusted-ca-bundle\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:38:11.018912 master-0 kubenswrapper[8731]: I1205 12:38:11.018788 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a14df948-1ec4-4785-ad33-28d1e7063959-snapshots\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:38:11.020328 master-0 kubenswrapper[8731]: I1205 12:38:11.020288 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 12:38:11.020506 master-0 kubenswrapper[8731]: I1205 12:38:11.020474 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 12:38:11.025214 master-0 kubenswrapper[8731]: I1205 12:38:11.022062 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a14df948-1ec4-4785-ad33-28d1e7063959-serving-cert\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 
12:38:11.037750 master-0 kubenswrapper[8731]: I1205 12:38:11.037599 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx"] Dec 05 12:38:11.040538 master-0 kubenswrapper[8731]: E1205 12:38:11.038324 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29812c4b-48ac-488c-863c-1d52e39ea2ae" containerName="cluster-version-operator" Dec 05 12:38:11.040538 master-0 kubenswrapper[8731]: I1205 12:38:11.038340 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="29812c4b-48ac-488c-863c-1d52e39ea2ae" containerName="cluster-version-operator" Dec 05 12:38:11.040538 master-0 kubenswrapper[8731]: I1205 12:38:11.038445 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="29812c4b-48ac-488c-863c-1d52e39ea2ae" containerName="cluster-version-operator" Dec 05 12:38:11.040538 master-0 kubenswrapper[8731]: I1205 12:38:11.039082 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:38:11.040538 master-0 kubenswrapper[8731]: I1205 12:38:11.039323 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g7n7\" (UniqueName: \"kubernetes.io/projected/a14df948-1ec4-4785-ad33-28d1e7063959-kube-api-access-2g7n7\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:38:11.044814 master-0 kubenswrapper[8731]: I1205 12:38:11.043014 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:38:11.044814 master-0 kubenswrapper[8731]: I1205 12:38:11.043556 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz"] Dec 05 12:38:11.044814 master-0 kubenswrapper[8731]: I1205 12:38:11.044754 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 12:38:11.044814 master-0 kubenswrapper[8731]: I1205 12:38:11.044787 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-xvwgq" Dec 05 12:38:11.045257 master-0 kubenswrapper[8731]: I1205 12:38:11.044820 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 12:38:11.045257 master-0 kubenswrapper[8731]: I1205 12:38:11.044859 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 12:38:11.045257 master-0 kubenswrapper[8731]: I1205 12:38:11.044898 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 12:38:11.045389 master-0 kubenswrapper[8731]: I1205 12:38:11.045270 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 12:38:11.048773 master-0 kubenswrapper[8731]: I1205 12:38:11.048729 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x"] Dec 05 12:38:11.067342 master-0 kubenswrapper[8731]: I1205 12:38:11.064861 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:11.079978 master-0 kubenswrapper[8731]: I1205 12:38:11.076403 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx"] Dec 05 12:38:11.086627 master-0 kubenswrapper[8731]: I1205 12:38:11.086591 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" Dec 05 12:38:11.106360 master-0 kubenswrapper[8731]: I1205 12:38:11.105545 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" event={"ID":"f2635f9f-219b-4d03-b5b3-496c0c836fae","Type":"ContainerStarted","Data":"b3ecec2aa414e2dc966b3af1e3db3667edb0ce30dd8be08c7dc1e26871633e6e"} Dec 05 12:38:11.111883 master-0 kubenswrapper[8731]: I1205 12:38:11.111826 8731 generic.go:334] "Generic (PLEG): container finished" podID="29812c4b-48ac-488c-863c-1d52e39ea2ae" containerID="611db95b41286905ea53cbc63db74b99b42b65b967c7368704d8de37b85458a0" exitCode=0 Dec 05 12:38:11.111968 master-0 kubenswrapper[8731]: I1205 12:38:11.111896 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" event={"ID":"29812c4b-48ac-488c-863c-1d52e39ea2ae","Type":"ContainerDied","Data":"611db95b41286905ea53cbc63db74b99b42b65b967c7368704d8de37b85458a0"} Dec 05 12:38:11.111968 master-0 kubenswrapper[8731]: I1205 12:38:11.111938 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" event={"ID":"29812c4b-48ac-488c-863c-1d52e39ea2ae","Type":"ContainerDied","Data":"16660d02bb2781827fb05b56da3da55397e61aedd1747341b89ed543b687f8e3"} Dec 05 12:38:11.112038 master-0 kubenswrapper[8731]: I1205 12:38:11.111968 8731 scope.go:117] "RemoveContainer" containerID="611db95b41286905ea53cbc63db74b99b42b65b967c7368704d8de37b85458a0" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.116874 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") pod \"29812c4b-48ac-488c-863c-1d52e39ea2ae\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.116987 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/29812c4b-48ac-488c-863c-1d52e39ea2ae-service-ca\") pod \"29812c4b-48ac-488c-863c-1d52e39ea2ae\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117015 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-ssl-certs\") pod \"29812c4b-48ac-488c-863c-1d52e39ea2ae\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117067 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29812c4b-48ac-488c-863c-1d52e39ea2ae-kube-api-access\") pod \"29812c4b-48ac-488c-863c-1d52e39ea2ae\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " Dec 05 12:38:11.131224 master-0 
kubenswrapper[8731]: I1205 12:38:11.117128 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-cvo-updatepayloads\") pod \"29812c4b-48ac-488c-863c-1d52e39ea2ae\" (UID: \"29812c4b-48ac-488c-863c-1d52e39ea2ae\") " Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117331 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117404 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/db2e54b6-4879-40f4-9359-a8b0c31e76c2-srv-cert\") pod \"catalog-operator-fbc6455c4-jmn7x\" (UID: \"db2e54b6-4879-40f4-9359-a8b0c31e76c2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117441 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqblj\" (UniqueName: \"kubernetes.io/projected/531b8927-92db-4e9d-9a0a-12ff948cdaad-kube-api-access-xqblj\") pod \"control-plane-machine-set-operator-7df95c79b5-ldg5j\" (UID: \"531b8927-92db-4e9d-9a0a-12ff948cdaad\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117491 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ee7a76b-cf1d-4513-b314-5aa314da818d-images\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117526 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5dfcb1e-1231-4f07-8c21-748965718729-cert\") pod \"cluster-autoscaler-operator-5f49d774cd-vdb8r\" (UID: \"e5dfcb1e-1231-4f07-8c21-748965718729\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117562 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ee7a76b-cf1d-4513-b314-5aa314da818d-machine-api-operator-tls\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117588 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-images\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117614 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/db2e54b6-4879-40f4-9359-a8b0c31e76c2-profile-collector-cert\") pod \"catalog-operator-fbc6455c4-jmn7x\" (UID: \"db2e54b6-4879-40f4-9359-a8b0c31e76c2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117645 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht5kr\" (UniqueName: \"kubernetes.io/projected/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-kube-api-access-ht5kr\") pod \"olm-operator-7cd7dbb44c-xdbtz\" (UID: \"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117673 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/531b8927-92db-4e9d-9a0a-12ff948cdaad-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-7df95c79b5-ldg5j\" (UID: \"531b8927-92db-4e9d-9a0a-12ff948cdaad\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117696 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cert\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117717 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkqq7\" (UniqueName: \"kubernetes.io/projected/a280c582-685e-47ac-bf6b-248aa0c129a9-kube-api-access-xkqq7\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117769 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7ftf\" (UniqueName: \"kubernetes.io/projected/a45f340c-0eca-4460-8961-4ca360467eeb-kube-api-access-r7ftf\") pod \"openshift-config-operator-68758cbcdb-tmzpj\" (UID: \"a45f340c-0eca-4460-8961-4ca360467eeb\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117810 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e5dfcb1e-1231-4f07-8c21-748965718729-auth-proxy-config\") pod \"cluster-autoscaler-operator-5f49d774cd-vdb8r\" (UID: \"e5dfcb1e-1231-4f07-8c21-748965718729\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.117839 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkdtr\" (UniqueName: \"kubernetes.io/projected/1ee7a76b-cf1d-4513-b314-5aa314da818d-kube-api-access-lkdtr\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:38:11.131224 
master-0 kubenswrapper[8731]: I1205 12:38:11.117861 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-config\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.118526 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwz29\" (UniqueName: \"kubernetes.io/projected/db2e54b6-4879-40f4-9359-a8b0c31e76c2-kube-api-access-nwz29\") pod \"catalog-operator-fbc6455c4-jmn7x\" (UID: \"db2e54b6-4879-40f4-9359-a8b0c31e76c2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.118564 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-profile-collector-cert\") pod \"olm-operator-7cd7dbb44c-xdbtz\" (UID: \"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.118588 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a45f340c-0eca-4460-8961-4ca360467eeb-serving-cert\") pod \"openshift-config-operator-68758cbcdb-tmzpj\" (UID: \"a45f340c-0eca-4460-8961-4ca360467eeb\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.118607 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-srv-cert\") pod \"olm-operator-7cd7dbb44c-xdbtz\" (UID: \"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.118623 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.118652 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee7a76b-cf1d-4513-b314-5aa314da818d-config\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.118669 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a45f340c-0eca-4460-8961-4ca360467eeb-available-featuregates\") pod \"openshift-config-operator-68758cbcdb-tmzpj\" (UID: \"a45f340c-0eca-4460-8961-4ca360467eeb\") " 
pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.118689 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb46q\" (UniqueName: \"kubernetes.io/projected/e5dfcb1e-1231-4f07-8c21-748965718729-kube-api-access-pb46q\") pod \"cluster-autoscaler-operator-5f49d774cd-vdb8r\" (UID: \"e5dfcb1e-1231-4f07-8c21-748965718729\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.121811 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "29812c4b-48ac-488c-863c-1d52e39ea2ae" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae"). InnerVolumeSpecName "etc-cvo-updatepayloads". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.122683 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29812c4b-48ac-488c-863c-1d52e39ea2ae-service-ca" (OuterVolumeSpecName: "service-ca") pod "29812c4b-48ac-488c-863c-1d52e39ea2ae" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.122690 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "29812c4b-48ac-488c-863c-1d52e39ea2ae" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae"). InnerVolumeSpecName "etc-ssl-certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.122738 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a45f340c-0eca-4460-8961-4ca360467eeb-available-featuregates\") pod \"openshift-config-operator-68758cbcdb-tmzpj\" (UID: \"a45f340c-0eca-4460-8961-4ca360467eeb\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:38:11.131224 master-0 kubenswrapper[8731]: I1205 12:38:11.123856 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e5dfcb1e-1231-4f07-8c21-748965718729-auth-proxy-config\") pod \"cluster-autoscaler-operator-5f49d774cd-vdb8r\" (UID: \"e5dfcb1e-1231-4f07-8c21-748965718729\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:38:11.132846 master-0 kubenswrapper[8731]: I1205 12:38:11.131789 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "29812c4b-48ac-488c-863c-1d52e39ea2ae" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:38:11.132846 master-0 kubenswrapper[8731]: I1205 12:38:11.132202 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29812c4b-48ac-488c-863c-1d52e39ea2ae-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "29812c4b-48ac-488c-863c-1d52e39ea2ae" (UID: "29812c4b-48ac-488c-863c-1d52e39ea2ae"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:38:11.142992 master-0 kubenswrapper[8731]: I1205 12:38:11.142438 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5dfcb1e-1231-4f07-8c21-748965718729-cert\") pod \"cluster-autoscaler-operator-5f49d774cd-vdb8r\" (UID: \"e5dfcb1e-1231-4f07-8c21-748965718729\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:38:11.148283 master-0 kubenswrapper[8731]: I1205 12:38:11.148229 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb46q\" (UniqueName: \"kubernetes.io/projected/e5dfcb1e-1231-4f07-8c21-748965718729-kube-api-access-pb46q\") pod \"cluster-autoscaler-operator-5f49d774cd-vdb8r\" (UID: \"e5dfcb1e-1231-4f07-8c21-748965718729\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:38:11.152566 master-0 kubenswrapper[8731]: I1205 12:38:11.150433 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a45f340c-0eca-4460-8961-4ca360467eeb-serving-cert\") pod \"openshift-config-operator-68758cbcdb-tmzpj\" (UID: \"a45f340c-0eca-4460-8961-4ca360467eeb\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:38:11.153912 master-0 kubenswrapper[8731]: W1205 12:38:11.153277 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod159e5ddd_ce04_491a_996f_7c7b4bcec546.slice/crio-bccef0932d1cbf8543a5017aa6fc3ec91308392451786d0877281c1041d23958 WatchSource:0}: Error finding container bccef0932d1cbf8543a5017aa6fc3ec91308392451786d0877281c1041d23958: Status 404 returned error can't find the container with id bccef0932d1cbf8543a5017aa6fc3ec91308392451786d0877281c1041d23958 Dec 05 12:38:11.157126 master-0 kubenswrapper[8731]: I1205 12:38:11.155127 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7ftf\" (UniqueName: \"kubernetes.io/projected/a45f340c-0eca-4460-8961-4ca360467eeb-kube-api-access-r7ftf\") pod \"openshift-config-operator-68758cbcdb-tmzpj\" (UID: \"a45f340c-0eca-4460-8961-4ca360467eeb\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:38:11.172029 master-0 kubenswrapper[8731]: I1205 12:38:11.171949 8731 scope.go:117] "RemoveContainer" containerID="611db95b41286905ea53cbc63db74b99b42b65b967c7368704d8de37b85458a0" Dec 05 12:38:11.172603 master-0 kubenswrapper[8731]: E1205 12:38:11.172557 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"611db95b41286905ea53cbc63db74b99b42b65b967c7368704d8de37b85458a0\": container with ID starting with 611db95b41286905ea53cbc63db74b99b42b65b967c7368704d8de37b85458a0 not found: ID does not exist" containerID="611db95b41286905ea53cbc63db74b99b42b65b967c7368704d8de37b85458a0" Dec 05 12:38:11.172668 master-0 kubenswrapper[8731]: I1205 12:38:11.172623 8731 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"611db95b41286905ea53cbc63db74b99b42b65b967c7368704d8de37b85458a0"} err="failed to get container status \"611db95b41286905ea53cbc63db74b99b42b65b967c7368704d8de37b85458a0\": rpc error: code = NotFound desc = could not find container \"611db95b41286905ea53cbc63db74b99b42b65b967c7368704d8de37b85458a0\": container with ID starting with 611db95b41286905ea53cbc63db74b99b42b65b967c7368704d8de37b85458a0 not found: ID does not exist" Dec 05 12:38:11.215858 master-0 kubenswrapper[8731]: I1205 12:38:11.199592 8731 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.224998 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkqq7\" (UniqueName: \"kubernetes.io/projected/a280c582-685e-47ac-bf6b-248aa0c129a9-kube-api-access-xkqq7\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225071 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkdtr\" (UniqueName: \"kubernetes.io/projected/1ee7a76b-cf1d-4513-b314-5aa314da818d-kube-api-access-lkdtr\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225090 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-config\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225126 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-images\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225153 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwz29\" (UniqueName: \"kubernetes.io/projected/db2e54b6-4879-40f4-9359-a8b0c31e76c2-kube-api-access-nwz29\") pod \"catalog-operator-fbc6455c4-jmn7x\" (UID: \"db2e54b6-4879-40f4-9359-a8b0c31e76c2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225171 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-profile-collector-cert\") pod \"olm-operator-7cd7dbb44c-xdbtz\" (UID: \"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225255 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-srv-cert\") pod \"olm-operator-7cd7dbb44c-xdbtz\" (UID: \"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225276 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225295 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-auth-proxy-config\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225322 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee7a76b-cf1d-4513-b314-5aa314da818d-config\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225350 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/db2e54b6-4879-40f4-9359-a8b0c31e76c2-srv-cert\") pod \"catalog-operator-fbc6455c4-jmn7x\" (UID: \"db2e54b6-4879-40f4-9359-a8b0c31e76c2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225379 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqblj\" (UniqueName: \"kubernetes.io/projected/531b8927-92db-4e9d-9a0a-12ff948cdaad-kube-api-access-xqblj\") pod \"control-plane-machine-set-operator-7df95c79b5-ldg5j\" (UID: \"531b8927-92db-4e9d-9a0a-12ff948cdaad\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225397 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ee7a76b-cf1d-4513-b314-5aa314da818d-images\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225420 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ee7a76b-cf1d-4513-b314-5aa314da818d-machine-api-operator-tls\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225437 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-images\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225455 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/db2e54b6-4879-40f4-9359-a8b0c31e76c2-profile-collector-cert\") pod \"catalog-operator-fbc6455c4-jmn7x\" (UID: \"db2e54b6-4879-40f4-9359-a8b0c31e76c2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225485 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht5kr\" (UniqueName: \"kubernetes.io/projected/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-kube-api-access-ht5kr\") pod \"olm-operator-7cd7dbb44c-xdbtz\" (UID: \"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225513 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cert\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225530 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqjgb\" (UniqueName: \"kubernetes.io/projected/365bf663-fd5b-44df-a327-0438995c015d-kube-api-access-lqjgb\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225549 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/531b8927-92db-4e9d-9a0a-12ff948cdaad-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-7df95c79b5-ldg5j\" (UID: \"531b8927-92db-4e9d-9a0a-12ff948cdaad\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225567 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/365bf663-fd5b-44df-a327-0438995c015d-proxy-tls\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225602 8731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29812c4b-48ac-488c-863c-1d52e39ea2ae-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225614 8731 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/29812c4b-48ac-488c-863c-1d52e39ea2ae-service-ca\") on node \"master-0\" DevicePath \"\"" 
Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225624 8731 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-ssl-certs\") on node \"master-0\" DevicePath \"\"" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225636 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29812c4b-48ac-488c-863c-1d52e39ea2ae-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.225645 8731 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/29812c4b-48ac-488c-863c-1d52e39ea2ae-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\"" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.227042 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-config\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.242078 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee7a76b-cf1d-4513-b314-5aa314da818d-config\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.246515 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-f6j7m" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.254978 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-images\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:11.291901 master-0 kubenswrapper[8731]: I1205 12:38:11.258145 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ee7a76b-cf1d-4513-b314-5aa314da818d-images\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:38:11.357409 master-0 kubenswrapper[8731]: I1205 12:38:11.356095 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-srv-cert\") pod \"olm-operator-7cd7dbb44c-xdbtz\" (UID: \"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:38:11.357485 master-0 kubenswrapper[8731]: I1205 12:38:11.357442 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/db2e54b6-4879-40f4-9359-a8b0c31e76c2-profile-collector-cert\") pod \"catalog-operator-fbc6455c4-jmn7x\" (UID: \"db2e54b6-4879-40f4-9359-a8b0c31e76c2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:38:11.361116 master-0 kubenswrapper[8731]: I1205 12:38:11.357597 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht5kr\" (UniqueName: \"kubernetes.io/projected/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-kube-api-access-ht5kr\") pod \"olm-operator-7cd7dbb44c-xdbtz\" (UID: \"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:38:11.361116 master-0 kubenswrapper[8731]: I1205 12:38:11.358148 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkdtr\" (UniqueName: \"kubernetes.io/projected/1ee7a76b-cf1d-4513-b314-5aa314da818d-kube-api-access-lkdtr\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:38:11.361116 master-0 kubenswrapper[8731]: I1205 12:38:11.359299 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.366863 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-profile-collector-cert\") pod \"olm-operator-7cd7dbb44c-xdbtz\" (UID: \"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.367534 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ee7a76b-cf1d-4513-b314-5aa314da818d-machine-api-operator-tls\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.368637 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.369631 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.369673 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/db2e54b6-4879-40f4-9359-a8b0c31e76c2-srv-cert\") pod \"catalog-operator-fbc6455c4-jmn7x\" (UID: \"db2e54b6-4879-40f4-9359-a8b0c31e76c2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: W1205 12:38:11.369791 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc18a83a_998e_458e_87f0_d5368da52e1b.slice/crio-ed538f41551e0e7b372ee4dcc843f84e56fe8d6677fe847816efda02bfd61218 WatchSource:0}: Error finding container ed538f41551e0e7b372ee4dcc843f84e56fe8d6677fe847816efda02bfd61218: Status 404 returned error can't find the container with id ed538f41551e0e7b372ee4dcc843f84e56fe8d6677fe847816efda02bfd61218 Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.370022 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/531b8927-92db-4e9d-9a0a-12ff948cdaad-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-7df95c79b5-ldg5j\" (UID: \"531b8927-92db-4e9d-9a0a-12ff948cdaad\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.370806 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqjgb\" (UniqueName: \"kubernetes.io/projected/365bf663-fd5b-44df-a327-0438995c015d-kube-api-access-lqjgb\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " 
pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.370888 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/365bf663-fd5b-44df-a327-0438995c015d-proxy-tls\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.370957 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-images\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.371298 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-auth-proxy-config\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.372501 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cert\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.374484 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkqq7\" (UniqueName: \"kubernetes.io/projected/a280c582-685e-47ac-bf6b-248aa0c129a9-kube-api-access-xkqq7\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.375455 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwz29\" (UniqueName: \"kubernetes.io/projected/db2e54b6-4879-40f4-9359-a8b0c31e76c2-kube-api-access-nwz29\") pod \"catalog-operator-fbc6455c4-jmn7x\" (UID: \"db2e54b6-4879-40f4-9359-a8b0c31e76c2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.378277 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqblj\" (UniqueName: \"kubernetes.io/projected/531b8927-92db-4e9d-9a0a-12ff948cdaad-kube-api-access-xqblj\") pod \"control-plane-machine-set-operator-7df95c79b5-ldg5j\" (UID: \"531b8927-92db-4e9d-9a0a-12ff948cdaad\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.378720 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.380491 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/365bf663-fd5b-44df-a327-0438995c015d-proxy-tls\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:38:11.381842 master-0 kubenswrapper[8731]: I1205 12:38:11.380921 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-images\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:38:11.383238 master-0 kubenswrapper[8731]: I1205 12:38:11.382734 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-auth-proxy-config\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:38:11.385245 master-0 kubenswrapper[8731]: I1205 12:38:11.385191 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rzl84"] Dec 05 12:38:11.394173 master-0 kubenswrapper[8731]: I1205 12:38:11.391074 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" Dec 05 12:38:11.401022 master-0 kubenswrapper[8731]: I1205 12:38:11.398880 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqjgb\" (UniqueName: \"kubernetes.io/projected/365bf663-fd5b-44df-a327-0438995c015d-kube-api-access-lqjgb\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:38:11.416067 master-0 kubenswrapper[8731]: I1205 12:38:11.415712 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:38:11.422716 master-0 kubenswrapper[8731]: I1205 12:38:11.422570 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7"] Dec 05 12:38:11.434209 master-0 kubenswrapper[8731]: I1205 12:38:11.433308 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m"] Dec 05 12:38:11.451683 master-0 kubenswrapper[8731]: I1205 12:38:11.451447 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj"] Dec 05 12:38:11.461910 master-0 kubenswrapper[8731]: I1205 12:38:11.461514 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:38:11.474857 master-0 kubenswrapper[8731]: I1205 12:38:11.464528 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh"] Dec 05 12:38:11.474857 master-0 kubenswrapper[8731]: W1205 12:38:11.466562 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod665c4362_e2e5_4f96_92c0_1746c63c7422.slice/crio-24b6227b14f227965d3702a28c5ff0f7f572415f72495d988769ab39d10c0d8f WatchSource:0}: Error finding container 24b6227b14f227965d3702a28c5ff0f7f572415f72495d988769ab39d10c0d8f: Status 404 returned error can't find the container with id 24b6227b14f227965d3702a28c5ff0f7f572415f72495d988769ab39d10c0d8f Dec 05 12:38:11.474857 master-0 kubenswrapper[8731]: I1205 12:38:11.467136 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-77dfcc565f-2chqh"] Dec 05 12:38:11.506078 master-0 kubenswrapper[8731]: I1205 12:38:11.506007 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:38:11.526079 master-0 kubenswrapper[8731]: I1205 12:38:11.524816 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5"] Dec 05 12:38:11.528531 master-0 kubenswrapper[8731]: I1205 12:38:11.527254 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.532476 master-0 kubenswrapper[8731]: I1205 12:38:11.532437 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-tjfgr" Dec 05 12:38:11.532671 master-0 kubenswrapper[8731]: I1205 12:38:11.532625 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 12:38:11.532727 master-0 kubenswrapper[8731]: I1205 12:38:11.532710 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 12:38:11.547359 master-0 kubenswrapper[8731]: I1205 12:38:11.541249 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 12:38:11.550229 master-0 kubenswrapper[8731]: I1205 12:38:11.548752 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:38:11.583757 master-0 kubenswrapper[8731]: I1205 12:38:11.583646 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d0792bf-e2da-4ee7-91fe-032299cea42f-serving-cert\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.583757 master-0 kubenswrapper[8731]: I1205 12:38:11.583719 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d0792bf-e2da-4ee7-91fe-032299cea42f-kube-api-access\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.583890 master-0 kubenswrapper[8731]: I1205 12:38:11.583765 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7d0792bf-e2da-4ee7-91fe-032299cea42f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.585387 master-0 kubenswrapper[8731]: I1205 12:38:11.583799 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d0792bf-e2da-4ee7-91fe-032299cea42f-service-ca\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.585598 master-0 kubenswrapper[8731]: I1205 12:38:11.585576 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7d0792bf-e2da-4ee7-91fe-032299cea42f-etc-ssl-certs\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.657036 master-0 kubenswrapper[8731]: I1205 12:38:11.656981 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r"] Dec 05 12:38:11.673343 master-0 kubenswrapper[8731]: I1205 12:38:11.670538 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:38:11.687693 master-0 kubenswrapper[8731]: I1205 12:38:11.687642 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d0792bf-e2da-4ee7-91fe-032299cea42f-serving-cert\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.687868 master-0 kubenswrapper[8731]: I1205 12:38:11.687706 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d0792bf-e2da-4ee7-91fe-032299cea42f-kube-api-access\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.687868 master-0 kubenswrapper[8731]: I1205 12:38:11.687741 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7d0792bf-e2da-4ee7-91fe-032299cea42f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.687868 master-0 kubenswrapper[8731]: I1205 12:38:11.687777 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d0792bf-e2da-4ee7-91fe-032299cea42f-service-ca\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.687868 master-0 kubenswrapper[8731]: I1205 12:38:11.687812 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7d0792bf-e2da-4ee7-91fe-032299cea42f-etc-ssl-certs\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.687995 master-0 kubenswrapper[8731]: I1205 12:38:11.687902 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7d0792bf-e2da-4ee7-91fe-032299cea42f-etc-ssl-certs\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.688766 master-0 kubenswrapper[8731]: I1205 12:38:11.688721 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-55965856b6-q9qdg"] Dec 05 12:38:11.688852 master-0 kubenswrapper[8731]: I1205 12:38:11.688835 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7d0792bf-e2da-4ee7-91fe-032299cea42f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.690344 master-0 kubenswrapper[8731]: I1205 12:38:11.690301 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d0792bf-e2da-4ee7-91fe-032299cea42f-service-ca\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.700092 master-0 kubenswrapper[8731]: I1205 12:38:11.695913 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d0792bf-e2da-4ee7-91fe-032299cea42f-serving-cert\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.701776 master-0 kubenswrapper[8731]: W1205 12:38:11.701652 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5dfcb1e_1231_4f07_8c21_748965718729.slice/crio-12c707b6a686095bb6b918fa64b447ec88e080a7e32878fed57fd6822470f9a2 WatchSource:0}: Error finding container 12c707b6a686095bb6b918fa64b447ec88e080a7e32878fed57fd6822470f9a2: Status 404 returned error can't find the container with id 12c707b6a686095bb6b918fa64b447ec88e080a7e32878fed57fd6822470f9a2 Dec 05 12:38:11.730253 master-0 kubenswrapper[8731]: I1205 12:38:11.726872 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d0792bf-e2da-4ee7-91fe-032299cea42f-kube-api-access\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.801983 master-0 kubenswrapper[8731]: I1205 12:38:11.797172 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj"] Dec 05 12:38:11.868870 master-0 kubenswrapper[8731]: I1205 12:38:11.865032 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:38:11.896510 master-0 kubenswrapper[8731]: I1205 12:38:11.894245 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j"] Dec 05 12:38:11.911899 master-0 kubenswrapper[8731]: I1205 12:38:11.899482 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k"] Dec 05 12:38:11.911899 master-0 kubenswrapper[8731]: I1205 12:38:11.909439 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-88d48b57d-x947v"] Dec 05 12:38:11.972256 master-0 kubenswrapper[8731]: I1205 12:38:11.972209 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29812c4b-48ac-488c-863c-1d52e39ea2ae" path="/var/lib/kubelet/pods/29812c4b-48ac-488c-863c-1d52e39ea2ae/volumes" Dec 05 12:38:12.023730 master-0 kubenswrapper[8731]: I1205 12:38:12.014011 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x"] Dec 05 12:38:12.035780 master-0 kubenswrapper[8731]: W1205 12:38:12.035741 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb2e54b6_4879_40f4_9359_a8b0c31e76c2.slice/crio-abe43915cc1089507c40de3eaceadf732ca7d07e2f0e1b5a070959328db4199f WatchSource:0}: Error finding container abe43915cc1089507c40de3eaceadf732ca7d07e2f0e1b5a070959328db4199f: Status 404 returned error can't find the container with id abe43915cc1089507c40de3eaceadf732ca7d07e2f0e1b5a070959328db4199f Dec 05 12:38:12.047442 master-0 kubenswrapper[8731]: I1205 12:38:12.047337 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz"] Dec 05 12:38:12.085793 master-0 kubenswrapper[8731]: I1205 12:38:12.085698 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx"] Dec 05 12:38:12.100600 master-0 kubenswrapper[8731]: W1205 12:38:12.100554 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod365bf663_fd5b_44df_a327_0438995c015d.slice/crio-1e7e859b537def1a21239198a62664ddf26773c1c6f156f411606722ed8cb4e6 WatchSource:0}: Error finding container 1e7e859b537def1a21239198a62664ddf26773c1c6f156f411606722ed8cb4e6: Status 404 returned error can't find the container with id 1e7e859b537def1a21239198a62664ddf26773c1c6f156f411606722ed8cb4e6 Dec 05 12:38:12.138899 master-0 kubenswrapper[8731]: I1205 12:38:12.138829 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" event={"ID":"665c4362-e2e5-4f96-92c0-1746c63c7422","Type":"ContainerStarted","Data":"3db5e7135e78833b3a92c45746a15fb15863c2f0a43a694b41b9752901ee6fff"} Dec 05 12:38:12.139217 master-0 kubenswrapper[8731]: I1205 12:38:12.138916 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" event={"ID":"665c4362-e2e5-4f96-92c0-1746c63c7422","Type":"ContainerStarted","Data":"24b6227b14f227965d3702a28c5ff0f7f572415f72495d988769ab39d10c0d8f"} Dec 05 12:38:12.141274 master-0 kubenswrapper[8731]: I1205 12:38:12.141230 8731 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-dns/dns-default-rzl84" event={"ID":"ce9e2a6b-8ce7-477c-8bc7-24033243eabe","Type":"ContainerStarted","Data":"19edfec7b5dad95038c7d84a7af049f95270320317e900ea90d94c12477f0556"} Dec 05 12:38:12.144934 master-0 kubenswrapper[8731]: I1205 12:38:12.144870 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" event={"ID":"159e5ddd-ce04-491a-996f-7c7b4bcec546","Type":"ContainerStarted","Data":"bccef0932d1cbf8543a5017aa6fc3ec91308392451786d0877281c1041d23958"} Dec 05 12:38:12.146268 master-0 kubenswrapper[8731]: I1205 12:38:12.146235 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" event={"ID":"3d96c85a-fc88-46af-83d5-6c71ec6e2c23","Type":"ContainerStarted","Data":"0b836f01dcb43b6af667ba219b4059e3935a66980e122a92a279a33e963cb964"} Dec 05 12:38:12.150774 master-0 kubenswrapper[8731]: I1205 12:38:12.150719 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" event={"ID":"1ee7a76b-cf1d-4513-b314-5aa314da818d","Type":"ContainerStarted","Data":"12b2377bacbd62ee93e11591af977d559716347304347ca9deca90451df150b7"} Dec 05 12:38:12.157818 master-0 kubenswrapper[8731]: I1205 12:38:12.157766 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" event={"ID":"7d0792bf-e2da-4ee7-91fe-032299cea42f","Type":"ContainerStarted","Data":"62b006cd51c7d10f8e6f8e36ec2fbd7c2b472a5db5854f2056fdbe13f97f07e2"} Dec 05 12:38:12.160387 master-0 kubenswrapper[8731]: I1205 12:38:12.160367 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" event={"ID":"a280c582-685e-47ac-bf6b-248aa0c129a9","Type":"ContainerStarted","Data":"5404e1e33c358f139ce43aadf9014fd74254490d058389642b99e6aa71216243"} Dec 05 12:38:12.162603 master-0 kubenswrapper[8731]: I1205 12:38:12.162582 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" event={"ID":"531b8927-92db-4e9d-9a0a-12ff948cdaad","Type":"ContainerStarted","Data":"9ca3179bcac9021f22c3e7255b372820926d29356fd67cac276625618bd240a6"} Dec 05 12:38:12.197616 master-0 kubenswrapper[8731]: I1205 12:38:12.197519 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" event={"ID":"7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f","Type":"ContainerStarted","Data":"04a1540e033fc0d53be3a8dfa10cb49b28b11738b911cb185f8d919660d6db47"} Dec 05 12:38:12.205023 master-0 kubenswrapper[8731]: I1205 12:38:12.204952 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" event={"ID":"f2635f9f-219b-4d03-b5b3-496c0c836fae","Type":"ContainerStarted","Data":"9af8ab651bd63e8bc68f978bbf5aebe3be6cba36632826679028614cf841f7a7"} Dec 05 12:38:12.209546 master-0 kubenswrapper[8731]: I1205 12:38:12.209473 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-55965856b6-q9qdg" event={"ID":"a14df948-1ec4-4785-ad33-28d1e7063959","Type":"ContainerStarted","Data":"dac2262b7105102ce37a8db95766fbd5753d50bed12fb86441b8247f4653fc04"} Dec 05 12:38:12.225875 master-0 kubenswrapper[8731]: I1205 12:38:12.225770 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" event={"ID":"db2e54b6-4879-40f4-9359-a8b0c31e76c2","Type":"ContainerStarted","Data":"abe43915cc1089507c40de3eaceadf732ca7d07e2f0e1b5a070959328db4199f"} Dec 05 12:38:12.233353 master-0 kubenswrapper[8731]: I1205 12:38:12.233229 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" event={"ID":"a45f340c-0eca-4460-8961-4ca360467eeb","Type":"ContainerStarted","Data":"bd884dd8fbf0cb13a01d3369dc09dbcaf952157e210620f5c83187eab601232c"} Dec 05 12:38:12.234518 master-0 kubenswrapper[8731]: I1205 12:38:12.234430 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" podStartSLOduration=2.234407591 podStartE2EDuration="2.234407591s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:38:12.227213075 +0000 UTC m=+390.531197262" watchObservedRunningTime="2025-12-05 12:38:12.234407591 +0000 UTC m=+390.538391748" Dec 05 12:38:12.235952 master-0 kubenswrapper[8731]: I1205 12:38:12.235892 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" event={"ID":"365bf663-fd5b-44df-a327-0438995c015d","Type":"ContainerStarted","Data":"1e7e859b537def1a21239198a62664ddf26773c1c6f156f411606722ed8cb4e6"} Dec 05 12:38:12.239591 master-0 kubenswrapper[8731]: I1205 12:38:12.239522 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f6j7m" event={"ID":"bc18a83a-998e-458e-87f0-d5368da52e1b","Type":"ContainerStarted","Data":"07e07e8abe6c713822a9a9f9d007d69e82226fff5293360065d48b0d20066a24"} Dec 05 12:38:12.239591 master-0 kubenswrapper[8731]: I1205 12:38:12.239592 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f6j7m" event={"ID":"bc18a83a-998e-458e-87f0-d5368da52e1b","Type":"ContainerStarted","Data":"ed538f41551e0e7b372ee4dcc843f84e56fe8d6677fe847816efda02bfd61218"} Dec 05 12:38:12.245667 master-0 kubenswrapper[8731]: I1205 12:38:12.245576 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" event={"ID":"cb32aaee-e2bd-4023-a95e-48786016725b","Type":"ContainerStarted","Data":"a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd"} Dec 05 12:38:12.245667 master-0 kubenswrapper[8731]: I1205 12:38:12.245638 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" event={"ID":"cb32aaee-e2bd-4023-a95e-48786016725b","Type":"ContainerStarted","Data":"6e1d751af2ee301ad26fbf6333d9b7e721f5c4867d46226437328ecdf257aa8a"} Dec 05 12:38:12.251837 master-0 kubenswrapper[8731]: I1205 12:38:12.251785 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" event={"ID":"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb","Type":"ContainerStarted","Data":"4ed24c6b6f900a1eeba45b567c2d9336f6c8e081eea3b175ce81e0e583f37f25"} Dec 05 12:38:12.261533 master-0 kubenswrapper[8731]: I1205 12:38:12.259932 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" 
event={"ID":"e5dfcb1e-1231-4f07-8c21-748965718729","Type":"ContainerStarted","Data":"0881763cdee0ccdba8e5778bd81b5f22280f808126ce0c207bab6ce207f27343"} Dec 05 12:38:12.261533 master-0 kubenswrapper[8731]: I1205 12:38:12.260005 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" event={"ID":"e5dfcb1e-1231-4f07-8c21-748965718729","Type":"ContainerStarted","Data":"12c707b6a686095bb6b918fa64b447ec88e080a7e32878fed57fd6822470f9a2"} Dec 05 12:38:12.264406 master-0 kubenswrapper[8731]: I1205 12:38:12.264330 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-f6j7m" podStartSLOduration=2.264317757 podStartE2EDuration="2.264317757s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:38:12.260639048 +0000 UTC m=+390.564623215" watchObservedRunningTime="2025-12-05 12:38:12.264317757 +0000 UTC m=+390.568301924" Dec 05 12:38:13.317197 master-0 kubenswrapper[8731]: I1205 12:38:13.316760 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" event={"ID":"365bf663-fd5b-44df-a327-0438995c015d","Type":"ContainerStarted","Data":"2ddf5b913c8def77b7ff031d3fc7b9bf753cce46d08c9770d77762f9cc280fa4"} Dec 05 12:38:13.317197 master-0 kubenswrapper[8731]: I1205 12:38:13.317154 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" event={"ID":"365bf663-fd5b-44df-a327-0438995c015d","Type":"ContainerStarted","Data":"667cd6e2494d3e418da699cd959c521ed7b9fd7b51dbacbf2b69ac4e7e52a0ee"} Dec 05 12:38:13.324058 master-0 kubenswrapper[8731]: I1205 12:38:13.323994 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" event={"ID":"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb","Type":"ContainerStarted","Data":"d020dc1da875fa7050b8625e3ab5b871982f94b26fe855432e6788c518f5cf79"} Dec 05 12:38:13.324377 master-0 kubenswrapper[8731]: I1205 12:38:13.324212 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:38:13.327380 master-0 kubenswrapper[8731]: I1205 12:38:13.327329 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" event={"ID":"1ee7a76b-cf1d-4513-b314-5aa314da818d","Type":"ContainerStarted","Data":"fede23ee661b7ea969175a9ba409eaa0d47e0f9069332c22e94196ac525e392e"} Dec 05 12:38:13.330023 master-0 kubenswrapper[8731]: I1205 12:38:13.329950 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" event={"ID":"7d0792bf-e2da-4ee7-91fe-032299cea42f","Type":"ContainerStarted","Data":"b0b6c6e5845f21451ae31807803c6c6a8522e211f03654dc5026b22ef249bb34"} Dec 05 12:38:13.332786 master-0 kubenswrapper[8731]: I1205 12:38:13.332710 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" event={"ID":"db2e54b6-4879-40f4-9359-a8b0c31e76c2","Type":"ContainerStarted","Data":"cdb8d0aeedb7fe170e4e369cb7ea0bb66c8248a41e81c31debeed5037170ef86"} Dec 05 12:38:13.333604 master-0 kubenswrapper[8731]: I1205 12:38:13.333576 8731 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:38:13.334784 master-0 kubenswrapper[8731]: I1205 12:38:13.334747 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:38:13.338658 master-0 kubenswrapper[8731]: I1205 12:38:13.337639 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" podStartSLOduration=3.337508707 podStartE2EDuration="3.337508707s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:38:13.335049439 +0000 UTC m=+391.639033696" watchObservedRunningTime="2025-12-05 12:38:13.337508707 +0000 UTC m=+391.641492864" Dec 05 12:38:13.340480 master-0 kubenswrapper[8731]: I1205 12:38:13.340326 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:38:13.384218 master-0 kubenswrapper[8731]: I1205 12:38:13.381041 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" podStartSLOduration=3.381010034 podStartE2EDuration="3.381010034s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:38:13.379250357 +0000 UTC m=+391.683234524" watchObservedRunningTime="2025-12-05 12:38:13.381010034 +0000 UTC m=+391.684994201" Dec 05 12:38:13.384218 master-0 kubenswrapper[8731]: I1205 12:38:13.382900 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" podStartSLOduration=2.382892515 podStartE2EDuration="2.382892515s" podCreationTimestamp="2025-12-05 12:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:38:13.357342198 +0000 UTC m=+391.661326355" watchObservedRunningTime="2025-12-05 12:38:13.382892515 +0000 UTC m=+391.686876682" Dec 05 12:38:13.387131 master-0 kubenswrapper[8731]: I1205 12:38:13.387105 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2pp25"] Dec 05 12:38:13.388336 master-0 kubenswrapper[8731]: I1205 12:38:13.388320 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:38:13.390684 master-0 kubenswrapper[8731]: I1205 12:38:13.390640 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-wqbtd" Dec 05 12:38:13.405291 master-0 kubenswrapper[8731]: I1205 12:38:13.405205 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" podStartSLOduration=3.405153793 podStartE2EDuration="3.405153793s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:38:13.397412922 +0000 UTC m=+391.701397089" watchObservedRunningTime="2025-12-05 12:38:13.405153793 +0000 UTC m=+391.709137960" Dec 05 12:38:13.406698 master-0 kubenswrapper[8731]: I1205 12:38:13.405313 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2pp25"] Dec 05 12:38:13.424987 master-0 kubenswrapper[8731]: I1205 12:38:13.424908 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74e0607-6ed0-4119-8870-895b7d336830-utilities\") pod \"community-operators-2pp25\" (UID: \"b74e0607-6ed0-4119-8870-895b7d336830\") " pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:38:13.425432 master-0 kubenswrapper[8731]: I1205 12:38:13.425416 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74e0607-6ed0-4119-8870-895b7d336830-catalog-content\") pod \"community-operators-2pp25\" (UID: \"b74e0607-6ed0-4119-8870-895b7d336830\") " pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:38:13.425572 master-0 kubenswrapper[8731]: I1205 12:38:13.425559 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72wst\" (UniqueName: \"kubernetes.io/projected/b74e0607-6ed0-4119-8870-895b7d336830-kube-api-access-72wst\") pod \"community-operators-2pp25\" (UID: \"b74e0607-6ed0-4119-8870-895b7d336830\") " pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:38:13.527862 master-0 kubenswrapper[8731]: I1205 12:38:13.526468 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74e0607-6ed0-4119-8870-895b7d336830-utilities\") pod \"community-operators-2pp25\" (UID: \"b74e0607-6ed0-4119-8870-895b7d336830\") " pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:38:13.527862 master-0 kubenswrapper[8731]: I1205 12:38:13.527275 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74e0607-6ed0-4119-8870-895b7d336830-catalog-content\") pod \"community-operators-2pp25\" (UID: \"b74e0607-6ed0-4119-8870-895b7d336830\") " pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:38:13.527862 master-0 kubenswrapper[8731]: I1205 12:38:13.527350 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72wst\" (UniqueName: \"kubernetes.io/projected/b74e0607-6ed0-4119-8870-895b7d336830-kube-api-access-72wst\") pod \"community-operators-2pp25\" (UID: \"b74e0607-6ed0-4119-8870-895b7d336830\") " 
pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:38:13.527862 master-0 kubenswrapper[8731]: I1205 12:38:13.527486 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74e0607-6ed0-4119-8870-895b7d336830-utilities\") pod \"community-operators-2pp25\" (UID: \"b74e0607-6ed0-4119-8870-895b7d336830\") " pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:38:13.528198 master-0 kubenswrapper[8731]: I1205 12:38:13.527927 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74e0607-6ed0-4119-8870-895b7d336830-catalog-content\") pod \"community-operators-2pp25\" (UID: \"b74e0607-6ed0-4119-8870-895b7d336830\") " pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:38:13.554033 master-0 kubenswrapper[8731]: I1205 12:38:13.553963 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72wst\" (UniqueName: \"kubernetes.io/projected/b74e0607-6ed0-4119-8870-895b7d336830-kube-api-access-72wst\") pod \"community-operators-2pp25\" (UID: \"b74e0607-6ed0-4119-8870-895b7d336830\") " pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:38:13.588336 master-0 kubenswrapper[8731]: I1205 12:38:13.586841 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4p8p6"] Dec 05 12:38:13.588336 master-0 kubenswrapper[8731]: I1205 12:38:13.588077 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:38:13.592251 master-0 kubenswrapper[8731]: I1205 12:38:13.590388 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-rbhdx" Dec 05 12:38:13.602514 master-0 kubenswrapper[8731]: I1205 12:38:13.602006 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4p8p6"] Dec 05 12:38:13.628730 master-0 kubenswrapper[8731]: I1205 12:38:13.628537 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8defe125-1529-4091-adff-e9d17a2b298f-utilities\") pod \"certified-operators-4p8p6\" (UID: \"8defe125-1529-4091-adff-e9d17a2b298f\") " pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:38:13.628730 master-0 kubenswrapper[8731]: I1205 12:38:13.628594 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpxqg\" (UniqueName: \"kubernetes.io/projected/8defe125-1529-4091-adff-e9d17a2b298f-kube-api-access-jpxqg\") pod \"certified-operators-4p8p6\" (UID: \"8defe125-1529-4091-adff-e9d17a2b298f\") " pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:38:13.628730 master-0 kubenswrapper[8731]: I1205 12:38:13.628627 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8defe125-1529-4091-adff-e9d17a2b298f-catalog-content\") pod \"certified-operators-4p8p6\" (UID: \"8defe125-1529-4091-adff-e9d17a2b298f\") " pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:38:13.730823 master-0 kubenswrapper[8731]: I1205 12:38:13.729670 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8defe125-1529-4091-adff-e9d17a2b298f-utilities\") pod \"certified-operators-4p8p6\" (UID: \"8defe125-1529-4091-adff-e9d17a2b298f\") " pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:38:13.730823 master-0 kubenswrapper[8731]: I1205 12:38:13.729725 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpxqg\" (UniqueName: \"kubernetes.io/projected/8defe125-1529-4091-adff-e9d17a2b298f-kube-api-access-jpxqg\") pod \"certified-operators-4p8p6\" (UID: \"8defe125-1529-4091-adff-e9d17a2b298f\") " pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:38:13.730823 master-0 kubenswrapper[8731]: I1205 12:38:13.729752 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8defe125-1529-4091-adff-e9d17a2b298f-catalog-content\") pod \"certified-operators-4p8p6\" (UID: \"8defe125-1529-4091-adff-e9d17a2b298f\") " pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:38:13.730823 master-0 kubenswrapper[8731]: I1205 12:38:13.730297 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8defe125-1529-4091-adff-e9d17a2b298f-catalog-content\") pod \"certified-operators-4p8p6\" (UID: \"8defe125-1529-4091-adff-e9d17a2b298f\") " pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:38:13.730823 master-0 kubenswrapper[8731]: I1205 12:38:13.730509 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8defe125-1529-4091-adff-e9d17a2b298f-utilities\") pod \"certified-operators-4p8p6\" (UID: \"8defe125-1529-4091-adff-e9d17a2b298f\") " pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:38:13.745339 master-0 kubenswrapper[8731]: I1205 12:38:13.744593 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:38:13.754261 master-0 kubenswrapper[8731]: I1205 12:38:13.753628 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c"] Dec 05 12:38:13.754638 master-0 kubenswrapper[8731]: I1205 12:38:13.754624 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:13.766156 master-0 kubenswrapper[8731]: W1205 12:38:13.765470 8731 reflector.go:561] object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert": failed to list *v1.Secret: secrets "packageserver-service-cert" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-operator-lifecycle-manager": no relationship found between node 'master-0' and this object Dec 05 12:38:13.766156 master-0 kubenswrapper[8731]: E1205 12:38:13.765532 8731 reflector.go:158] "Unhandled Error" err="object-\"openshift-operator-lifecycle-manager\"/\"packageserver-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"packageserver-service-cert\" is forbidden: User \"system:node:master-0\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-operator-lifecycle-manager\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Dec 05 12:38:13.772077 master-0 kubenswrapper[8731]: I1205 12:38:13.771956 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpxqg\" (UniqueName: \"kubernetes.io/projected/8defe125-1529-4091-adff-e9d17a2b298f-kube-api-access-jpxqg\") pod \"certified-operators-4p8p6\" (UID: \"8defe125-1529-4091-adff-e9d17a2b298f\") " pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:38:13.783347 master-0 kubenswrapper[8731]: I1205 12:38:13.783304 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c"] Dec 05 12:38:13.833154 master-0 kubenswrapper[8731]: I1205 12:38:13.833044 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b13885ef-d2b5-4591-825d-446cf8729bc1-tmpfs\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:13.833443 master-0 kubenswrapper[8731]: I1205 12:38:13.833337 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmjkp\" (UniqueName: \"kubernetes.io/projected/b13885ef-d2b5-4591-825d-446cf8729bc1-kube-api-access-xmjkp\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:13.833477 master-0 kubenswrapper[8731]: I1205 12:38:13.833438 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-apiservice-cert\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:13.833543 master-0 kubenswrapper[8731]: I1205 12:38:13.833470 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-webhook-cert\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:13.915979 master-0 kubenswrapper[8731]: I1205 
12:38:13.915533 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:38:13.935089 master-0 kubenswrapper[8731]: I1205 12:38:13.933915 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmjkp\" (UniqueName: \"kubernetes.io/projected/b13885ef-d2b5-4591-825d-446cf8729bc1-kube-api-access-xmjkp\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:13.935089 master-0 kubenswrapper[8731]: I1205 12:38:13.933974 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-apiservice-cert\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:13.935089 master-0 kubenswrapper[8731]: I1205 12:38:13.933992 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-webhook-cert\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:13.935089 master-0 kubenswrapper[8731]: I1205 12:38:13.934023 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b13885ef-d2b5-4591-825d-446cf8729bc1-tmpfs\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:13.935089 master-0 kubenswrapper[8731]: I1205 12:38:13.934529 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b13885ef-d2b5-4591-825d-446cf8729bc1-tmpfs\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:13.950399 master-0 kubenswrapper[8731]: I1205 12:38:13.950352 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmjkp\" (UniqueName: \"kubernetes.io/projected/b13885ef-d2b5-4591-825d-446cf8729bc1-kube-api-access-xmjkp\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:14.935287 master-0 kubenswrapper[8731]: E1205 12:38:14.935220 8731 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 12:38:14.935287 master-0 kubenswrapper[8731]: E1205 12:38:14.935274 8731 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 12:38:14.935916 master-0 kubenswrapper[8731]: E1205 12:38:14.935355 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-apiservice-cert podName:b13885ef-d2b5-4591-825d-446cf8729bc1 nodeName:}" failed. 
No retries permitted until 2025-12-05 12:38:15.435325898 +0000 UTC m=+393.739310065 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-apiservice-cert") pod "packageserver-58c5755b49-6dx4c" (UID: "b13885ef-d2b5-4591-825d-446cf8729bc1") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:38:14.935916 master-0 kubenswrapper[8731]: E1205 12:38:14.935436 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-webhook-cert podName:b13885ef-d2b5-4591-825d-446cf8729bc1 nodeName:}" failed. No retries permitted until 2025-12-05 12:38:15.43540572 +0000 UTC m=+393.739389887 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-webhook-cert") pod "packageserver-58c5755b49-6dx4c" (UID: "b13885ef-d2b5-4591-825d-446cf8729bc1") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:38:14.985108 master-0 kubenswrapper[8731]: I1205 12:38:14.985045 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dmnvq"] Dec 05 12:38:14.986826 master-0 kubenswrapper[8731]: I1205 12:38:14.986789 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:38:14.989647 master-0 kubenswrapper[8731]: I1205 12:38:14.989595 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-kj2kk" Dec 05 12:38:14.996488 master-0 kubenswrapper[8731]: I1205 12:38:14.996426 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmnvq"] Dec 05 12:38:15.053947 master-0 kubenswrapper[8731]: I1205 12:38:15.053709 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tncxt\" (UniqueName: \"kubernetes.io/projected/ebfbe878-1796-4a20-b3f0-76165038252e-kube-api-access-tncxt\") pod \"redhat-marketplace-dmnvq\" (UID: \"ebfbe878-1796-4a20-b3f0-76165038252e\") " pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:38:15.053947 master-0 kubenswrapper[8731]: I1205 12:38:15.053773 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebfbe878-1796-4a20-b3f0-76165038252e-catalog-content\") pod \"redhat-marketplace-dmnvq\" (UID: \"ebfbe878-1796-4a20-b3f0-76165038252e\") " pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:38:15.053947 master-0 kubenswrapper[8731]: I1205 12:38:15.053802 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebfbe878-1796-4a20-b3f0-76165038252e-utilities\") pod \"redhat-marketplace-dmnvq\" (UID: \"ebfbe878-1796-4a20-b3f0-76165038252e\") " pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:38:15.093704 master-0 kubenswrapper[8731]: I1205 12:38:15.093636 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 12:38:15.155568 master-0 kubenswrapper[8731]: I1205 12:38:15.155500 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tncxt\" (UniqueName: 
\"kubernetes.io/projected/ebfbe878-1796-4a20-b3f0-76165038252e-kube-api-access-tncxt\") pod \"redhat-marketplace-dmnvq\" (UID: \"ebfbe878-1796-4a20-b3f0-76165038252e\") " pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:38:15.155927 master-0 kubenswrapper[8731]: I1205 12:38:15.155620 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebfbe878-1796-4a20-b3f0-76165038252e-catalog-content\") pod \"redhat-marketplace-dmnvq\" (UID: \"ebfbe878-1796-4a20-b3f0-76165038252e\") " pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:38:15.155927 master-0 kubenswrapper[8731]: I1205 12:38:15.155664 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebfbe878-1796-4a20-b3f0-76165038252e-utilities\") pod \"redhat-marketplace-dmnvq\" (UID: \"ebfbe878-1796-4a20-b3f0-76165038252e\") " pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:38:15.156462 master-0 kubenswrapper[8731]: I1205 12:38:15.156410 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebfbe878-1796-4a20-b3f0-76165038252e-catalog-content\") pod \"redhat-marketplace-dmnvq\" (UID: \"ebfbe878-1796-4a20-b3f0-76165038252e\") " pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:38:15.156840 master-0 kubenswrapper[8731]: I1205 12:38:15.156811 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebfbe878-1796-4a20-b3f0-76165038252e-utilities\") pod \"redhat-marketplace-dmnvq\" (UID: \"ebfbe878-1796-4a20-b3f0-76165038252e\") " pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:38:15.175868 master-0 kubenswrapper[8731]: I1205 12:38:15.175803 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tncxt\" (UniqueName: \"kubernetes.io/projected/ebfbe878-1796-4a20-b3f0-76165038252e-kube-api-access-tncxt\") pod \"redhat-marketplace-dmnvq\" (UID: \"ebfbe878-1796-4a20-b3f0-76165038252e\") " pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:38:15.347245 master-0 kubenswrapper[8731]: I1205 12:38:15.347192 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:38:15.460781 master-0 kubenswrapper[8731]: I1205 12:38:15.460700 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-apiservice-cert\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:15.460781 master-0 kubenswrapper[8731]: I1205 12:38:15.460778 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-webhook-cert\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:15.465923 master-0 kubenswrapper[8731]: I1205 12:38:15.465085 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-webhook-cert\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:15.471288 master-0 kubenswrapper[8731]: I1205 12:38:15.469332 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-apiservice-cert\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:15.606612 master-0 kubenswrapper[8731]: I1205 12:38:15.606479 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:15.858279 master-0 kubenswrapper[8731]: I1205 12:38:15.857369 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-45nwc"] Dec 05 12:38:15.860570 master-0 kubenswrapper[8731]: I1205 12:38:15.858645 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:38:15.866435 master-0 kubenswrapper[8731]: I1205 12:38:15.862688 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-bnqtr" Dec 05 12:38:15.866435 master-0 kubenswrapper[8731]: I1205 12:38:15.862747 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 12:38:15.970449 master-0 kubenswrapper[8731]: I1205 12:38:15.970393 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95d8fb27-8b2b-4749-add3-9e9b16edb693-mcd-auth-proxy-config\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:38:15.970449 master-0 kubenswrapper[8731]: I1205 12:38:15.970449 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb42t\" (UniqueName: \"kubernetes.io/projected/95d8fb27-8b2b-4749-add3-9e9b16edb693-kube-api-access-fb42t\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:38:15.971093 master-0 kubenswrapper[8731]: I1205 12:38:15.970620 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/95d8fb27-8b2b-4749-add3-9e9b16edb693-rootfs\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:38:15.971093 master-0 kubenswrapper[8731]: I1205 12:38:15.970700 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95d8fb27-8b2b-4749-add3-9e9b16edb693-proxy-tls\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:38:16.072345 master-0 kubenswrapper[8731]: I1205 12:38:16.072276 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/95d8fb27-8b2b-4749-add3-9e9b16edb693-rootfs\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:38:16.072345 master-0 kubenswrapper[8731]: I1205 12:38:16.072352 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95d8fb27-8b2b-4749-add3-9e9b16edb693-proxy-tls\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:38:16.072666 master-0 kubenswrapper[8731]: I1205 12:38:16.072402 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/95d8fb27-8b2b-4749-add3-9e9b16edb693-rootfs\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:38:16.072666 master-0 kubenswrapper[8731]: I1205 
12:38:16.072423 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95d8fb27-8b2b-4749-add3-9e9b16edb693-mcd-auth-proxy-config\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:38:16.072666 master-0 kubenswrapper[8731]: I1205 12:38:16.072449 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb42t\" (UniqueName: \"kubernetes.io/projected/95d8fb27-8b2b-4749-add3-9e9b16edb693-kube-api-access-fb42t\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:38:16.073947 master-0 kubenswrapper[8731]: I1205 12:38:16.073898 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95d8fb27-8b2b-4749-add3-9e9b16edb693-mcd-auth-proxy-config\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:38:16.088604 master-0 kubenswrapper[8731]: I1205 12:38:16.076261 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95d8fb27-8b2b-4749-add3-9e9b16edb693-proxy-tls\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:38:16.094818 master-0 kubenswrapper[8731]: I1205 12:38:16.094477 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb42t\" (UniqueName: \"kubernetes.io/projected/95d8fb27-8b2b-4749-add3-9e9b16edb693-kube-api-access-fb42t\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:38:16.187293 master-0 kubenswrapper[8731]: I1205 12:38:16.187143 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wfk7f"] Dec 05 12:38:16.232309 master-0 kubenswrapper[8731]: I1205 12:38:16.191976 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:38:16.232309 master-0 kubenswrapper[8731]: I1205 12:38:16.192815 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:38:16.232309 master-0 kubenswrapper[8731]: I1205 12:38:16.194545 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wfk7f"] Dec 05 12:38:16.234504 master-0 kubenswrapper[8731]: I1205 12:38:16.234454 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-94n4t" Dec 05 12:38:16.275703 master-0 kubenswrapper[8731]: I1205 12:38:16.275638 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czcmr\" (UniqueName: \"kubernetes.io/projected/9c31f89c-b01b-4853-a901-bccc25441a46-kube-api-access-czcmr\") pod \"redhat-operators-wfk7f\" (UID: \"9c31f89c-b01b-4853-a901-bccc25441a46\") " pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:38:16.275984 master-0 kubenswrapper[8731]: I1205 12:38:16.275727 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c31f89c-b01b-4853-a901-bccc25441a46-catalog-content\") pod \"redhat-operators-wfk7f\" (UID: \"9c31f89c-b01b-4853-a901-bccc25441a46\") " pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:38:16.275984 master-0 kubenswrapper[8731]: I1205 12:38:16.275777 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c31f89c-b01b-4853-a901-bccc25441a46-utilities\") pod \"redhat-operators-wfk7f\" (UID: \"9c31f89c-b01b-4853-a901-bccc25441a46\") " pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:38:16.377430 master-0 kubenswrapper[8731]: I1205 12:38:16.377364 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czcmr\" (UniqueName: \"kubernetes.io/projected/9c31f89c-b01b-4853-a901-bccc25441a46-kube-api-access-czcmr\") pod \"redhat-operators-wfk7f\" (UID: \"9c31f89c-b01b-4853-a901-bccc25441a46\") " pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:38:16.377430 master-0 kubenswrapper[8731]: I1205 12:38:16.377436 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c31f89c-b01b-4853-a901-bccc25441a46-catalog-content\") pod \"redhat-operators-wfk7f\" (UID: \"9c31f89c-b01b-4853-a901-bccc25441a46\") " pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:38:16.377774 master-0 kubenswrapper[8731]: I1205 12:38:16.377475 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c31f89c-b01b-4853-a901-bccc25441a46-utilities\") pod \"redhat-operators-wfk7f\" (UID: \"9c31f89c-b01b-4853-a901-bccc25441a46\") " pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:38:16.378078 master-0 kubenswrapper[8731]: I1205 12:38:16.378041 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c31f89c-b01b-4853-a901-bccc25441a46-catalog-content\") pod \"redhat-operators-wfk7f\" (UID: \"9c31f89c-b01b-4853-a901-bccc25441a46\") " pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:38:16.378141 master-0 kubenswrapper[8731]: I1205 12:38:16.378121 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9c31f89c-b01b-4853-a901-bccc25441a46-utilities\") pod \"redhat-operators-wfk7f\" (UID: \"9c31f89c-b01b-4853-a901-bccc25441a46\") " pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:38:16.899966 master-0 kubenswrapper[8731]: I1205 12:38:16.899902 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czcmr\" (UniqueName: \"kubernetes.io/projected/9c31f89c-b01b-4853-a901-bccc25441a46-kube-api-access-czcmr\") pod \"redhat-operators-wfk7f\" (UID: \"9c31f89c-b01b-4853-a901-bccc25441a46\") " pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:38:17.170111 master-0 kubenswrapper[8731]: I1205 12:38:17.169795 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:38:18.935092 master-0 kubenswrapper[8731]: I1205 12:38:18.935026 8731 scope.go:117] "RemoveContainer" containerID="142cac7db86d510a3cb1fe121b732aea43f370fa9eb0fe98a9655b028e584160" Dec 05 12:38:32.326073 master-0 kubenswrapper[8731]: I1205 12:38:32.325991 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn"] Dec 05 12:38:34.180984 master-0 kubenswrapper[8731]: W1205 12:38:34.180942 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95d8fb27_8b2b_4749_add3_9e9b16edb693.slice/crio-6beaecf0540643cd8682361578d468ced3e3fd0c3495c281547ab1933154b6de WatchSource:0}: Error finding container 6beaecf0540643cd8682361578d468ced3e3fd0c3495c281547ab1933154b6de: Status 404 returned error can't find the container with id 6beaecf0540643cd8682361578d468ced3e3fd0c3495c281547ab1933154b6de Dec 05 12:38:34.466100 master-0 kubenswrapper[8731]: I1205 12:38:34.466019 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-45nwc" event={"ID":"95d8fb27-8b2b-4749-add3-9e9b16edb693","Type":"ContainerStarted","Data":"6beaecf0540643cd8682361578d468ced3e3fd0c3495c281547ab1933154b6de"} Dec 05 12:38:34.471225 master-0 kubenswrapper[8731]: I1205 12:38:34.471198 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/2.log" Dec 05 12:38:34.471500 master-0 kubenswrapper[8731]: I1205 12:38:34.471475 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" event={"ID":"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8","Type":"ContainerStarted","Data":"9777a8bc6b6304e63d985ec731f3cc644371b2e8a1b12c0286fe36fe4b312701"} Dec 05 12:38:34.480776 master-0 kubenswrapper[8731]: I1205 12:38:34.480694 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" event={"ID":"531b8927-92db-4e9d-9a0a-12ff948cdaad","Type":"ContainerStarted","Data":"5227d615ebfc1e16e53996d356380f47ad9e8fd55349d0658112ccb54f8ef1bb"} Dec 05 12:38:34.498816 master-0 kubenswrapper[8731]: I1205 12:38:34.497754 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" event={"ID":"7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f","Type":"ContainerStarted","Data":"1b430712d22ac161924beaf5505ca8d2172d739daf63d6df7781a1aff1c1828b"} Dec 05 12:38:34.542442 master-0 kubenswrapper[8731]: I1205 
12:38:34.537165 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wfk7f"] Dec 05 12:38:34.542442 master-0 kubenswrapper[8731]: I1205 12:38:34.537727 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" podStartSLOduration=2.436761395 podStartE2EDuration="24.537703269s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="2025-12-05 12:38:11.959682441 +0000 UTC m=+390.263666608" lastFinishedPulling="2025-12-05 12:38:34.060624315 +0000 UTC m=+412.364608482" observedRunningTime="2025-12-05 12:38:34.535719035 +0000 UTC m=+412.839703212" watchObservedRunningTime="2025-12-05 12:38:34.537703269 +0000 UTC m=+412.841687436" Dec 05 12:38:34.642102 master-0 kubenswrapper[8731]: I1205 12:38:34.642038 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4p8p6"] Dec 05 12:38:34.645875 master-0 kubenswrapper[8731]: W1205 12:38:34.645805 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8defe125_1529_4091_adff_e9d17a2b298f.slice/crio-010fcb3fd705e5d750eedd1adb06872aa524c08fbc85d2a921261129ee9bc96b WatchSource:0}: Error finding container 010fcb3fd705e5d750eedd1adb06872aa524c08fbc85d2a921261129ee9bc96b: Status 404 returned error can't find the container with id 010fcb3fd705e5d750eedd1adb06872aa524c08fbc85d2a921261129ee9bc96b Dec 05 12:38:34.658943 master-0 kubenswrapper[8731]: I1205 12:38:34.658882 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dmnvq"] Dec 05 12:38:34.661941 master-0 kubenswrapper[8731]: W1205 12:38:34.661892 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c31f89c_b01b_4853_a901_bccc25441a46.slice/crio-46252d0271f63a839c2cf8d137d190a08ca8c85ab8a7cd49fe478dd080504839 WatchSource:0}: Error finding container 46252d0271f63a839c2cf8d137d190a08ca8c85ab8a7cd49fe478dd080504839: Status 404 returned error can't find the container with id 46252d0271f63a839c2cf8d137d190a08ca8c85ab8a7cd49fe478dd080504839 Dec 05 12:38:34.662548 master-0 kubenswrapper[8731]: I1205 12:38:34.661997 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2pp25"] Dec 05 12:38:34.695730 master-0 kubenswrapper[8731]: W1205 12:38:34.694114 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebfbe878_1796_4a20_b3f0_76165038252e.slice/crio-fb396b2885c697fc62cb75681d56dacee81e32f235fe9f427b2f065f721f39f2 WatchSource:0}: Error finding container fb396b2885c697fc62cb75681d56dacee81e32f235fe9f427b2f065f721f39f2: Status 404 returned error can't find the container with id fb396b2885c697fc62cb75681d56dacee81e32f235fe9f427b2f065f721f39f2 Dec 05 12:38:34.837115 master-0 kubenswrapper[8731]: I1205 12:38:34.837049 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c"] Dec 05 12:38:34.889278 master-0 kubenswrapper[8731]: W1205 12:38:34.886145 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb13885ef_d2b5_4591_825d_446cf8729bc1.slice/crio-0f8a1e4d8de6a06f67857b43e08d70d6ce0e19926ff25b49cbea007cf56e4e61 WatchSource:0}: Error 
finding container 0f8a1e4d8de6a06f67857b43e08d70d6ce0e19926ff25b49cbea007cf56e4e61: Status 404 returned error can't find the container with id 0f8a1e4d8de6a06f67857b43e08d70d6ce0e19926ff25b49cbea007cf56e4e61 Dec 05 12:38:35.507599 master-0 kubenswrapper[8731]: I1205 12:38:35.507523 8731 generic.go:334] "Generic (PLEG): container finished" podID="8defe125-1529-4091-adff-e9d17a2b298f" containerID="539fe177f9c1deb8d425356a84818b5c05d811c1ab77b966156d70120d25eef1" exitCode=0 Dec 05 12:38:35.517936 master-0 kubenswrapper[8731]: I1205 12:38:35.507624 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4p8p6" event={"ID":"8defe125-1529-4091-adff-e9d17a2b298f","Type":"ContainerDied","Data":"539fe177f9c1deb8d425356a84818b5c05d811c1ab77b966156d70120d25eef1"} Dec 05 12:38:35.517936 master-0 kubenswrapper[8731]: I1205 12:38:35.507662 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4p8p6" event={"ID":"8defe125-1529-4091-adff-e9d17a2b298f","Type":"ContainerStarted","Data":"010fcb3fd705e5d750eedd1adb06872aa524c08fbc85d2a921261129ee9bc96b"} Dec 05 12:38:35.517936 master-0 kubenswrapper[8731]: I1205 12:38:35.509918 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-45nwc" event={"ID":"95d8fb27-8b2b-4749-add3-9e9b16edb693","Type":"ContainerStarted","Data":"03f0a0b0d216acb77dbcdd2122fb900482613fd273ff40215cc362d7e8dacc9f"} Dec 05 12:38:35.517936 master-0 kubenswrapper[8731]: I1205 12:38:35.509950 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-45nwc" event={"ID":"95d8fb27-8b2b-4749-add3-9e9b16edb693","Type":"ContainerStarted","Data":"8419f43f14005852c093325fa596baaf624f2da2d38299ead3523e1bbf468c70"} Dec 05 12:38:35.517936 master-0 kubenswrapper[8731]: I1205 12:38:35.512933 8731 generic.go:334] "Generic (PLEG): container finished" podID="b74e0607-6ed0-4119-8870-895b7d336830" containerID="05c868179fe699a72c6f244f8706f4870b83c4369ed24818820567f21e6d96f4" exitCode=0 Dec 05 12:38:35.517936 master-0 kubenswrapper[8731]: I1205 12:38:35.513011 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2pp25" event={"ID":"b74e0607-6ed0-4119-8870-895b7d336830","Type":"ContainerDied","Data":"05c868179fe699a72c6f244f8706f4870b83c4369ed24818820567f21e6d96f4"} Dec 05 12:38:35.517936 master-0 kubenswrapper[8731]: I1205 12:38:35.513054 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2pp25" event={"ID":"b74e0607-6ed0-4119-8870-895b7d336830","Type":"ContainerStarted","Data":"ce6d6f50d1ea16153d0bcd0e4641d90ef903c01636f33ef60f26b9dcbbaecad8"} Dec 05 12:38:35.517936 master-0 kubenswrapper[8731]: I1205 12:38:35.516433 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" event={"ID":"b13885ef-d2b5-4591-825d-446cf8729bc1","Type":"ContainerStarted","Data":"0f8a1e4d8de6a06f67857b43e08d70d6ce0e19926ff25b49cbea007cf56e4e61"} Dec 05 12:38:35.521379 master-0 kubenswrapper[8731]: I1205 12:38:35.521326 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" event={"ID":"cb32aaee-e2bd-4023-a95e-48786016725b","Type":"ContainerStarted","Data":"edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad"} Dec 05 12:38:35.521690 master-0 kubenswrapper[8731]: I1205 
12:38:35.521440 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" podUID="cb32aaee-e2bd-4023-a95e-48786016725b" containerName="kube-rbac-proxy" containerID="cri-o://a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd" gracePeriod=30 Dec 05 12:38:35.521690 master-0 kubenswrapper[8731]: I1205 12:38:35.521556 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" podUID="cb32aaee-e2bd-4023-a95e-48786016725b" containerName="machine-approver-controller" containerID="cri-o://edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad" gracePeriod=30 Dec 05 12:38:35.523957 master-0 kubenswrapper[8731]: I1205 12:38:35.523907 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" event={"ID":"1ee7a76b-cf1d-4513-b314-5aa314da818d","Type":"ContainerStarted","Data":"be906a53f820b21555f2880c815b5a7120f14a015e27df21706cfb62d2b36ef4"} Dec 05 12:38:35.531643 master-0 kubenswrapper[8731]: I1205 12:38:35.531575 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rzl84" event={"ID":"ce9e2a6b-8ce7-477c-8bc7-24033243eabe","Type":"ContainerStarted","Data":"679937dc83b97301577d3c65750d4ebf2b527dc3eb9e1329443173e8480258f9"} Dec 05 12:38:35.535785 master-0 kubenswrapper[8731]: I1205 12:38:35.535622 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" event={"ID":"e5dfcb1e-1231-4f07-8c21-748965718729","Type":"ContainerStarted","Data":"2b11a7092987ce9dc3415de6986fd3fb9e8cd98ab5789b4c5b5b61519d70650e"} Dec 05 12:38:35.538793 master-0 kubenswrapper[8731]: I1205 12:38:35.538704 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" event={"ID":"a280c582-685e-47ac-bf6b-248aa0c129a9","Type":"ContainerStarted","Data":"4e53d72cb8b1cdc5f2650e124f1a5eb3f2376bad125be7582d7eaee220557d0e"} Dec 05 12:38:35.538793 master-0 kubenswrapper[8731]: I1205 12:38:35.538765 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" event={"ID":"a280c582-685e-47ac-bf6b-248aa0c129a9","Type":"ContainerStarted","Data":"502462f2915d6fff82c1a557ec2a9e24c7fbeef3a6daff0dad2cf5862df79899"} Dec 05 12:38:35.541343 master-0 kubenswrapper[8731]: I1205 12:38:35.540941 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-55965856b6-q9qdg" event={"ID":"a14df948-1ec4-4785-ad33-28d1e7063959","Type":"ContainerStarted","Data":"e49b3ffc20b79d61c5da13ba14b717c8ba7cb68b1431994e06693ed50d8cbba7"} Dec 05 12:38:35.543261 master-0 kubenswrapper[8731]: I1205 12:38:35.543156 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" event={"ID":"159e5ddd-ce04-491a-996f-7c7b4bcec546","Type":"ContainerStarted","Data":"39bbeea33e0f358fae8aa3014875eaeb8abee7f4103b0265a8bb92799da69dcd"} Dec 05 12:38:35.548043 master-0 kubenswrapper[8731]: I1205 12:38:35.547986 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" 
event={"ID":"3d96c85a-fc88-46af-83d5-6c71ec6e2c23","Type":"ContainerStarted","Data":"daa09ad85f2f2082378a9c295067a8cfb84e2945b5becb78e60f9a9831fd768e"} Dec 05 12:38:35.551023 master-0 kubenswrapper[8731]: I1205 12:38:35.550496 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" event={"ID":"7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f","Type":"ContainerStarted","Data":"6239eae44fcba24c87ffe8091386b03d73857f84777d4bb210653bc8e77e499d"} Dec 05 12:38:35.557775 master-0 kubenswrapper[8731]: I1205 12:38:35.557700 8731 generic.go:334] "Generic (PLEG): container finished" podID="9c31f89c-b01b-4853-a901-bccc25441a46" containerID="ad4e3fece245cbae80f7af3bfbb0484b4d1681aae90d16b1b170f5d8af892edc" exitCode=0 Dec 05 12:38:35.557996 master-0 kubenswrapper[8731]: I1205 12:38:35.557940 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wfk7f" event={"ID":"9c31f89c-b01b-4853-a901-bccc25441a46","Type":"ContainerDied","Data":"ad4e3fece245cbae80f7af3bfbb0484b4d1681aae90d16b1b170f5d8af892edc"} Dec 05 12:38:35.558044 master-0 kubenswrapper[8731]: I1205 12:38:35.558010 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wfk7f" event={"ID":"9c31f89c-b01b-4853-a901-bccc25441a46","Type":"ContainerStarted","Data":"46252d0271f63a839c2cf8d137d190a08ca8c85ab8a7cd49fe478dd080504839"} Dec 05 12:38:35.564224 master-0 kubenswrapper[8731]: I1205 12:38:35.564123 8731 generic.go:334] "Generic (PLEG): container finished" podID="a45f340c-0eca-4460-8961-4ca360467eeb" containerID="06eca27e0fe884f90bd62d903b17dde7161c7cd5f8bd04b4c9959d40b8706ecb" exitCode=0 Dec 05 12:38:35.564532 master-0 kubenswrapper[8731]: I1205 12:38:35.564488 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" event={"ID":"a45f340c-0eca-4460-8961-4ca360467eeb","Type":"ContainerDied","Data":"06eca27e0fe884f90bd62d903b17dde7161c7cd5f8bd04b4c9959d40b8706ecb"} Dec 05 12:38:35.566783 master-0 kubenswrapper[8731]: I1205 12:38:35.566648 8731 generic.go:334] "Generic (PLEG): container finished" podID="ebfbe878-1796-4a20-b3f0-76165038252e" containerID="1edf9b703ee13d520466151dc6a14b9861ec98cd381dcaaddc281b34b9755005" exitCode=0 Dec 05 12:38:35.566783 master-0 kubenswrapper[8731]: I1205 12:38:35.566752 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmnvq" event={"ID":"ebfbe878-1796-4a20-b3f0-76165038252e","Type":"ContainerDied","Data":"1edf9b703ee13d520466151dc6a14b9861ec98cd381dcaaddc281b34b9755005"} Dec 05 12:38:35.566783 master-0 kubenswrapper[8731]: I1205 12:38:35.566787 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmnvq" event={"ID":"ebfbe878-1796-4a20-b3f0-76165038252e","Type":"ContainerStarted","Data":"fb396b2885c697fc62cb75681d56dacee81e32f235fe9f427b2f065f721f39f2"} Dec 05 12:38:35.569276 master-0 kubenswrapper[8731]: I1205 12:38:35.569168 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" event={"ID":"665c4362-e2e5-4f96-92c0-1746c63c7422","Type":"ContainerStarted","Data":"14e043dc9f8b3470df421fe84c1bd6ed6c94ede3d95b8d74893ae012e041f04e"} Dec 05 12:38:35.606196 master-0 kubenswrapper[8731]: I1205 12:38:35.606086 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" podStartSLOduration=3.3455808559999998 podStartE2EDuration="25.606062085s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="2025-12-05 12:38:11.832753696 +0000 UTC m=+390.136737863" lastFinishedPulling="2025-12-05 12:38:34.093234915 +0000 UTC m=+412.397219092" observedRunningTime="2025-12-05 12:38:35.605992444 +0000 UTC m=+413.909976621" watchObservedRunningTime="2025-12-05 12:38:35.606062085 +0000 UTC m=+413.910046262" Dec 05 12:38:35.609412 master-0 kubenswrapper[8731]: I1205 12:38:35.609336 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-45nwc" podStartSLOduration=20.609318235 podStartE2EDuration="20.609318235s" podCreationTimestamp="2025-12-05 12:38:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:38:35.574372821 +0000 UTC m=+413.878357008" watchObservedRunningTime="2025-12-05 12:38:35.609318235 +0000 UTC m=+413.913302402" Dec 05 12:38:35.628218 master-0 kubenswrapper[8731]: I1205 12:38:35.628123 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" podStartSLOduration=3.582250988 podStartE2EDuration="25.628093487s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="2025-12-05 12:38:12.181764074 +0000 UTC m=+390.485748241" lastFinishedPulling="2025-12-05 12:38:34.227606573 +0000 UTC m=+412.531590740" observedRunningTime="2025-12-05 12:38:35.620327546 +0000 UTC m=+413.924311703" watchObservedRunningTime="2025-12-05 12:38:35.628093487 +0000 UTC m=+413.932077654" Dec 05 12:38:35.654567 master-0 kubenswrapper[8731]: I1205 12:38:35.654105 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" podStartSLOduration=3.539122982 podStartE2EDuration="25.654080897s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="2025-12-05 12:38:12.064270677 +0000 UTC m=+390.368254844" lastFinishedPulling="2025-12-05 12:38:34.179228592 +0000 UTC m=+412.483212759" observedRunningTime="2025-12-05 12:38:35.651419784 +0000 UTC m=+413.955403951" watchObservedRunningTime="2025-12-05 12:38:35.654080897 +0000 UTC m=+413.958065054" Dec 05 12:38:35.718575 master-0 kubenswrapper[8731]: I1205 12:38:35.717512 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" podStartSLOduration=3.525730156 podStartE2EDuration="25.717484328s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="2025-12-05 12:38:11.934099143 +0000 UTC m=+390.238083310" lastFinishedPulling="2025-12-05 12:38:34.125853315 +0000 UTC m=+412.429837482" observedRunningTime="2025-12-05 12:38:35.713513339 +0000 UTC m=+414.017497506" watchObservedRunningTime="2025-12-05 12:38:35.717484328 +0000 UTC m=+414.021468495" Dec 05 12:38:35.724622 master-0 kubenswrapper[8731]: I1205 12:38:35.724202 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-55965856b6-q9qdg" podStartSLOduration=3.324416369 podStartE2EDuration="25.72416706s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="2025-12-05 12:38:11.723319458 +0000 UTC m=+390.027303625" 
lastFinishedPulling="2025-12-05 12:38:34.123070149 +0000 UTC m=+412.427054316" observedRunningTime="2025-12-05 12:38:35.681883316 +0000 UTC m=+413.985867483" watchObservedRunningTime="2025-12-05 12:38:35.72416706 +0000 UTC m=+414.028151227" Dec 05 12:38:35.750864 master-0 kubenswrapper[8731]: I1205 12:38:35.750755 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" Dec 05 12:38:35.760891 master-0 kubenswrapper[8731]: I1205 12:38:35.757473 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" podStartSLOduration=3.716480253 podStartE2EDuration="25.757438158s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="2025-12-05 12:38:12.000026833 +0000 UTC m=+390.304011000" lastFinishedPulling="2025-12-05 12:38:34.040984738 +0000 UTC m=+412.344968905" observedRunningTime="2025-12-05 12:38:35.747756725 +0000 UTC m=+414.051740892" watchObservedRunningTime="2025-12-05 12:38:35.757438158 +0000 UTC m=+414.061422315" Dec 05 12:38:35.831372 master-0 kubenswrapper[8731]: I1205 12:38:35.829799 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb32aaee-e2bd-4023-a95e-48786016725b-auth-proxy-config\") pod \"cb32aaee-e2bd-4023-a95e-48786016725b\" (UID: \"cb32aaee-e2bd-4023-a95e-48786016725b\") " Dec 05 12:38:35.831372 master-0 kubenswrapper[8731]: I1205 12:38:35.829857 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb32aaee-e2bd-4023-a95e-48786016725b-config\") pod \"cb32aaee-e2bd-4023-a95e-48786016725b\" (UID: \"cb32aaee-e2bd-4023-a95e-48786016725b\") " Dec 05 12:38:35.831372 master-0 kubenswrapper[8731]: I1205 12:38:35.829883 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx4gb\" (UniqueName: \"kubernetes.io/projected/cb32aaee-e2bd-4023-a95e-48786016725b-kube-api-access-vx4gb\") pod \"cb32aaee-e2bd-4023-a95e-48786016725b\" (UID: \"cb32aaee-e2bd-4023-a95e-48786016725b\") " Dec 05 12:38:35.831372 master-0 kubenswrapper[8731]: I1205 12:38:35.829929 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cb32aaee-e2bd-4023-a95e-48786016725b-machine-approver-tls\") pod \"cb32aaee-e2bd-4023-a95e-48786016725b\" (UID: \"cb32aaee-e2bd-4023-a95e-48786016725b\") " Dec 05 12:38:35.839023 master-0 kubenswrapper[8731]: I1205 12:38:35.837894 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb32aaee-e2bd-4023-a95e-48786016725b-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "cb32aaee-e2bd-4023-a95e-48786016725b" (UID: "cb32aaee-e2bd-4023-a95e-48786016725b"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:38:35.839023 master-0 kubenswrapper[8731]: I1205 12:38:35.838344 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb32aaee-e2bd-4023-a95e-48786016725b-config" (OuterVolumeSpecName: "config") pod "cb32aaee-e2bd-4023-a95e-48786016725b" (UID: "cb32aaee-e2bd-4023-a95e-48786016725b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:38:35.839023 master-0 kubenswrapper[8731]: I1205 12:38:35.838442 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb32aaee-e2bd-4023-a95e-48786016725b-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "cb32aaee-e2bd-4023-a95e-48786016725b" (UID: "cb32aaee-e2bd-4023-a95e-48786016725b"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:38:35.864306 master-0 kubenswrapper[8731]: I1205 12:38:35.856415 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb32aaee-e2bd-4023-a95e-48786016725b-kube-api-access-vx4gb" (OuterVolumeSpecName: "kube-api-access-vx4gb") pod "cb32aaee-e2bd-4023-a95e-48786016725b" (UID: "cb32aaee-e2bd-4023-a95e-48786016725b"). InnerVolumeSpecName "kube-api-access-vx4gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:38:35.869967 master-0 kubenswrapper[8731]: I1205 12:38:35.866651 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" podStartSLOduration=3.453289668 podStartE2EDuration="25.866614149s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="2025-12-05 12:38:11.647246832 +0000 UTC m=+389.951230999" lastFinishedPulling="2025-12-05 12:38:34.060571313 +0000 UTC m=+412.364555480" observedRunningTime="2025-12-05 12:38:35.851455876 +0000 UTC m=+414.155440043" watchObservedRunningTime="2025-12-05 12:38:35.866614149 +0000 UTC m=+414.170598316" Dec 05 12:38:35.875666 master-0 kubenswrapper[8731]: I1205 12:38:35.875548 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" podStartSLOduration=3.171630688 podStartE2EDuration="25.875516222s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="2025-12-05 12:38:11.441458493 +0000 UTC m=+389.745442660" lastFinishedPulling="2025-12-05 12:38:34.145344037 +0000 UTC m=+412.449328194" observedRunningTime="2025-12-05 12:38:35.872088218 +0000 UTC m=+414.176072385" watchObservedRunningTime="2025-12-05 12:38:35.875516222 +0000 UTC m=+414.179500389" Dec 05 12:38:35.936424 master-0 kubenswrapper[8731]: I1205 12:38:35.932000 8731 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cb32aaee-e2bd-4023-a95e-48786016725b-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:38:35.936424 master-0 kubenswrapper[8731]: I1205 12:38:35.932050 8731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb32aaee-e2bd-4023-a95e-48786016725b-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:38:35.936424 master-0 kubenswrapper[8731]: I1205 12:38:35.932063 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx4gb\" (UniqueName: \"kubernetes.io/projected/cb32aaee-e2bd-4023-a95e-48786016725b-kube-api-access-vx4gb\") on node \"master-0\" DevicePath \"\"" Dec 05 12:38:35.936424 master-0 kubenswrapper[8731]: I1205 12:38:35.932075 8731 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cb32aaee-e2bd-4023-a95e-48786016725b-machine-approver-tls\") on node \"master-0\" DevicePath \"\"" Dec 05 12:38:36.584207 master-0 kubenswrapper[8731]: I1205 
12:38:36.583773 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" event={"ID":"b13885ef-d2b5-4591-825d-446cf8729bc1","Type":"ContainerStarted","Data":"8bd8ff38f53cf4940c1efaa7c62de04a6ed00058d3624f9a76bc40b03dd26c9f"} Dec 05 12:38:36.584207 master-0 kubenswrapper[8731]: I1205 12:38:36.584171 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:36.603224 master-0 kubenswrapper[8731]: I1205 12:38:36.591030 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:38:36.603224 master-0 kubenswrapper[8731]: I1205 12:38:36.595104 8731 generic.go:334] "Generic (PLEG): container finished" podID="cb32aaee-e2bd-4023-a95e-48786016725b" containerID="edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad" exitCode=0 Dec 05 12:38:36.603224 master-0 kubenswrapper[8731]: I1205 12:38:36.595145 8731 generic.go:334] "Generic (PLEG): container finished" podID="cb32aaee-e2bd-4023-a95e-48786016725b" containerID="a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd" exitCode=0 Dec 05 12:38:36.603224 master-0 kubenswrapper[8731]: I1205 12:38:36.595291 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" event={"ID":"cb32aaee-e2bd-4023-a95e-48786016725b","Type":"ContainerDied","Data":"edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad"} Dec 05 12:38:36.603224 master-0 kubenswrapper[8731]: I1205 12:38:36.595330 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" event={"ID":"cb32aaee-e2bd-4023-a95e-48786016725b","Type":"ContainerDied","Data":"a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd"} Dec 05 12:38:36.603224 master-0 kubenswrapper[8731]: I1205 12:38:36.595329 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" Dec 05 12:38:36.603224 master-0 kubenswrapper[8731]: I1205 12:38:36.595363 8731 scope.go:117] "RemoveContainer" containerID="edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad" Dec 05 12:38:36.603224 master-0 kubenswrapper[8731]: I1205 12:38:36.595346 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn" event={"ID":"cb32aaee-e2bd-4023-a95e-48786016725b","Type":"ContainerDied","Data":"6e1d751af2ee301ad26fbf6333d9b7e721f5c4867d46226437328ecdf257aa8a"} Dec 05 12:38:36.603224 master-0 kubenswrapper[8731]: I1205 12:38:36.598227 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rzl84" event={"ID":"ce9e2a6b-8ce7-477c-8bc7-24033243eabe","Type":"ContainerStarted","Data":"193b5b7aa7464f9332f9efd8e29d1c5efa1e26b3892e37be477fd5522ff1eff9"} Dec 05 12:38:36.603224 master-0 kubenswrapper[8731]: I1205 12:38:36.598750 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-rzl84" Dec 05 12:38:36.603224 master-0 kubenswrapper[8731]: I1205 12:38:36.602886 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" event={"ID":"159e5ddd-ce04-491a-996f-7c7b4bcec546","Type":"ContainerStarted","Data":"a51d15e45e728f55601d210223c1170225b7261c610e39e13a78b2743bf8d55f"} Dec 05 12:38:36.603224 master-0 kubenswrapper[8731]: I1205 12:38:36.602907 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" event={"ID":"159e5ddd-ce04-491a-996f-7c7b4bcec546","Type":"ContainerStarted","Data":"5fe89aae56c62b5b32867c2f7a6508b308c6d1e0237eb5cf4d712be99d6e42d0"} Dec 05 12:38:36.611629 master-0 kubenswrapper[8731]: I1205 12:38:36.611594 8731 scope.go:117] "RemoveContainer" containerID="a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd" Dec 05 12:38:36.625423 master-0 kubenswrapper[8731]: I1205 12:38:36.625311 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" podStartSLOduration=23.625280151 podStartE2EDuration="23.625280151s" podCreationTimestamp="2025-12-05 12:38:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:38:36.618722133 +0000 UTC m=+414.922706300" watchObservedRunningTime="2025-12-05 12:38:36.625280151 +0000 UTC m=+414.929264318" Dec 05 12:38:36.650274 master-0 kubenswrapper[8731]: I1205 12:38:36.649138 8731 scope.go:117] "RemoveContainer" containerID="edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad" Dec 05 12:38:36.656545 master-0 kubenswrapper[8731]: E1205 12:38:36.651356 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad\": container with ID starting with edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad not found: ID does not exist" containerID="edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad" Dec 05 12:38:36.656545 master-0 kubenswrapper[8731]: I1205 12:38:36.651417 8731 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad"} err="failed to get container status \"edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad\": rpc error: code = NotFound desc = could not find container \"edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad\": container with ID starting with edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad not found: ID does not exist" Dec 05 12:38:36.656545 master-0 kubenswrapper[8731]: I1205 12:38:36.651439 8731 scope.go:117] "RemoveContainer" containerID="a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd" Dec 05 12:38:36.656545 master-0 kubenswrapper[8731]: E1205 12:38:36.652091 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd\": container with ID starting with a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd not found: ID does not exist" containerID="a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd" Dec 05 12:38:36.656545 master-0 kubenswrapper[8731]: I1205 12:38:36.652145 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd"} err="failed to get container status \"a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd\": rpc error: code = NotFound desc = could not find container \"a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd\": container with ID starting with a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd not found: ID does not exist" Dec 05 12:38:36.656545 master-0 kubenswrapper[8731]: I1205 12:38:36.652163 8731 scope.go:117] "RemoveContainer" containerID="edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad" Dec 05 12:38:36.656545 master-0 kubenswrapper[8731]: I1205 12:38:36.653148 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad"} err="failed to get container status \"edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad\": rpc error: code = NotFound desc = could not find container \"edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad\": container with ID starting with edb22e8af776cc155ad5c289f954e176b01523a67dd86e632340c4328a5750ad not found: ID does not exist" Dec 05 12:38:36.656545 master-0 kubenswrapper[8731]: I1205 12:38:36.653189 8731 scope.go:117] "RemoveContainer" containerID="a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd" Dec 05 12:38:36.656545 master-0 kubenswrapper[8731]: I1205 12:38:36.653974 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd"} err="failed to get container status \"a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd\": rpc error: code = NotFound desc = could not find container \"a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd\": container with ID starting with a849aa89ef620dc900efacf5509fbb3bde9cc97398090f707a0c5b222fcf07cd not found: ID does not exist" Dec 05 12:38:36.679081 master-0 kubenswrapper[8731]: I1205 12:38:36.678965 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" podStartSLOduration=3.704712082 podStartE2EDuration="26.678934446s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="2025-12-05 12:38:11.199482577 +0000 UTC m=+389.503466744" lastFinishedPulling="2025-12-05 12:38:34.173704941 +0000 UTC m=+412.477689108" observedRunningTime="2025-12-05 12:38:36.668805089 +0000 UTC m=+414.972789256" watchObservedRunningTime="2025-12-05 12:38:36.678934446 +0000 UTC m=+414.982918613" Dec 05 12:38:36.709642 master-0 kubenswrapper[8731]: I1205 12:38:36.709564 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn"] Dec 05 12:38:36.720943 master-0 kubenswrapper[8731]: I1205 12:38:36.720850 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-f797d8546-k5pmn"] Dec 05 12:38:36.732000 master-0 kubenswrapper[8731]: I1205 12:38:36.731606 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rzl84" podStartSLOduration=3.974448214 podStartE2EDuration="26.731578723s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="2025-12-05 12:38:11.387824148 +0000 UTC m=+389.691808315" lastFinishedPulling="2025-12-05 12:38:34.144954667 +0000 UTC m=+412.448938824" observedRunningTime="2025-12-05 12:38:36.729703012 +0000 UTC m=+415.033687189" watchObservedRunningTime="2025-12-05 12:38:36.731578723 +0000 UTC m=+415.035562910" Dec 05 12:38:36.751977 master-0 kubenswrapper[8731]: I1205 12:38:36.751905 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd"] Dec 05 12:38:36.752296 master-0 kubenswrapper[8731]: E1205 12:38:36.752279 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb32aaee-e2bd-4023-a95e-48786016725b" containerName="machine-approver-controller" Dec 05 12:38:36.752381 master-0 kubenswrapper[8731]: I1205 12:38:36.752314 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb32aaee-e2bd-4023-a95e-48786016725b" containerName="machine-approver-controller" Dec 05 12:38:36.752381 master-0 kubenswrapper[8731]: E1205 12:38:36.752348 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb32aaee-e2bd-4023-a95e-48786016725b" containerName="kube-rbac-proxy" Dec 05 12:38:36.752381 master-0 kubenswrapper[8731]: I1205 12:38:36.752354 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb32aaee-e2bd-4023-a95e-48786016725b" containerName="kube-rbac-proxy" Dec 05 12:38:36.752612 master-0 kubenswrapper[8731]: I1205 12:38:36.752569 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb32aaee-e2bd-4023-a95e-48786016725b" containerName="machine-approver-controller" Dec 05 12:38:36.752612 master-0 kubenswrapper[8731]: I1205 12:38:36.752602 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb32aaee-e2bd-4023-a95e-48786016725b" containerName="kube-rbac-proxy" Dec 05 12:38:36.754069 master-0 kubenswrapper[8731]: I1205 12:38:36.753496 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:38:36.756292 master-0 kubenswrapper[8731]: I1205 12:38:36.756252 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 12:38:36.757261 master-0 kubenswrapper[8731]: I1205 12:38:36.757240 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 12:38:36.757356 master-0 kubenswrapper[8731]: I1205 12:38:36.757342 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 12:38:36.757434 master-0 kubenswrapper[8731]: I1205 12:38:36.757406 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 12:38:36.757535 master-0 kubenswrapper[8731]: I1205 12:38:36.757489 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 12:38:36.757824 master-0 kubenswrapper[8731]: I1205 12:38:36.757803 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-fb2xd" Dec 05 12:38:36.851387 master-0 kubenswrapper[8731]: I1205 12:38:36.850810 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nbxt\" (UniqueName: \"kubernetes.io/projected/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-kube-api-access-2nbxt\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:38:36.851387 master-0 kubenswrapper[8731]: I1205 12:38:36.850916 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-machine-approver-tls\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:38:36.851387 master-0 kubenswrapper[8731]: I1205 12:38:36.850963 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-config\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:38:36.851387 master-0 kubenswrapper[8731]: I1205 12:38:36.850997 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-auth-proxy-config\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:38:36.952755 master-0 kubenswrapper[8731]: I1205 12:38:36.952672 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-machine-approver-tls\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " 
pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:38:36.956455 master-0 kubenswrapper[8731]: I1205 12:38:36.954319 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-config\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:38:36.956455 master-0 kubenswrapper[8731]: I1205 12:38:36.954394 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-auth-proxy-config\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:38:36.956455 master-0 kubenswrapper[8731]: I1205 12:38:36.954560 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nbxt\" (UniqueName: \"kubernetes.io/projected/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-kube-api-access-2nbxt\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:38:36.956455 master-0 kubenswrapper[8731]: I1205 12:38:36.955301 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-auth-proxy-config\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:38:36.963101 master-0 kubenswrapper[8731]: I1205 12:38:36.958348 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-config\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:38:37.000262 master-0 kubenswrapper[8731]: I1205 12:38:37.000127 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-machine-approver-tls\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:38:37.006129 master-0 kubenswrapper[8731]: I1205 12:38:37.004927 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nbxt\" (UniqueName: \"kubernetes.io/projected/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-kube-api-access-2nbxt\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:38:37.095672 master-0 kubenswrapper[8731]: I1205 12:38:37.093712 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:38:37.951444 master-0 kubenswrapper[8731]: I1205 12:38:37.951379 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb32aaee-e2bd-4023-a95e-48786016725b" path="/var/lib/kubelet/pods/cb32aaee-e2bd-4023-a95e-48786016725b/volumes" Dec 05 12:38:38.001890 master-0 kubenswrapper[8731]: W1205 12:38:38.001795 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb27bee9_3d33_4c4a_b38b_72f7cec77c7a.slice/crio-3fc55735af7e7e6d6e15c1fa34cd05fc0468a74822467cb4ea7df9c2efc6cd2f WatchSource:0}: Error finding container 3fc55735af7e7e6d6e15c1fa34cd05fc0468a74822467cb4ea7df9c2efc6cd2f: Status 404 returned error can't find the container with id 3fc55735af7e7e6d6e15c1fa34cd05fc0468a74822467cb4ea7df9c2efc6cd2f Dec 05 12:38:38.104615 master-0 kubenswrapper[8731]: I1205 12:38:38.104488 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6"] Dec 05 12:38:38.106049 master-0 kubenswrapper[8731]: I1205 12:38:38.106028 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:38:38.107250 master-0 kubenswrapper[8731]: I1205 12:38:38.107206 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6"] Dec 05 12:38:38.110984 master-0 kubenswrapper[8731]: I1205 12:38:38.110951 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-s6pqp" Dec 05 12:38:38.111200 master-0 kubenswrapper[8731]: I1205 12:38:38.111164 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 12:38:38.176312 master-0 kubenswrapper[8731]: I1205 12:38:38.176048 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1478a21e-b6ac-46fb-ad01-805ac71f0a79-mcc-auth-proxy-config\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:38:38.176312 master-0 kubenswrapper[8731]: I1205 12:38:38.176260 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1478a21e-b6ac-46fb-ad01-805ac71f0a79-proxy-tls\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:38:38.176620 master-0 kubenswrapper[8731]: I1205 12:38:38.176528 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz4q6\" (UniqueName: \"kubernetes.io/projected/1478a21e-b6ac-46fb-ad01-805ac71f0a79-kube-api-access-fz4q6\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:38:38.279514 master-0 kubenswrapper[8731]: I1205 12:38:38.277584 8731 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fz4q6\" (UniqueName: \"kubernetes.io/projected/1478a21e-b6ac-46fb-ad01-805ac71f0a79-kube-api-access-fz4q6\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:38:38.279514 master-0 kubenswrapper[8731]: I1205 12:38:38.277700 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1478a21e-b6ac-46fb-ad01-805ac71f0a79-mcc-auth-proxy-config\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:38:38.279514 master-0 kubenswrapper[8731]: I1205 12:38:38.277739 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1478a21e-b6ac-46fb-ad01-805ac71f0a79-proxy-tls\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:38:38.280820 master-0 kubenswrapper[8731]: I1205 12:38:38.280768 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1478a21e-b6ac-46fb-ad01-805ac71f0a79-mcc-auth-proxy-config\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:38:38.285339 master-0 kubenswrapper[8731]: I1205 12:38:38.285294 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1478a21e-b6ac-46fb-ad01-805ac71f0a79-proxy-tls\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:38:38.295822 master-0 kubenswrapper[8731]: I1205 12:38:38.295761 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz4q6\" (UniqueName: \"kubernetes.io/projected/1478a21e-b6ac-46fb-ad01-805ac71f0a79-kube-api-access-fz4q6\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:38:38.452677 master-0 kubenswrapper[8731]: I1205 12:38:38.452597 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:38:38.629751 master-0 kubenswrapper[8731]: I1205 12:38:38.629667 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" event={"ID":"a45f340c-0eca-4460-8961-4ca360467eeb","Type":"ContainerStarted","Data":"3a00979f1a40fa874a9b8220fac00b5191a3cf77eaa5880a179ac86b435ff29f"} Dec 05 12:38:38.630006 master-0 kubenswrapper[8731]: I1205 12:38:38.629781 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:38:38.633005 master-0 kubenswrapper[8731]: I1205 12:38:38.632923 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" event={"ID":"db27bee9-3d33-4c4a-b38b-72f7cec77c7a","Type":"ContainerStarted","Data":"0ddc2093e9ac31dcb8fccf79117cc3631c474d52d69ff0ebdea12838c0ae6a82"} Dec 05 12:38:38.633005 master-0 kubenswrapper[8731]: I1205 12:38:38.632963 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" event={"ID":"db27bee9-3d33-4c4a-b38b-72f7cec77c7a","Type":"ContainerStarted","Data":"3fc55735af7e7e6d6e15c1fa34cd05fc0468a74822467cb4ea7df9c2efc6cd2f"} Dec 05 12:38:40.651064 master-0 kubenswrapper[8731]: I1205 12:38:40.651001 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" event={"ID":"db27bee9-3d33-4c4a-b38b-72f7cec77c7a","Type":"ContainerStarted","Data":"e4b5bdd189732f9903e53555a7a61c0d10d37cd90596a4c760274c5cce948d5d"} Dec 05 12:38:41.388333 master-0 kubenswrapper[8731]: I1205 12:38:41.387930 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:38:41.556081 master-0 kubenswrapper[8731]: I1205 12:38:41.555895 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" podStartSLOduration=5.38150487 podStartE2EDuration="31.55586358s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="2025-12-05 12:38:11.849576135 +0000 UTC m=+390.153560292" lastFinishedPulling="2025-12-05 12:38:38.023934835 +0000 UTC m=+416.327919002" observedRunningTime="2025-12-05 12:38:41.555732627 +0000 UTC m=+419.859716814" watchObservedRunningTime="2025-12-05 12:38:41.55586358 +0000 UTC m=+419.859847757" Dec 05 12:38:41.836075 master-0 kubenswrapper[8731]: I1205 12:38:41.835951 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6"] Dec 05 12:38:41.880684 master-0 kubenswrapper[8731]: I1205 12:38:41.880608 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t"] Dec 05 12:38:41.881487 master-0 kubenswrapper[8731]: I1205 12:38:41.881268 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" podUID="159e5ddd-ce04-491a-996f-7c7b4bcec546" containerName="kube-rbac-proxy" containerID="cri-o://a51d15e45e728f55601d210223c1170225b7261c610e39e13a78b2743bf8d55f" gracePeriod=30 Dec 05 
12:38:41.881487 master-0 kubenswrapper[8731]: I1205 12:38:41.881328 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" podUID="159e5ddd-ce04-491a-996f-7c7b4bcec546" containerName="config-sync-controllers" containerID="cri-o://5fe89aae56c62b5b32867c2f7a6508b308c6d1e0237eb5cf4d712be99d6e42d0" gracePeriod=30 Dec 05 12:38:41.881786 master-0 kubenswrapper[8731]: I1205 12:38:41.881144 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" podUID="159e5ddd-ce04-491a-996f-7c7b4bcec546" containerName="cluster-cloud-controller-manager" containerID="cri-o://39bbeea33e0f358fae8aa3014875eaeb8abee7f4103b0265a8bb92799da69dcd" gracePeriod=30 Dec 05 12:38:41.908841 master-0 kubenswrapper[8731]: I1205 12:38:41.908745 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" podStartSLOduration=5.908711353 podStartE2EDuration="5.908711353s" podCreationTimestamp="2025-12-05 12:38:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:38:41.895546503 +0000 UTC m=+420.199530680" watchObservedRunningTime="2025-12-05 12:38:41.908711353 +0000 UTC m=+420.212695520" Dec 05 12:38:42.665124 master-0 kubenswrapper[8731]: I1205 12:38:42.664986 8731 generic.go:334] "Generic (PLEG): container finished" podID="159e5ddd-ce04-491a-996f-7c7b4bcec546" containerID="a51d15e45e728f55601d210223c1170225b7261c610e39e13a78b2743bf8d55f" exitCode=0 Dec 05 12:38:42.665124 master-0 kubenswrapper[8731]: I1205 12:38:42.665023 8731 generic.go:334] "Generic (PLEG): container finished" podID="159e5ddd-ce04-491a-996f-7c7b4bcec546" containerID="5fe89aae56c62b5b32867c2f7a6508b308c6d1e0237eb5cf4d712be99d6e42d0" exitCode=0 Dec 05 12:38:42.665124 master-0 kubenswrapper[8731]: I1205 12:38:42.665031 8731 generic.go:334] "Generic (PLEG): container finished" podID="159e5ddd-ce04-491a-996f-7c7b4bcec546" containerID="39bbeea33e0f358fae8aa3014875eaeb8abee7f4103b0265a8bb92799da69dcd" exitCode=0 Dec 05 12:38:42.665124 master-0 kubenswrapper[8731]: I1205 12:38:42.665053 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" event={"ID":"159e5ddd-ce04-491a-996f-7c7b4bcec546","Type":"ContainerDied","Data":"a51d15e45e728f55601d210223c1170225b7261c610e39e13a78b2743bf8d55f"} Dec 05 12:38:42.665124 master-0 kubenswrapper[8731]: I1205 12:38:42.665083 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" event={"ID":"159e5ddd-ce04-491a-996f-7c7b4bcec546","Type":"ContainerDied","Data":"5fe89aae56c62b5b32867c2f7a6508b308c6d1e0237eb5cf4d712be99d6e42d0"} Dec 05 12:38:42.665124 master-0 kubenswrapper[8731]: I1205 12:38:42.665092 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" event={"ID":"159e5ddd-ce04-491a-996f-7c7b4bcec546","Type":"ContainerDied","Data":"39bbeea33e0f358fae8aa3014875eaeb8abee7f4103b0265a8bb92799da69dcd"} Dec 05 12:38:45.880570 master-0 kubenswrapper[8731]: W1205 12:38:45.880511 8731 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1478a21e_b6ac_46fb_ad01_805ac71f0a79.slice/crio-6a7ef281a34ccfa6602eba73eaa73316754fbb0bb6c1935a7c44c597fce5d077 WatchSource:0}: Error finding container 6a7ef281a34ccfa6602eba73eaa73316754fbb0bb6c1935a7c44c597fce5d077: Status 404 returned error can't find the container with id 6a7ef281a34ccfa6602eba73eaa73316754fbb0bb6c1935a7c44c597fce5d077 Dec 05 12:38:46.688730 master-0 kubenswrapper[8731]: I1205 12:38:46.688661 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" event={"ID":"1478a21e-b6ac-46fb-ad01-805ac71f0a79","Type":"ContainerStarted","Data":"6a7ef281a34ccfa6602eba73eaa73316754fbb0bb6c1935a7c44c597fce5d077"} Dec 05 12:38:46.821771 master-0 kubenswrapper[8731]: I1205 12:38:46.821555 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rzl84" Dec 05 12:38:47.032236 master-0 kubenswrapper[8731]: I1205 12:38:47.032113 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-xxmfp_ba095394-1873-4793-969d-3be979fa0771/authentication-operator/2.log" Dec 05 12:38:47.054472 master-0 kubenswrapper[8731]: I1205 12:38:47.054420 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-xxmfp_ba095394-1873-4793-969d-3be979fa0771/authentication-operator/3.log" Dec 05 12:38:47.069401 master-0 kubenswrapper[8731]: I1205 12:38:47.069349 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-5bdfbf6949-2bhqv_d72b2b71-27b2-4aff-bf69-7054a9556318/fix-audit-permissions/0.log" Dec 05 12:38:47.232677 master-0 kubenswrapper[8731]: I1205 12:38:47.232616 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-5bdfbf6949-2bhqv_d72b2b71-27b2-4aff-bf69-7054a9556318/oauth-apiserver/0.log" Dec 05 12:38:47.433427 master-0 kubenswrapper[8731]: I1205 12:38:47.433145 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-7c56cf9b74-z9g7c_5f0c6889-0739-48a3-99cd-6db9d1f83242/dns-operator/0.log" Dec 05 12:38:47.630492 master-0 kubenswrapper[8731]: I1205 12:38:47.630371 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-7c56cf9b74-z9g7c_5f0c6889-0739-48a3-99cd-6db9d1f83242/kube-rbac-proxy/0.log" Dec 05 12:38:47.828591 master-0 kubenswrapper[8731]: I1205 12:38:47.828509 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rzl84_ce9e2a6b-8ce7-477c-8bc7-24033243eabe/dns/0.log" Dec 05 12:38:48.027810 master-0 kubenswrapper[8731]: I1205 12:38:48.027745 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rzl84_ce9e2a6b-8ce7-477c-8bc7-24033243eabe/kube-rbac-proxy/0.log" Dec 05 12:38:48.228076 master-0 kubenswrapper[8731]: I1205 12:38:48.227935 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-f6j7m_bc18a83a-998e-458e-87f0-d5368da52e1b/dns-node-resolver/0.log" Dec 05 12:38:48.428544 master-0 kubenswrapper[8731]: I1205 12:38:48.428489 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5bf4d88c6f-dxd24_f119ffe4-16bd-49eb-916d-b18ba0d79b54/etcd-operator/2.log" Dec 05 
12:38:48.629483 master-0 kubenswrapper[8731]: I1205 12:38:48.629433 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5bf4d88c6f-dxd24_f119ffe4-16bd-49eb-916d-b18ba0d79b54/etcd-operator/3.log" Dec 05 12:38:48.705555 master-0 kubenswrapper[8731]: I1205 12:38:48.705474 8731 generic.go:334] "Generic (PLEG): container finished" podID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerID="bbc65050e19c8e05efbd98764627a92089e068c4fefa760a423cea0a25acab48" exitCode=0 Dec 05 12:38:48.706010 master-0 kubenswrapper[8731]: I1205 12:38:48.705590 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" event={"ID":"d72b2b71-27b2-4aff-bf69-7054a9556318","Type":"ContainerDied","Data":"bbc65050e19c8e05efbd98764627a92089e068c4fefa760a423cea0a25acab48"} Dec 05 12:38:48.828093 master-0 kubenswrapper[8731]: I1205 12:38:48.827883 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/setup/0.log" Dec 05 12:38:49.028813 master-0 kubenswrapper[8731]: I1205 12:38:49.028776 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-ensure-env-vars/0.log" Dec 05 12:38:49.228228 master-0 kubenswrapper[8731]: I1205 12:38:49.228161 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-resources-copy/0.log" Dec 05 12:38:49.428351 master-0 kubenswrapper[8731]: I1205 12:38:49.428303 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcdctl/0.log" Dec 05 12:38:49.639107 master-0 kubenswrapper[8731]: I1205 12:38:49.638936 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd/0.log" Dec 05 12:38:49.834151 master-0 kubenswrapper[8731]: I1205 12:38:49.834101 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-metrics/0.log" Dec 05 12:38:50.030497 master-0 kubenswrapper[8731]: I1205 12:38:50.030365 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-readyz/0.log" Dec 05 12:38:50.228692 master-0 kubenswrapper[8731]: I1205 12:38:50.228600 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-rev/0.log" Dec 05 12:38:50.433137 master-0 kubenswrapper[8731]: I1205 12:38:50.432909 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_96fa3513-5467-4b0f-a03d-9279d36317bd/installer/0.log" Dec 05 12:38:51.415154 master-0 kubenswrapper[8731]: I1205 12:38:51.415102 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-765d9ff747-rw57t_807d9093-aa67-4840-b5be-7f3abcc1beed/kube-apiserver-operator/2.log" Dec 05 12:38:52.780655 master-0 kubenswrapper[8731]: I1205 12:38:52.780557 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-765d9ff747-rw57t_807d9093-aa67-4840-b5be-7f3abcc1beed/kube-apiserver-operator/3.log" Dec 05 12:38:52.787047 master-0 kubenswrapper[8731]: I1205 12:38:52.786994 8731 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_d75143d9bc4a2dc15781dc51ccff632a/setup/0.log" Dec 05 12:38:52.798156 master-0 kubenswrapper[8731]: I1205 12:38:52.798091 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_d75143d9bc4a2dc15781dc51ccff632a/kube-apiserver/0.log" Dec 05 12:38:52.804798 master-0 kubenswrapper[8731]: I1205 12:38:52.804724 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_d75143d9bc4a2dc15781dc51ccff632a/kube-apiserver-insecure-readyz/0.log" Dec 05 12:38:52.813695 master-0 kubenswrapper[8731]: I1205 12:38:52.813621 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_d627fcf3-2a80-4739-add9-e21ad4efc6eb/installer/0.log" Dec 05 12:38:52.820122 master-0 kubenswrapper[8731]: I1205 12:38:52.820057 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_565d5ef6-b0e7-4f04-9460-61f1d3903d37/installer/0.log" Dec 05 12:38:52.826521 master-0 kubenswrapper[8731]: I1205 12:38:52.826467 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-848f645654-g6nj5_594aaded-5615-4bed-87ee-6173059a73be/kube-controller-manager-operator/2.log" Dec 05 12:38:52.835322 master-0 kubenswrapper[8731]: I1205 12:38:52.835233 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-848f645654-g6nj5_594aaded-5615-4bed-87ee-6173059a73be/kube-controller-manager-operator/3.log" Dec 05 12:38:52.846964 master-0 kubenswrapper[8731]: I1205 12:38:52.846860 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-controller-manager-master-0_8b47694fcc32464ab24d09c23d6efb57/kube-controller-manager/4.log" Dec 05 12:38:52.857602 master-0 kubenswrapper[8731]: I1205 12:38:52.857553 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-controller-manager-master-0_8b47694fcc32464ab24d09c23d6efb57/cluster-policy-controller/0.log" Dec 05 12:38:52.873204 master-0 kubenswrapper[8731]: I1205 12:38:52.873133 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-controller-manager-master-0_8b47694fcc32464ab24d09c23d6efb57/kube-controller-manager/5.log" Dec 05 12:38:52.885403 master-0 kubenswrapper[8731]: I1205 12:38:52.885348 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:53.016923 master-0 kubenswrapper[8731]: I1205 12:38:53.016817 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/159e5ddd-ce04-491a-996f-7c7b4bcec546-auth-proxy-config\") pod \"159e5ddd-ce04-491a-996f-7c7b4bcec546\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " Dec 05 12:38:53.016923 master-0 kubenswrapper[8731]: I1205 12:38:53.016924 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/159e5ddd-ce04-491a-996f-7c7b4bcec546-host-etc-kube\") pod \"159e5ddd-ce04-491a-996f-7c7b4bcec546\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " Dec 05 12:38:53.017357 master-0 kubenswrapper[8731]: I1205 12:38:53.016961 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27z6k\" (UniqueName: \"kubernetes.io/projected/159e5ddd-ce04-491a-996f-7c7b4bcec546-kube-api-access-27z6k\") pod \"159e5ddd-ce04-491a-996f-7c7b4bcec546\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " Dec 05 12:38:53.017357 master-0 kubenswrapper[8731]: I1205 12:38:53.016996 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/159e5ddd-ce04-491a-996f-7c7b4bcec546-cloud-controller-manager-operator-tls\") pod \"159e5ddd-ce04-491a-996f-7c7b4bcec546\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " Dec 05 12:38:53.017357 master-0 kubenswrapper[8731]: I1205 12:38:53.017025 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/159e5ddd-ce04-491a-996f-7c7b4bcec546-images\") pod \"159e5ddd-ce04-491a-996f-7c7b4bcec546\" (UID: \"159e5ddd-ce04-491a-996f-7c7b4bcec546\") " Dec 05 12:38:53.017743 master-0 kubenswrapper[8731]: I1205 12:38:53.017697 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/159e5ddd-ce04-491a-996f-7c7b4bcec546-images" (OuterVolumeSpecName: "images") pod "159e5ddd-ce04-491a-996f-7c7b4bcec546" (UID: "159e5ddd-ce04-491a-996f-7c7b4bcec546"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:38:53.017871 master-0 kubenswrapper[8731]: I1205 12:38:53.017668 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/159e5ddd-ce04-491a-996f-7c7b4bcec546-host-etc-kube" (OuterVolumeSpecName: "host-etc-kube") pod "159e5ddd-ce04-491a-996f-7c7b4bcec546" (UID: "159e5ddd-ce04-491a-996f-7c7b4bcec546"). InnerVolumeSpecName "host-etc-kube". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:38:53.018073 master-0 kubenswrapper[8731]: I1205 12:38:53.017999 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/159e5ddd-ce04-491a-996f-7c7b4bcec546-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "159e5ddd-ce04-491a-996f-7c7b4bcec546" (UID: "159e5ddd-ce04-491a-996f-7c7b4bcec546"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:38:53.022076 master-0 kubenswrapper[8731]: I1205 12:38:53.022018 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/159e5ddd-ce04-491a-996f-7c7b4bcec546-kube-api-access-27z6k" (OuterVolumeSpecName: "kube-api-access-27z6k") pod "159e5ddd-ce04-491a-996f-7c7b4bcec546" (UID: "159e5ddd-ce04-491a-996f-7c7b4bcec546"). InnerVolumeSpecName "kube-api-access-27z6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:38:53.023336 master-0 kubenswrapper[8731]: I1205 12:38:53.023265 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/159e5ddd-ce04-491a-996f-7c7b4bcec546-cloud-controller-manager-operator-tls" (OuterVolumeSpecName: "cloud-controller-manager-operator-tls") pod "159e5ddd-ce04-491a-996f-7c7b4bcec546" (UID: "159e5ddd-ce04-491a-996f-7c7b4bcec546"). InnerVolumeSpecName "cloud-controller-manager-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:38:53.038979 master-0 kubenswrapper[8731]: I1205 12:38:53.038925 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-controller-manager-master-0_8b47694fcc32464ab24d09c23d6efb57/cluster-policy-controller/1.log" Dec 05 12:38:53.118569 master-0 kubenswrapper[8731]: I1205 12:38:53.118535 8731 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/159e5ddd-ce04-491a-996f-7c7b4bcec546-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:38:53.119440 master-0 kubenswrapper[8731]: I1205 12:38:53.119421 8731 reconciler_common.go:293] "Volume detached for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/159e5ddd-ce04-491a-996f-7c7b4bcec546-host-etc-kube\") on node \"master-0\" DevicePath \"\"" Dec 05 12:38:53.119539 master-0 kubenswrapper[8731]: I1205 12:38:53.119525 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27z6k\" (UniqueName: \"kubernetes.io/projected/159e5ddd-ce04-491a-996f-7c7b4bcec546-kube-api-access-27z6k\") on node \"master-0\" DevicePath \"\"" Dec 05 12:38:53.119647 master-0 kubenswrapper[8731]: I1205 12:38:53.119632 8731 reconciler_common.go:293] "Volume detached for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/159e5ddd-ce04-491a-996f-7c7b4bcec546-cloud-controller-manager-operator-tls\") on node \"master-0\" DevicePath \"\"" Dec 05 12:38:53.119733 master-0 kubenswrapper[8731]: I1205 12:38:53.119719 8731 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/159e5ddd-ce04-491a-996f-7c7b4bcec546-images\") on node \"master-0\" DevicePath \"\"" Dec 05 12:38:53.244717 master-0 kubenswrapper[8731]: I1205 12:38:53.242720 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-scheduler-master-0_5e09e2af7200e6f9be469dbfd9bb1127/kube-scheduler/0.log" Dec 05 12:38:53.244991 master-0 kubenswrapper[8731]: I1205 12:38:53.244733 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5465c8b4db-dzlmb"] Dec 05 12:38:53.247457 master-0 kubenswrapper[8731]: E1205 12:38:53.245091 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159e5ddd-ce04-491a-996f-7c7b4bcec546" containerName="kube-rbac-proxy" Dec 05 12:38:53.247457 master-0 kubenswrapper[8731]: I1205 12:38:53.245166 8731 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="159e5ddd-ce04-491a-996f-7c7b4bcec546" containerName="kube-rbac-proxy" Dec 05 12:38:53.247457 master-0 kubenswrapper[8731]: E1205 12:38:53.245213 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159e5ddd-ce04-491a-996f-7c7b4bcec546" containerName="cluster-cloud-controller-manager" Dec 05 12:38:53.247457 master-0 kubenswrapper[8731]: I1205 12:38:53.245219 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="159e5ddd-ce04-491a-996f-7c7b4bcec546" containerName="cluster-cloud-controller-manager" Dec 05 12:38:53.247457 master-0 kubenswrapper[8731]: E1205 12:38:53.245234 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159e5ddd-ce04-491a-996f-7c7b4bcec546" containerName="config-sync-controllers" Dec 05 12:38:53.247457 master-0 kubenswrapper[8731]: I1205 12:38:53.245241 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="159e5ddd-ce04-491a-996f-7c7b4bcec546" containerName="config-sync-controllers" Dec 05 12:38:53.247457 master-0 kubenswrapper[8731]: I1205 12:38:53.245349 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="159e5ddd-ce04-491a-996f-7c7b4bcec546" containerName="kube-rbac-proxy" Dec 05 12:38:53.247457 master-0 kubenswrapper[8731]: I1205 12:38:53.245362 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="159e5ddd-ce04-491a-996f-7c7b4bcec546" containerName="cluster-cloud-controller-manager" Dec 05 12:38:53.247457 master-0 kubenswrapper[8731]: I1205 12:38:53.245370 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="159e5ddd-ce04-491a-996f-7c7b4bcec546" containerName="config-sync-controllers" Dec 05 12:38:53.247457 master-0 kubenswrapper[8731]: I1205 12:38:53.245904 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.249262 master-0 kubenswrapper[8731]: I1205 12:38:53.248572 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 12:38:53.249262 master-0 kubenswrapper[8731]: I1205 12:38:53.248979 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-hcp7n" Dec 05 12:38:53.249262 master-0 kubenswrapper[8731]: I1205 12:38:53.249159 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 12:38:53.254203 master-0 kubenswrapper[8731]: I1205 12:38:53.249453 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 12:38:53.254203 master-0 kubenswrapper[8731]: I1205 12:38:53.249618 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 12:38:53.254203 master-0 kubenswrapper[8731]: I1205 12:38:53.249764 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 12:38:53.254203 master-0 kubenswrapper[8731]: I1205 12:38:53.250010 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 12:38:53.254203 master-0 kubenswrapper[8731]: I1205 12:38:53.251623 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr"] Dec 05 12:38:53.254203 master-0 kubenswrapper[8731]: I1205 12:38:53.252682 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr" Dec 05 12:38:53.258237 master-0 kubenswrapper[8731]: I1205 12:38:53.258135 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-85d8db45d4-kkllg"] Dec 05 12:38:53.260117 master-0 kubenswrapper[8731]: I1205 12:38:53.259499 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-85d8db45d4-kkllg" Dec 05 12:38:53.260117 master-0 kubenswrapper[8731]: I1205 12:38:53.260062 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-jdkkl" Dec 05 12:38:53.264212 master-0 kubenswrapper[8731]: I1205 12:38:53.260306 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 05 12:38:53.273671 master-0 kubenswrapper[8731]: I1205 12:38:53.266400 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-85d8db45d4-kkllg"] Dec 05 12:38:53.308218 master-0 kubenswrapper[8731]: I1205 12:38:53.305894 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr"] Dec 05 12:38:53.323049 master-0 kubenswrapper[8731]: I1205 12:38:53.322983 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dwm5\" (UniqueName: \"kubernetes.io/projected/20a72c8b-0f12-446b-8a42-53d98864c8f8-kube-api-access-6dwm5\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.323049 master-0 kubenswrapper[8731]: I1205 12:38:53.323034 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/909ed395-8ad3-4350-95e3-b4b19c682f92-tls-certificates\") pod \"prometheus-operator-admission-webhook-7c85c4dffd-sbvlr\" (UID: \"909ed395-8ad3-4350-95e3-b4b19c682f92\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr" Dec 05 12:38:53.323384 master-0 kubenswrapper[8731]: I1205 12:38:53.323098 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-stats-auth\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.323384 master-0 kubenswrapper[8731]: I1205 12:38:53.323126 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20a72c8b-0f12-446b-8a42-53d98864c8f8-service-ca-bundle\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.323384 master-0 kubenswrapper[8731]: I1205 12:38:53.323160 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-metrics-certs\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " 
pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.323384 master-0 kubenswrapper[8731]: I1205 12:38:53.323198 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-default-certificate\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.424513 master-0 kubenswrapper[8731]: I1205 12:38:53.424423 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxwwh\" (UniqueName: \"kubernetes.io/projected/e2e2d968-9946-4711-aaf0-3e3a03bff415-kube-api-access-pxwwh\") pod \"network-check-source-85d8db45d4-kkllg\" (UID: \"e2e2d968-9946-4711-aaf0-3e3a03bff415\") " pod="openshift-network-diagnostics/network-check-source-85d8db45d4-kkllg" Dec 05 12:38:53.424790 master-0 kubenswrapper[8731]: I1205 12:38:53.424648 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-metrics-certs\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.424790 master-0 kubenswrapper[8731]: I1205 12:38:53.424770 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-default-certificate\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.424882 master-0 kubenswrapper[8731]: I1205 12:38:53.424819 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dwm5\" (UniqueName: \"kubernetes.io/projected/20a72c8b-0f12-446b-8a42-53d98864c8f8-kube-api-access-6dwm5\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.425458 master-0 kubenswrapper[8731]: I1205 12:38:53.425012 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/909ed395-8ad3-4350-95e3-b4b19c682f92-tls-certificates\") pod \"prometheus-operator-admission-webhook-7c85c4dffd-sbvlr\" (UID: \"909ed395-8ad3-4350-95e3-b4b19c682f92\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr" Dec 05 12:38:53.425645 master-0 kubenswrapper[8731]: I1205 12:38:53.425499 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-stats-auth\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.425645 master-0 kubenswrapper[8731]: I1205 12:38:53.425554 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20a72c8b-0f12-446b-8a42-53d98864c8f8-service-ca-bundle\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.428706 master-0 
kubenswrapper[8731]: I1205 12:38:53.426607 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20a72c8b-0f12-446b-8a42-53d98864c8f8-service-ca-bundle\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.429766 master-0 kubenswrapper[8731]: I1205 12:38:53.429254 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/909ed395-8ad3-4350-95e3-b4b19c682f92-tls-certificates\") pod \"prometheus-operator-admission-webhook-7c85c4dffd-sbvlr\" (UID: \"909ed395-8ad3-4350-95e3-b4b19c682f92\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr" Dec 05 12:38:53.429766 master-0 kubenswrapper[8731]: I1205 12:38:53.429434 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-default-certificate\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.429926 master-0 kubenswrapper[8731]: I1205 12:38:53.429779 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-metrics-certs\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.432218 master-0 kubenswrapper[8731]: I1205 12:38:53.432172 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-scheduler-master-0_5e09e2af7200e6f9be469dbfd9bb1127/kube-scheduler/1.log" Dec 05 12:38:53.434555 master-0 kubenswrapper[8731]: I1205 12:38:53.434531 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-stats-auth\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.453000 master-0 kubenswrapper[8731]: I1205 12:38:53.450121 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dwm5\" (UniqueName: \"kubernetes.io/projected/20a72c8b-0f12-446b-8a42-53d98864c8f8-kube-api-access-6dwm5\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.527106 master-0 kubenswrapper[8731]: I1205 12:38:53.527032 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxwwh\" (UniqueName: \"kubernetes.io/projected/e2e2d968-9946-4711-aaf0-3e3a03bff415-kube-api-access-pxwwh\") pod \"network-check-source-85d8db45d4-kkllg\" (UID: \"e2e2d968-9946-4711-aaf0-3e3a03bff415\") " pod="openshift-network-diagnostics/network-check-source-85d8db45d4-kkllg" Dec 05 12:38:53.545077 master-0 kubenswrapper[8731]: I1205 12:38:53.545021 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxwwh\" (UniqueName: \"kubernetes.io/projected/e2e2d968-9946-4711-aaf0-3e3a03bff415-kube-api-access-pxwwh\") pod \"network-check-source-85d8db45d4-kkllg\" (UID: \"e2e2d968-9946-4711-aaf0-3e3a03bff415\") " 
pod="openshift-network-diagnostics/network-check-source-85d8db45d4-kkllg" Dec 05 12:38:53.581200 master-0 kubenswrapper[8731]: I1205 12:38:53.581048 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:53.599268 master-0 kubenswrapper[8731]: W1205 12:38:53.599214 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20a72c8b_0f12_446b_8a42_53d98864c8f8.slice/crio-c13f876ed14f7005d250ab3203aedc5ac3d9bddbbff7570300b321a40f59bd5c WatchSource:0}: Error finding container c13f876ed14f7005d250ab3203aedc5ac3d9bddbbff7570300b321a40f59bd5c: Status 404 returned error can't find the container with id c13f876ed14f7005d250ab3203aedc5ac3d9bddbbff7570300b321a40f59bd5c Dec 05 12:38:53.624548 master-0 kubenswrapper[8731]: I1205 12:38:53.624490 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr" Dec 05 12:38:53.626835 master-0 kubenswrapper[8731]: I1205 12:38:53.626813 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-85d8db45d4-kkllg" Dec 05 12:38:53.637544 master-0 kubenswrapper[8731]: I1205 12:38:53.637481 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_076dafdf-a5d2-4e2d-9c38-6932910f7327/installer/0.log" Dec 05 12:38:53.747502 master-0 kubenswrapper[8731]: I1205 12:38:53.747419 8731 generic.go:334] "Generic (PLEG): container finished" podID="8defe125-1529-4091-adff-e9d17a2b298f" containerID="5fa7eec5b7c19299d0ce6c87bba94066e186df6e6ba2162f840498cde3a19934" exitCode=0 Dec 05 12:38:53.747502 master-0 kubenswrapper[8731]: I1205 12:38:53.747480 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4p8p6" event={"ID":"8defe125-1529-4091-adff-e9d17a2b298f","Type":"ContainerDied","Data":"5fa7eec5b7c19299d0ce6c87bba94066e186df6e6ba2162f840498cde3a19934"} Dec 05 12:38:53.760661 master-0 kubenswrapper[8731]: I1205 12:38:53.755193 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" event={"ID":"1478a21e-b6ac-46fb-ad01-805ac71f0a79","Type":"ContainerStarted","Data":"784eea1d9740e9545a6ad492de89955c83abee11f7626ae20b597773d711dc88"} Dec 05 12:38:53.760661 master-0 kubenswrapper[8731]: I1205 12:38:53.755265 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" event={"ID":"1478a21e-b6ac-46fb-ad01-805ac71f0a79","Type":"ContainerStarted","Data":"d3609111ab16a5bdb6bc2e1385cc4f52a698e20f663f1587bad8d27d757fd6be"} Dec 05 12:38:53.760661 master-0 kubenswrapper[8731]: I1205 12:38:53.757916 8731 generic.go:334] "Generic (PLEG): container finished" podID="b74e0607-6ed0-4119-8870-895b7d336830" containerID="ea572c6fcc8d460ca830971971bae224cadfb5879734d2e8d7b9add3c483a937" exitCode=0 Dec 05 12:38:53.760661 master-0 kubenswrapper[8731]: I1205 12:38:53.758973 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2pp25" event={"ID":"b74e0607-6ed0-4119-8870-895b7d336830","Type":"ContainerDied","Data":"ea572c6fcc8d460ca830971971bae224cadfb5879734d2e8d7b9add3c483a937"} Dec 05 12:38:53.765192 master-0 kubenswrapper[8731]: I1205 12:38:53.765134 8731 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" event={"ID":"159e5ddd-ce04-491a-996f-7c7b4bcec546","Type":"ContainerDied","Data":"bccef0932d1cbf8543a5017aa6fc3ec91308392451786d0877281c1041d23958"} Dec 05 12:38:53.765280 master-0 kubenswrapper[8731]: I1205 12:38:53.765212 8731 scope.go:117] "RemoveContainer" containerID="a51d15e45e728f55601d210223c1170225b7261c610e39e13a78b2743bf8d55f" Dec 05 12:38:53.765280 master-0 kubenswrapper[8731]: I1205 12:38:53.765219 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t" Dec 05 12:38:53.798967 master-0 kubenswrapper[8731]: I1205 12:38:53.798915 8731 generic.go:334] "Generic (PLEG): container finished" podID="ebfbe878-1796-4a20-b3f0-76165038252e" containerID="6ac9a49c2a57485ce32b61b5b230ca835325f9ead86b65416a1ed194a651372b" exitCode=0 Dec 05 12:38:53.799304 master-0 kubenswrapper[8731]: I1205 12:38:53.798983 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmnvq" event={"ID":"ebfbe878-1796-4a20-b3f0-76165038252e","Type":"ContainerDied","Data":"6ac9a49c2a57485ce32b61b5b230ca835325f9ead86b65416a1ed194a651372b"} Dec 05 12:38:53.804312 master-0 kubenswrapper[8731]: I1205 12:38:53.804245 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wfk7f" event={"ID":"9c31f89c-b01b-4853-a901-bccc25441a46","Type":"ContainerStarted","Data":"1f82b253446479fa5b79026be8aaeda5c0a2e403ecedf5e8aa0aa49e59d88903"} Dec 05 12:38:53.805700 master-0 kubenswrapper[8731]: I1205 12:38:53.805655 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" event={"ID":"20a72c8b-0f12-446b-8a42-53d98864c8f8","Type":"ContainerStarted","Data":"c13f876ed14f7005d250ab3203aedc5ac3d9bddbbff7570300b321a40f59bd5c"} Dec 05 12:38:53.809501 master-0 kubenswrapper[8731]: I1205 12:38:53.809259 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" event={"ID":"d72b2b71-27b2-4aff-bf69-7054a9556318","Type":"ContainerStarted","Data":"072364fce4151260cecc71f6271a4001a02ac2658d21c4a5606f1cd07f40e995"} Dec 05 12:38:53.816644 master-0 kubenswrapper[8731]: I1205 12:38:53.816600 8731 scope.go:117] "RemoveContainer" containerID="5fe89aae56c62b5b32867c2f7a6508b308c6d1e0237eb5cf4d712be99d6e42d0" Dec 05 12:38:53.831889 master-0 kubenswrapper[8731]: I1205 12:38:53.831745 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f85974995-4vsjv_1871a9d6-6369-4d08-816f-9c6310b61ddf/kube-scheduler-operator-container/1.log" Dec 05 12:38:53.839336 master-0 kubenswrapper[8731]: I1205 12:38:53.839298 8731 scope.go:117] "RemoveContainer" containerID="39bbeea33e0f358fae8aa3014875eaeb8abee7f4103b0265a8bb92799da69dcd" Dec 05 12:38:53.853728 master-0 kubenswrapper[8731]: I1205 12:38:53.853626 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" podStartSLOduration=15.853595877 podStartE2EDuration="15.853595877s" podCreationTimestamp="2025-12-05 12:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 
12:38:53.845852565 +0000 UTC m=+432.149836742" watchObservedRunningTime="2025-12-05 12:38:53.853595877 +0000 UTC m=+432.157580044" Dec 05 12:38:53.934062 master-0 kubenswrapper[8731]: I1205 12:38:53.934002 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t"] Dec 05 12:38:53.944336 master-0 kubenswrapper[8731]: I1205 12:38:53.944281 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-74f484689c-lbt9t"] Dec 05 12:38:53.977721 master-0 kubenswrapper[8731]: I1205 12:38:53.973040 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm"] Dec 05 12:38:53.986228 master-0 kubenswrapper[8731]: I1205 12:38:53.986134 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.002900 master-0 kubenswrapper[8731]: I1205 12:38:54.002685 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Dec 05 12:38:54.002999 master-0 kubenswrapper[8731]: I1205 12:38:54.002902 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Dec 05 12:38:54.003491 master-0 kubenswrapper[8731]: I1205 12:38:54.002929 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 12:38:54.003808 master-0 kubenswrapper[8731]: I1205 12:38:54.002898 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Dec 05 12:38:54.004492 master-0 kubenswrapper[8731]: I1205 12:38:54.004113 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 05 12:38:54.004492 master-0 kubenswrapper[8731]: I1205 12:38:54.004285 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-m6vhr" Dec 05 12:38:54.041316 master-0 kubenswrapper[8731]: I1205 12:38:54.032366 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f85974995-4vsjv_1871a9d6-6369-4d08-816f-9c6310b61ddf/kube-scheduler-operator-container/2.log" Dec 05 12:38:54.068487 master-0 kubenswrapper[8731]: I1205 12:38:54.068410 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr"] Dec 05 12:38:54.139581 master-0 kubenswrapper[8731]: I1205 12:38:54.139525 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/dbe144b5-3b78-4946-bbf9-b825b0e47b07-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.139773 master-0 
kubenswrapper[8731]: I1205 12:38:54.139613 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbg7w\" (UniqueName: \"kubernetes.io/projected/dbe144b5-3b78-4946-bbf9-b825b0e47b07-kube-api-access-mbg7w\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.139773 master-0 kubenswrapper[8731]: I1205 12:38:54.139666 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/dbe144b5-3b78-4946-bbf9-b825b0e47b07-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.139773 master-0 kubenswrapper[8731]: I1205 12:38:54.139687 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.139773 master-0 kubenswrapper[8731]: I1205 12:38:54.139723 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-images\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.148436 master-0 kubenswrapper[8731]: I1205 12:38:54.148393 8731 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 12:38:54.182211 master-0 kubenswrapper[8731]: I1205 12:38:54.177425 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-85d8db45d4-kkllg"] Dec 05 12:38:54.234829 master-0 kubenswrapper[8731]: I1205 12:38:54.234419 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-7bf7f6b755-b2pxs_4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/openshift-apiserver-operator/1.log" Dec 05 12:38:54.241533 master-0 kubenswrapper[8731]: I1205 12:38:54.241483 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbg7w\" (UniqueName: \"kubernetes.io/projected/dbe144b5-3b78-4946-bbf9-b825b0e47b07-kube-api-access-mbg7w\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.241612 master-0 kubenswrapper[8731]: I1205 12:38:54.241564 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/dbe144b5-3b78-4946-bbf9-b825b0e47b07-host-etc-kube\") pod 
\"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.241612 master-0 kubenswrapper[8731]: I1205 12:38:54.241600 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.241680 master-0 kubenswrapper[8731]: I1205 12:38:54.241646 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-images\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.243936 master-0 kubenswrapper[8731]: I1205 12:38:54.241713 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/dbe144b5-3b78-4946-bbf9-b825b0e47b07-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.243936 master-0 kubenswrapper[8731]: I1205 12:38:54.241849 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/dbe144b5-3b78-4946-bbf9-b825b0e47b07-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.243936 master-0 kubenswrapper[8731]: I1205 12:38:54.243600 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.244523 master-0 kubenswrapper[8731]: I1205 12:38:54.244467 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-images\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.245761 master-0 kubenswrapper[8731]: I1205 12:38:54.245719 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/dbe144b5-3b78-4946-bbf9-b825b0e47b07-cloud-controller-manager-operator-tls\") pod 
\"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.266241 master-0 kubenswrapper[8731]: I1205 12:38:54.265831 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbg7w\" (UniqueName: \"kubernetes.io/projected/dbe144b5-3b78-4946-bbf9-b825b0e47b07-kube-api-access-mbg7w\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.314719 master-0 kubenswrapper[8731]: I1205 12:38:54.314623 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:38:54.348690 master-0 kubenswrapper[8731]: W1205 12:38:54.347705 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbe144b5_3b78_4946_bbf9_b825b0e47b07.slice/crio-95fb5697edafbf4a316d98f995b9941dad32b61de9fdb2705dcb30f672d4ab5b WatchSource:0}: Error finding container 95fb5697edafbf4a316d98f995b9941dad32b61de9fdb2705dcb30f672d4ab5b: Status 404 returned error can't find the container with id 95fb5697edafbf4a316d98f995b9941dad32b61de9fdb2705dcb30f672d4ab5b Dec 05 12:38:54.438578 master-0 kubenswrapper[8731]: I1205 12:38:54.438445 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-7bf7f6b755-b2pxs_4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/openshift-apiserver-operator/2.log" Dec 05 12:38:54.631375 master-0 kubenswrapper[8731]: I1205 12:38:54.629331 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-845d4454f8-kcq9s_f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb/fix-audit-permissions/0.log" Dec 05 12:38:54.844455 master-0 kubenswrapper[8731]: I1205 12:38:54.844391 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-845d4454f8-kcq9s_f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb/openshift-apiserver/0.log" Dec 05 12:38:54.854787 master-0 kubenswrapper[8731]: I1205 12:38:54.854313 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmnvq" event={"ID":"ebfbe878-1796-4a20-b3f0-76165038252e","Type":"ContainerStarted","Data":"a8cae8900ae7cce8ceb6b634d4c10f86d39b83230027bdc07c4c7d3d67f473e8"} Dec 05 12:38:54.870376 master-0 kubenswrapper[8731]: I1205 12:38:54.869959 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-85d8db45d4-kkllg" event={"ID":"e2e2d968-9946-4711-aaf0-3e3a03bff415","Type":"ContainerStarted","Data":"de8c9b1dc7ded42717fa7579b2074dc4f99da101c43fbc90332e93b93966800c"} Dec 05 12:38:54.870376 master-0 kubenswrapper[8731]: I1205 12:38:54.870032 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-85d8db45d4-kkllg" event={"ID":"e2e2d968-9946-4711-aaf0-3e3a03bff415","Type":"ContainerStarted","Data":"130205999d123cc10c914ecc3cb22cde267becfbe33db09ccb0559c952bdc40f"} Dec 05 12:38:54.874136 master-0 kubenswrapper[8731]: I1205 12:38:54.872652 8731 generic.go:334] "Generic (PLEG): container finished" podID="9c31f89c-b01b-4853-a901-bccc25441a46" 
containerID="1f82b253446479fa5b79026be8aaeda5c0a2e403ecedf5e8aa0aa49e59d88903" exitCode=0 Dec 05 12:38:54.874136 master-0 kubenswrapper[8731]: I1205 12:38:54.872694 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wfk7f" event={"ID":"9c31f89c-b01b-4853-a901-bccc25441a46","Type":"ContainerDied","Data":"1f82b253446479fa5b79026be8aaeda5c0a2e403ecedf5e8aa0aa49e59d88903"} Dec 05 12:38:54.875571 master-0 kubenswrapper[8731]: I1205 12:38:54.875501 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4p8p6" event={"ID":"8defe125-1529-4091-adff-e9d17a2b298f","Type":"ContainerStarted","Data":"82b30fbccb7238f44d13f70da028d33f5b6e416081362f085ebb4ebdcea4d599"} Dec 05 12:38:54.881517 master-0 kubenswrapper[8731]: I1205 12:38:54.880997 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2pp25" event={"ID":"b74e0607-6ed0-4119-8870-895b7d336830","Type":"ContainerStarted","Data":"de5c01ef20eb6b4a7a0d0edd765eb5a0d5c99c96508f7cefbb6d4334d267cd81"} Dec 05 12:38:54.888417 master-0 kubenswrapper[8731]: I1205 12:38:54.888268 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dmnvq" podStartSLOduration=22.215483623 podStartE2EDuration="40.888250474s" podCreationTimestamp="2025-12-05 12:38:14 +0000 UTC" firstStartedPulling="2025-12-05 12:38:35.568198132 +0000 UTC m=+413.872182309" lastFinishedPulling="2025-12-05 12:38:54.240964993 +0000 UTC m=+432.544949160" observedRunningTime="2025-12-05 12:38:54.884652156 +0000 UTC m=+433.188636313" watchObservedRunningTime="2025-12-05 12:38:54.888250474 +0000 UTC m=+433.192234641" Dec 05 12:38:54.889486 master-0 kubenswrapper[8731]: I1205 12:38:54.889434 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr" event={"ID":"909ed395-8ad3-4350-95e3-b4b19c682f92","Type":"ContainerStarted","Data":"c5997a9e57f36847e6cb187afed936a398d9d89f0a3c5fbdaa0cdcf0b16bbffd"} Dec 05 12:38:54.895799 master-0 kubenswrapper[8731]: I1205 12:38:54.895767 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" event={"ID":"dbe144b5-3b78-4946-bbf9-b825b0e47b07","Type":"ContainerStarted","Data":"b36175e01241a922ef57ef9968701e5af5fa8f55a7287b6d3fe1828d9e78254f"} Dec 05 12:38:54.895973 master-0 kubenswrapper[8731]: I1205 12:38:54.895809 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" event={"ID":"dbe144b5-3b78-4946-bbf9-b825b0e47b07","Type":"ContainerStarted","Data":"95fb5697edafbf4a316d98f995b9941dad32b61de9fdb2705dcb30f672d4ab5b"} Dec 05 12:38:54.912381 master-0 kubenswrapper[8731]: I1205 12:38:54.912296 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4p8p6" podStartSLOduration=23.187266443 podStartE2EDuration="41.91227275s" podCreationTimestamp="2025-12-05 12:38:13 +0000 UTC" firstStartedPulling="2025-12-05 12:38:35.509080838 +0000 UTC m=+413.813065005" lastFinishedPulling="2025-12-05 12:38:54.234087145 +0000 UTC m=+432.538071312" observedRunningTime="2025-12-05 12:38:54.908467926 +0000 UTC m=+433.212452093" watchObservedRunningTime="2025-12-05 12:38:54.91227275 +0000 UTC m=+433.216256917" Dec 05 12:38:54.931674 
master-0 kubenswrapper[8731]: I1205 12:38:54.931560 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-85d8db45d4-kkllg" podStartSLOduration=508.931533726 podStartE2EDuration="8m28.931533726s" podCreationTimestamp="2025-12-05 12:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:38:54.925786059 +0000 UTC m=+433.229770226" watchObservedRunningTime="2025-12-05 12:38:54.931533726 +0000 UTC m=+433.235517893" Dec 05 12:38:54.951077 master-0 kubenswrapper[8731]: I1205 12:38:54.950995 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2pp25" podStartSLOduration=23.228835629 podStartE2EDuration="41.950968607s" podCreationTimestamp="2025-12-05 12:38:13 +0000 UTC" firstStartedPulling="2025-12-05 12:38:35.514338012 +0000 UTC m=+413.818322189" lastFinishedPulling="2025-12-05 12:38:54.23647099 +0000 UTC m=+432.540455167" observedRunningTime="2025-12-05 12:38:54.944347796 +0000 UTC m=+433.248331963" watchObservedRunningTime="2025-12-05 12:38:54.950968607 +0000 UTC m=+433.254952764" Dec 05 12:38:55.032985 master-0 kubenswrapper[8731]: I1205 12:38:55.029497 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-845d4454f8-kcq9s_f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb/openshift-apiserver-check-endpoints/0.log" Dec 05 12:38:55.229805 master-0 kubenswrapper[8731]: I1205 12:38:55.229737 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5bf4d88c6f-dxd24_f119ffe4-16bd-49eb-916d-b18ba0d79b54/etcd-operator/2.log" Dec 05 12:38:55.347702 master-0 kubenswrapper[8731]: I1205 12:38:55.347648 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:38:55.347702 master-0 kubenswrapper[8731]: I1205 12:38:55.347710 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:38:55.430070 master-0 kubenswrapper[8731]: I1205 12:38:55.429895 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5bf4d88c6f-dxd24_f119ffe4-16bd-49eb-916d-b18ba0d79b54/etcd-operator/3.log" Dec 05 12:38:55.634035 master-0 kubenswrapper[8731]: I1205 12:38:55.633961 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-546vz_d53a4886-db25-43a1-825a-66a9a9a58590/openshift-controller-manager-operator/2.log" Dec 05 12:38:55.829869 master-0 kubenswrapper[8731]: I1205 12:38:55.829810 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-546vz_d53a4886-db25-43a1-825a-66a9a9a58590/openshift-controller-manager-operator/3.log" Dec 05 12:38:55.906342 master-0 kubenswrapper[8731]: I1205 12:38:55.906247 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" event={"ID":"dbe144b5-3b78-4946-bbf9-b825b0e47b07","Type":"ContainerStarted","Data":"dd11d2ba2786d3d9f0ecdca93a7646fa05672ce1b1d99750eaff80844a557871"} Dec 05 12:38:55.906342 master-0 kubenswrapper[8731]: I1205 12:38:55.906334 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" event={"ID":"dbe144b5-3b78-4946-bbf9-b825b0e47b07","Type":"ContainerStarted","Data":"88bb2ee05e17ca0ccc95842f8e427991824283668dc77c62b2a389be9423149d"} Dec 05 12:38:55.928127 master-0 kubenswrapper[8731]: I1205 12:38:55.928018 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" podStartSLOduration=2.9279918499999997 podStartE2EDuration="2.92799185s" podCreationTimestamp="2025-12-05 12:38:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:38:55.927793274 +0000 UTC m=+434.231777441" watchObservedRunningTime="2025-12-05 12:38:55.92799185 +0000 UTC m=+434.231976017" Dec 05 12:38:55.947237 master-0 kubenswrapper[8731]: I1205 12:38:55.946294 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="159e5ddd-ce04-491a-996f-7c7b4bcec546" path="/var/lib/kubelet/pods/159e5ddd-ce04-491a-996f-7c7b4bcec546/volumes" Dec 05 12:38:56.037474 master-0 kubenswrapper[8731]: I1205 12:38:56.037410 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-675db9579f-4dcg8_7e562fda-e695-4218-a9cf-4179b8d456db/controller-manager/0.log" Dec 05 12:38:56.229564 master-0 kubenswrapper[8731]: I1205 12:38:56.229443 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-b48f6bd98-4npsq_3c753373-e1f9-457c-a134-721fce3b1575/route-controller-manager/0.log" Dec 05 12:38:56.397105 master-0 kubenswrapper[8731]: I1205 12:38:56.391995 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-dmnvq" podUID="ebfbe878-1796-4a20-b3f0-76165038252e" containerName="registry-server" probeResult="failure" output=< Dec 05 12:38:56.397105 master-0 kubenswrapper[8731]: timeout: failed to connect service ":50051" within 1s Dec 05 12:38:56.397105 master-0 kubenswrapper[8731]: > Dec 05 12:38:56.462209 master-0 kubenswrapper[8731]: I1205 12:38:56.459895 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-fbc6455c4-jmn7x_db2e54b6-4879-40f4-9359-a8b0c31e76c2/catalog-operator/0.log" Dec 05 12:38:56.483302 master-0 kubenswrapper[8731]: I1205 12:38:56.481766 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:38:56.483302 master-0 kubenswrapper[8731]: I1205 12:38:56.481852 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:38:56.498736 master-0 kubenswrapper[8731]: I1205 12:38:56.498674 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:38:56.570347 master-0 kubenswrapper[8731]: I1205 12:38:56.570279 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-4s89l"] Dec 05 12:38:56.571173 master-0 kubenswrapper[8731]: I1205 12:38:56.571139 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:38:56.572847 master-0 kubenswrapper[8731]: I1205 12:38:56.572789 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-bcwx4" Dec 05 12:38:56.574364 master-0 kubenswrapper[8731]: I1205 12:38:56.574322 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 12:38:56.574568 master-0 kubenswrapper[8731]: I1205 12:38:56.574344 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 12:38:56.589121 master-0 kubenswrapper[8731]: I1205 12:38:56.587534 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-node-bootstrap-token\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:38:56.589213 master-0 kubenswrapper[8731]: I1205 12:38:56.589148 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-certs\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:38:56.589642 master-0 kubenswrapper[8731]: I1205 12:38:56.589610 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjkjz\" (UniqueName: \"kubernetes.io/projected/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-kube-api-access-tjkjz\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:38:56.632925 master-0 kubenswrapper[8731]: I1205 12:38:56.632827 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-7cd7dbb44c-xdbtz_f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb/olm-operator/0.log" Dec 05 12:38:56.692392 master-0 kubenswrapper[8731]: I1205 12:38:56.692269 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-certs\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:38:56.692392 master-0 kubenswrapper[8731]: I1205 12:38:56.692405 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-node-bootstrap-token\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:38:56.692765 master-0 kubenswrapper[8731]: I1205 12:38:56.692447 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjkjz\" (UniqueName: \"kubernetes.io/projected/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-kube-api-access-tjkjz\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " 
pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:38:56.698633 master-0 kubenswrapper[8731]: I1205 12:38:56.698567 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-certs\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:38:56.700463 master-0 kubenswrapper[8731]: I1205 12:38:56.700387 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-node-bootstrap-token\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:38:56.709294 master-0 kubenswrapper[8731]: I1205 12:38:56.709229 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjkjz\" (UniqueName: \"kubernetes.io/projected/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-kube-api-access-tjkjz\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:38:56.828682 master-0 kubenswrapper[8731]: I1205 12:38:56.828549 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-67477646d4-9vfxw_0dda6d9b-cb3a-413a-85af-ef08f15ea42e/kube-rbac-proxy/0.log" Dec 05 12:38:56.892638 master-0 kubenswrapper[8731]: I1205 12:38:56.892578 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:38:56.923245 master-0 kubenswrapper[8731]: I1205 12:38:56.923148 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:38:57.033361 master-0 kubenswrapper[8731]: I1205 12:38:57.033324 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-67477646d4-9vfxw_0dda6d9b-cb3a-413a-85af-ef08f15ea42e/package-server-manager/0.log" Dec 05 12:38:57.231697 master-0 kubenswrapper[8731]: I1205 12:38:57.231639 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_packageserver-58c5755b49-6dx4c_b13885ef-d2b5-4591-825d-446cf8729bc1/packageserver/0.log" Dec 05 12:38:57.926521 master-0 kubenswrapper[8731]: I1205 12:38:57.926474 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wfk7f" event={"ID":"9c31f89c-b01b-4853-a901-bccc25441a46","Type":"ContainerStarted","Data":"3d5a43caa0556605bbe96d059d3c904315c064da61dfc71414acf314ee5b2814"} Dec 05 12:38:57.927965 master-0 kubenswrapper[8731]: I1205 12:38:57.927926 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4s89l" event={"ID":"dc5db54b-094f-4c36-a0ad-042e9fc2b61d","Type":"ContainerStarted","Data":"2f534d2b0f28fdc73bcde7620f0608943fcec70ee43db7154e751cbea94562d5"} Dec 05 12:38:57.928118 master-0 kubenswrapper[8731]: I1205 12:38:57.928105 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4s89l" 
event={"ID":"dc5db54b-094f-4c36-a0ad-042e9fc2b61d","Type":"ContainerStarted","Data":"164d69c4a697b3689889d3ab2e5a66ca6c9ed1089292b441ab9282cdde612925"} Dec 05 12:38:57.929640 master-0 kubenswrapper[8731]: I1205 12:38:57.929623 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" event={"ID":"20a72c8b-0f12-446b-8a42-53d98864c8f8","Type":"ContainerStarted","Data":"8fdea4402ae8cab53b0ad7f0ecba9b1899f62586699c403d4a3f309c69f3a64e"} Dec 05 12:38:57.931345 master-0 kubenswrapper[8731]: I1205 12:38:57.931274 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr" event={"ID":"909ed395-8ad3-4350-95e3-b4b19c682f92","Type":"ContainerStarted","Data":"4f714e5931c4c8cc3b7ed7099b22570199ed80bdc76778cd2533b86f1ae3c6e0"} Dec 05 12:38:58.582715 master-0 kubenswrapper[8731]: I1205 12:38:58.582632 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:38:58.587039 master-0 kubenswrapper[8731]: I1205 12:38:58.586953 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:38:58.587039 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:38:58.587039 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:38:58.587039 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:38:58.587461 master-0 kubenswrapper[8731]: I1205 12:38:58.587088 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:38:58.607965 master-0 kubenswrapper[8731]: I1205 12:38:58.607828 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wfk7f" podStartSLOduration=21.322252289 podStartE2EDuration="42.60779556s" podCreationTimestamp="2025-12-05 12:38:16 +0000 UTC" firstStartedPulling="2025-12-05 12:38:35.559356091 +0000 UTC m=+413.863340258" lastFinishedPulling="2025-12-05 12:38:56.844899362 +0000 UTC m=+435.148883529" observedRunningTime="2025-12-05 12:38:58.606238958 +0000 UTC m=+436.910223125" watchObservedRunningTime="2025-12-05 12:38:58.60779556 +0000 UTC m=+436.911779727" Dec 05 12:38:58.946864 master-0 kubenswrapper[8731]: I1205 12:38:58.946774 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr" Dec 05 12:38:58.953403 master-0 kubenswrapper[8731]: I1205 12:38:58.953261 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr" Dec 05 12:38:59.585558 master-0 kubenswrapper[8731]: I1205 12:38:59.585409 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:38:59.585558 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:38:59.585558 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 
12:38:59.585558 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:38:59.586104 master-0 kubenswrapper[8731]: I1205 12:38:59.585567 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:00.585797 master-0 kubenswrapper[8731]: I1205 12:39:00.585712 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:00.585797 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:00.585797 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:00.585797 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:00.586936 master-0 kubenswrapper[8731]: I1205 12:39:00.585822 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:01.595467 master-0 kubenswrapper[8731]: I1205 12:39:01.595364 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:01.595467 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:01.595467 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:01.595467 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:01.596140 master-0 kubenswrapper[8731]: I1205 12:39:01.595503 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:02.585620 master-0 kubenswrapper[8731]: I1205 12:39:02.585445 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:02.585620 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:02.585620 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:02.585620 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:02.585620 master-0 kubenswrapper[8731]: I1205 12:39:02.585534 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:03.582024 master-0 kubenswrapper[8731]: I1205 12:39:03.581957 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:39:03.585367 master-0 kubenswrapper[8731]: I1205 12:39:03.585298 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:03.585367 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:03.585367 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:03.585367 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:03.585606 master-0 kubenswrapper[8731]: I1205 12:39:03.585388 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:03.746653 master-0 kubenswrapper[8731]: I1205 12:39:03.746577 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:39:03.746653 master-0 kubenswrapper[8731]: I1205 12:39:03.746658 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:39:03.784737 master-0 kubenswrapper[8731]: I1205 12:39:03.784676 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:39:03.917290 master-0 kubenswrapper[8731]: I1205 12:39:03.917078 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:39:03.917290 master-0 kubenswrapper[8731]: I1205 12:39:03.917231 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:39:03.983027 master-0 kubenswrapper[8731]: I1205 12:39:03.982892 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:39:04.037272 master-0 kubenswrapper[8731]: I1205 12:39:04.037168 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:39:04.039053 master-0 kubenswrapper[8731]: I1205 12:39:04.038984 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:39:04.584511 master-0 kubenswrapper[8731]: I1205 12:39:04.584431 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:04.584511 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:04.584511 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:04.584511 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:04.585291 master-0 kubenswrapper[8731]: I1205 12:39:04.584528 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:05.418745 master-0 kubenswrapper[8731]: I1205 12:39:05.418664 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:39:05.493459 master-0 kubenswrapper[8731]: I1205 12:39:05.493391 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:39:05.584529 
master-0 kubenswrapper[8731]: I1205 12:39:05.584469 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:05.584529 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:05.584529 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:05.584529 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:05.585574 master-0 kubenswrapper[8731]: I1205 12:39:05.585358 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:06.585276 master-0 kubenswrapper[8731]: I1205 12:39:06.585197 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:06.585276 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:06.585276 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:06.585276 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:06.586018 master-0 kubenswrapper[8731]: I1205 12:39:06.585290 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:07.170885 master-0 kubenswrapper[8731]: I1205 12:39:07.170819 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:39:07.171234 master-0 kubenswrapper[8731]: I1205 12:39:07.171014 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:39:07.213387 master-0 kubenswrapper[8731]: I1205 12:39:07.213334 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:39:07.585122 master-0 kubenswrapper[8731]: I1205 12:39:07.585019 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:07.585122 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:07.585122 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:07.585122 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:07.586294 master-0 kubenswrapper[8731]: I1205 12:39:07.585134 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:08.075533 master-0 kubenswrapper[8731]: I1205 12:39:08.075459 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:39:08.584962 master-0 kubenswrapper[8731]: I1205 12:39:08.584810 8731 patch_prober.go:28] 
interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:08.584962 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:08.584962 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:08.584962 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:08.585413 master-0 kubenswrapper[8731]: I1205 12:39:08.584955 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:09.494487 master-0 kubenswrapper[8731]: I1205 12:39:09.489102 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podStartSLOduration=55.24707864 podStartE2EDuration="58.489073928s" podCreationTimestamp="2025-12-05 12:38:11 +0000 UTC" firstStartedPulling="2025-12-05 12:38:53.602840602 +0000 UTC m=+431.906824769" lastFinishedPulling="2025-12-05 12:38:56.84483589 +0000 UTC m=+435.148820057" observedRunningTime="2025-12-05 12:39:09.486009865 +0000 UTC m=+447.789994072" watchObservedRunningTime="2025-12-05 12:39:09.489073928 +0000 UTC m=+447.793058105" Dec 05 12:39:09.495429 master-0 kubenswrapper[8731]: I1205 12:39:09.495038 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr" podStartSLOduration=56.72988535 podStartE2EDuration="59.49501756s" podCreationTimestamp="2025-12-05 12:38:10 +0000 UTC" firstStartedPulling="2025-12-05 12:38:54.078820136 +0000 UTC m=+432.382804303" lastFinishedPulling="2025-12-05 12:38:56.843952346 +0000 UTC m=+435.147936513" observedRunningTime="2025-12-05 12:39:01.596147085 +0000 UTC m=+439.900131282" watchObservedRunningTime="2025-12-05 12:39:09.49501756 +0000 UTC m=+447.799001737" Dec 05 12:39:09.547052 master-0 kubenswrapper[8731]: I1205 12:39:09.546887 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-4s89l" podStartSLOduration=13.546848575 podStartE2EDuration="13.546848575s" podCreationTimestamp="2025-12-05 12:38:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:39:09.517932276 +0000 UTC m=+447.821916483" watchObservedRunningTime="2025-12-05 12:39:09.546848575 +0000 UTC m=+447.850832782" Dec 05 12:39:09.584411 master-0 kubenswrapper[8731]: I1205 12:39:09.584323 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:09.584411 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:09.584411 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:09.584411 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:09.584411 master-0 kubenswrapper[8731]: I1205 12:39:09.584398 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Dec 05 12:39:10.586309 master-0 kubenswrapper[8731]: I1205 12:39:10.586131 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:10.586309 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:10.586309 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:10.586309 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:10.586309 master-0 kubenswrapper[8731]: I1205 12:39:10.586294 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:11.585306 master-0 kubenswrapper[8731]: I1205 12:39:11.585168 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:11.585306 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:11.585306 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:11.585306 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:11.585865 master-0 kubenswrapper[8731]: I1205 12:39:11.585323 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:12.585274 master-0 kubenswrapper[8731]: I1205 12:39:12.585020 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:12.585274 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:12.585274 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:12.585274 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:12.585274 master-0 kubenswrapper[8731]: I1205 12:39:12.585113 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:13.586349 master-0 kubenswrapper[8731]: I1205 12:39:13.586260 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:13.586349 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:13.586349 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:13.586349 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:13.587481 master-0 kubenswrapper[8731]: I1205 12:39:13.586370 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:14.585412 master-0 kubenswrapper[8731]: I1205 12:39:14.585317 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:14.585412 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:14.585412 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:14.585412 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:14.586159 master-0 kubenswrapper[8731]: I1205 12:39:14.585417 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:15.586631 master-0 kubenswrapper[8731]: I1205 12:39:15.584868 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:15.586631 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:15.586631 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:15.586631 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:15.586631 master-0 kubenswrapper[8731]: I1205 12:39:15.584947 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:16.585921 master-0 kubenswrapper[8731]: I1205 12:39:16.585786 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:16.585921 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:16.585921 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:16.585921 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:16.586526 master-0 kubenswrapper[8731]: I1205 12:39:16.585892 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:17.586244 master-0 kubenswrapper[8731]: I1205 12:39:17.586052 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:17.586244 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:17.586244 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:17.586244 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:17.586244 master-0 kubenswrapper[8731]: I1205 12:39:17.586217 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:18.584950 master-0 kubenswrapper[8731]: I1205 12:39:18.584877 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:18.584950 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:18.584950 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:18.584950 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:18.585556 master-0 kubenswrapper[8731]: I1205 12:39:18.584961 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:19.585707 master-0 kubenswrapper[8731]: I1205 12:39:19.585603 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:19.585707 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:19.585707 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:19.585707 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: I1205 12:39:19.585705 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: I1205 12:39:20.585849 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: I1205 12:39:20.586476 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: I1205 12:39:21.587234 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: I1205 12:39:21.587328 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" 
podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: I1205 12:39:22.585805 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: I1205 12:39:22.585932 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: I1205 12:39:23.585993 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: I1205 12:39:23.586122 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: I1205 12:39:24.585391 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:24.956958 master-0 kubenswrapper[8731]: I1205 12:39:24.585483 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:25.156005 master-0 kubenswrapper[8731]: I1205 12:39:25.155935 8731 generic.go:334] "Generic (PLEG): container finished" podID="5e09e2af7200e6f9be469dbfd9bb1127" containerID="ea798cf6cf2e0e8f9ed09f878b5232d0740a5bbae085c7d7f2ee3609a0190f95" exitCode=1 Dec 05 12:39:25.156005 master-0 kubenswrapper[8731]: I1205 12:39:25.156003 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerDied","Data":"ea798cf6cf2e0e8f9ed09f878b5232d0740a5bbae085c7d7f2ee3609a0190f95"} Dec 05 12:39:25.156368 master-0 kubenswrapper[8731]: I1205 12:39:25.156053 8731 scope.go:117] "RemoveContainer" 
containerID="8c7e83119fdbf7fba596a8756e22362ec175fbd883171a7a50b5c673c4302ba8" Dec 05 12:39:25.156727 master-0 kubenswrapper[8731]: I1205 12:39:25.156703 8731 scope.go:117] "RemoveContainer" containerID="ea798cf6cf2e0e8f9ed09f878b5232d0740a5bbae085c7d7f2ee3609a0190f95" Dec 05 12:39:25.156927 master-0 kubenswrapper[8731]: E1205 12:39:25.156902 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-scheduler pod=bootstrap-kube-scheduler-master-0_kube-system(5e09e2af7200e6f9be469dbfd9bb1127)\"" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="5e09e2af7200e6f9be469dbfd9bb1127" Dec 05 12:39:25.585313 master-0 kubenswrapper[8731]: I1205 12:39:25.585249 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:25.585313 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:25.585313 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:25.585313 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:25.585744 master-0 kubenswrapper[8731]: I1205 12:39:25.585322 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:26.584989 master-0 kubenswrapper[8731]: I1205 12:39:26.584932 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:26.584989 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:26.584989 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:26.584989 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:26.585763 master-0 kubenswrapper[8731]: I1205 12:39:26.584998 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:27.586743 master-0 kubenswrapper[8731]: I1205 12:39:27.586626 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:27.586743 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:27.586743 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:27.586743 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:27.587803 master-0 kubenswrapper[8731]: I1205 12:39:27.586745 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:28.585307 master-0 kubenswrapper[8731]: I1205 12:39:28.585173 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:28.585307 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:28.585307 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:28.585307 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:28.585713 master-0 kubenswrapper[8731]: I1205 12:39:28.585301 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:29.585666 master-0 kubenswrapper[8731]: I1205 12:39:29.585580 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:29.585666 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:29.585666 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:29.585666 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:29.586463 master-0 kubenswrapper[8731]: I1205 12:39:29.585692 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:30.585175 master-0 kubenswrapper[8731]: I1205 12:39:30.585103 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:30.585175 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:30.585175 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:30.585175 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:30.585578 master-0 kubenswrapper[8731]: I1205 12:39:30.585202 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:31.197382 master-0 kubenswrapper[8731]: I1205 12:39:31.197334 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/1.log" Dec 05 12:39:31.198184 master-0 kubenswrapper[8731]: I1205 12:39:31.198157 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/0.log" Dec 05 12:39:31.198269 master-0 kubenswrapper[8731]: I1205 12:39:31.198223 8731 generic.go:334] "Generic (PLEG): container finished" podID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" containerID="35aadb9bbeac01f2246017a0a2cf81423a3d53a1924e285f6675485164555604" exitCode=1 Dec 05 12:39:31.198322 master-0 kubenswrapper[8731]: I1205 12:39:31.198262 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" 
event={"ID":"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7","Type":"ContainerDied","Data":"35aadb9bbeac01f2246017a0a2cf81423a3d53a1924e285f6675485164555604"} Dec 05 12:39:31.198322 master-0 kubenswrapper[8731]: I1205 12:39:31.198306 8731 scope.go:117] "RemoveContainer" containerID="25a1113bac1425c0d6b5254d5067b012732c090d8f467edda97019523a2d47be" Dec 05 12:39:31.199052 master-0 kubenswrapper[8731]: I1205 12:39:31.199006 8731 scope.go:117] "RemoveContainer" containerID="35aadb9bbeac01f2246017a0a2cf81423a3d53a1924e285f6675485164555604" Dec 05 12:39:31.199463 master-0 kubenswrapper[8731]: E1205 12:39:31.199427 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-7xrk6_openshift-ingress-operator(a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" podUID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" Dec 05 12:39:31.583972 master-0 kubenswrapper[8731]: I1205 12:39:31.583866 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:31.583972 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:31.583972 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:31.583972 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:31.583972 master-0 kubenswrapper[8731]: I1205 12:39:31.583949 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:32.205981 master-0 kubenswrapper[8731]: I1205 12:39:32.205917 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/1.log" Dec 05 12:39:32.585090 master-0 kubenswrapper[8731]: I1205 12:39:32.584951 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:32.585090 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:32.585090 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:32.585090 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:32.585090 master-0 kubenswrapper[8731]: I1205 12:39:32.585050 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:33.585553 master-0 kubenswrapper[8731]: I1205 12:39:33.585477 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:33.585553 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:33.585553 master-0 kubenswrapper[8731]: [+]process-running ok 
Dec 05 12:39:33.585553 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:33.586680 master-0 kubenswrapper[8731]: I1205 12:39:33.585569 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:34.584982 master-0 kubenswrapper[8731]: I1205 12:39:34.584892 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:34.584982 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:34.584982 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:34.584982 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:34.585441 master-0 kubenswrapper[8731]: I1205 12:39:34.584998 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:35.584724 master-0 kubenswrapper[8731]: I1205 12:39:35.584649 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:35.584724 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:35.584724 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:35.584724 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:35.585367 master-0 kubenswrapper[8731]: I1205 12:39:35.584747 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:36.585787 master-0 kubenswrapper[8731]: I1205 12:39:36.585692 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:36.585787 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:36.585787 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:36.585787 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:36.585787 master-0 kubenswrapper[8731]: I1205 12:39:36.585763 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:37.584819 master-0 kubenswrapper[8731]: I1205 12:39:37.584751 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:37.584819 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:37.584819 master-0 kubenswrapper[8731]: 
[+]process-running ok Dec 05 12:39:37.584819 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:37.585137 master-0 kubenswrapper[8731]: I1205 12:39:37.584847 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:37.935024 master-0 kubenswrapper[8731]: I1205 12:39:37.934924 8731 scope.go:117] "RemoveContainer" containerID="ea798cf6cf2e0e8f9ed09f878b5232d0740a5bbae085c7d7f2ee3609a0190f95" Dec 05 12:39:38.251719 master-0 kubenswrapper[8731]: I1205 12:39:38.251623 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerStarted","Data":"10cc72919e024bf622844c0acf2e547b438332d060988df1527f059047162f8c"} Dec 05 12:39:38.585635 master-0 kubenswrapper[8731]: I1205 12:39:38.585557 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:38.585635 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:38.585635 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:38.585635 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:38.586002 master-0 kubenswrapper[8731]: I1205 12:39:38.585648 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:39.584826 master-0 kubenswrapper[8731]: I1205 12:39:39.584734 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:39.584826 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:39.584826 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:39.584826 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:39.585613 master-0 kubenswrapper[8731]: I1205 12:39:39.584843 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:40.585701 master-0 kubenswrapper[8731]: I1205 12:39:40.585558 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:40.585701 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:40.585701 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:40.585701 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:40.587537 master-0 kubenswrapper[8731]: I1205 12:39:40.585705 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:41.584479 master-0 kubenswrapper[8731]: I1205 12:39:41.584385 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:41.584479 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:41.584479 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:41.584479 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:41.584914 master-0 kubenswrapper[8731]: I1205 12:39:41.584511 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:42.584955 master-0 kubenswrapper[8731]: I1205 12:39:42.584843 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:42.584955 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:42.584955 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:42.584955 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:42.585993 master-0 kubenswrapper[8731]: I1205 12:39:42.584971 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:42.935446 master-0 kubenswrapper[8731]: I1205 12:39:42.935398 8731 scope.go:117] "RemoveContainer" containerID="35aadb9bbeac01f2246017a0a2cf81423a3d53a1924e285f6675485164555604" Dec 05 12:39:43.293651 master-0 kubenswrapper[8731]: I1205 12:39:43.293564 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/1.log" Dec 05 12:39:43.294212 master-0 kubenswrapper[8731]: I1205 12:39:43.294106 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" event={"ID":"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7","Type":"ContainerStarted","Data":"6f8055fdf3cea411e4a76860001f402a5742a0c41d34f8fa2265a84c73970742"} Dec 05 12:39:43.584266 master-0 kubenswrapper[8731]: I1205 12:39:43.584084 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:43.584266 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:43.584266 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:43.584266 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:43.584266 master-0 kubenswrapper[8731]: I1205 12:39:43.584224 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 
12:39:44.584691 master-0 kubenswrapper[8731]: I1205 12:39:44.584607 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:44.584691 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:44.584691 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:44.584691 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:44.584691 master-0 kubenswrapper[8731]: I1205 12:39:44.584693 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:45.584948 master-0 kubenswrapper[8731]: I1205 12:39:45.584805 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:45.584948 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:45.584948 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:45.584948 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:45.585983 master-0 kubenswrapper[8731]: I1205 12:39:45.584970 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:46.584895 master-0 kubenswrapper[8731]: I1205 12:39:46.584844 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:46.584895 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:46.584895 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:46.584895 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:46.585536 master-0 kubenswrapper[8731]: I1205 12:39:46.584919 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:47.585920 master-0 kubenswrapper[8731]: I1205 12:39:47.585789 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:47.585920 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:47.585920 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:47.585920 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:47.585920 master-0 kubenswrapper[8731]: I1205 12:39:47.585891 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" 
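The long run of entries above is the kubelet's startup prober re-checking the router container roughly once per second and getting HTTP 500 back, with the healthz body showing backend-http and has-synced still failing while process-running is ok. As a rough, self-contained Go sketch of the pass/fail rule an HTTP probe applies (a status code in [200, 400) counts as success; anything else is reported as a failure together with the start of the response body), assuming a placeholder URL rather than the router's actual health endpoint:

// Sketch only: judges a single HTTP GET the way a kubelet-style HTTP probe
// would -- any status code in [200, 400) is a success, everything else is a
// failure, and only the start of the body is kept for the log line.
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

func probeOnce(url string) (ok bool, statusCode int, bodyStart string, err error) {
	client := &http.Client{Timeout: 1 * time.Second} // probes run with a short timeout
	resp, err := client.Get(url)
	if err != nil {
		return false, 0, "", err
	}
	defer resp.Body.Close()

	head, _ := io.ReadAll(io.LimitReader(resp.Body, 256)) // keep the "start-of-body"
	ok = resp.StatusCode >= 200 && resp.StatusCode < 400
	return ok, resp.StatusCode, string(head), nil
}

func main() {
	// Placeholder endpoint; substitute the health URL of the service being probed.
	ok, code, body, err := probeOnce("http://127.0.0.1:1936/healthz/ready")
	switch {
	case err != nil:
		fmt.Println("probe error:", err)
	case !ok:
		fmt.Printf("Probe failed: statuscode %d, start-of-body=%q\n", code, body)
	default:
		fmt.Println("probe succeeded")
	}
}

Once the target starts answering with a success code, the same prober flips the pod's startup status to "started", as it does earlier in this log for apiserver-5bdfbf6949-2bhqv and the marketplace catalog pods.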
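Separately, the pod_startup_latency_tracker entries above report both podStartSLOduration and podStartE2EDuration; in these entries the SLO figure works out to the end-to-end duration minus the image-pull window (lastFinishedPulling − firstStartedPulling). A minimal Go sketch that reproduces the router-default numbers, with the timestamps copied from that entry:

// Sketch only: recomputes the startup-latency figures from the timestamps
// printed in the router-default pod_startup_latency_tracker entry above.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05 -0700 MST" // matches how the log prints these timestamps
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	created := parse("2025-12-05 12:38:11 +0000 UTC")             // podCreationTimestamp
	firstPull := parse("2025-12-05 12:38:53.602840602 +0000 UTC") // firstStartedPulling
	lastPull := parse("2025-12-05 12:38:56.84483589 +0000 UTC")   // lastFinishedPulling
	observed := parse("2025-12-05 12:39:09.489073928 +0000 UTC")  // watchObservedRunningTime

	e2e := observed.Sub(created)    // podStartE2EDuration
	pull := lastPull.Sub(firstPull) // image-pull window
	slo := e2e - pull               // podStartSLOduration: end-to-end time minus the pull window
	fmt.Println(e2e, pull, slo)
}

This prints 58.489073928s, 3.241995288s and 55.24707864s, matching the podStartE2EDuration and podStartSLOduration fields in that entry; pods that never pulled an image (firstStartedPulling at the zero time 0001-01-01) report the two durations as equal, as with machine-config-server-4s89l above.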
Dec 05 12:39:48.585749 master-0 kubenswrapper[8731]: I1205 12:39:48.585690 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:48.585749 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:48.585749 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:48.585749 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:48.586928 master-0 kubenswrapper[8731]: I1205 12:39:48.586880 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:49.585503 master-0 kubenswrapper[8731]: I1205 12:39:49.585393 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:49.585503 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:49.585503 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:49.585503 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:49.586804 master-0 kubenswrapper[8731]: I1205 12:39:49.585538 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:50.586084 master-0 kubenswrapper[8731]: I1205 12:39:50.586008 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:50.586084 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:50.586084 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:50.586084 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:50.586782 master-0 kubenswrapper[8731]: I1205 12:39:50.586091 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:51.584341 master-0 kubenswrapper[8731]: I1205 12:39:51.584259 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:51.584341 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:51.584341 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:51.584341 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:51.584837 master-0 kubenswrapper[8731]: I1205 12:39:51.584367 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Dec 05 12:39:52.585167 master-0 kubenswrapper[8731]: I1205 12:39:52.585074 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:52.585167 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:52.585167 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:52.585167 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:52.585929 master-0 kubenswrapper[8731]: I1205 12:39:52.585173 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:52.930554 master-0 kubenswrapper[8731]: I1205 12:39:52.930433 8731 scope.go:117] "RemoveContainer" containerID="d12e1c8bf264de03492186948f2fcb8fa30acf3e5c6ac0dd00637ed1e75cfa31" Dec 05 12:39:53.585791 master-0 kubenswrapper[8731]: I1205 12:39:53.585687 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:53.585791 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:53.585791 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:53.585791 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:53.586812 master-0 kubenswrapper[8731]: I1205 12:39:53.585818 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:54.585989 master-0 kubenswrapper[8731]: I1205 12:39:54.585891 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:54.585989 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:54.585989 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:54.585989 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:54.587008 master-0 kubenswrapper[8731]: I1205 12:39:54.586010 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:55.584292 master-0 kubenswrapper[8731]: I1205 12:39:55.584229 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:55.584292 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:55.584292 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:55.584292 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:55.584978 master-0 kubenswrapper[8731]: I1205 12:39:55.584306 8731 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:56.584378 master-0 kubenswrapper[8731]: I1205 12:39:56.584315 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:56.584378 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:56.584378 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:56.584378 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:56.584378 master-0 kubenswrapper[8731]: I1205 12:39:56.584381 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:57.060648 master-0 kubenswrapper[8731]: I1205 12:39:57.060554 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk"] Dec 05 12:39:57.062293 master-0 kubenswrapper[8731]: I1205 12:39:57.062253 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:39:57.064923 master-0 kubenswrapper[8731]: I1205 12:39:57.064875 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Dec 05 12:39:57.065374 master-0 kubenswrapper[8731]: I1205 12:39:57.065227 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Dec 05 12:39:57.065461 master-0 kubenswrapper[8731]: I1205 12:39:57.065449 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Dec 05 12:39:57.067379 master-0 kubenswrapper[8731]: I1205 12:39:57.066013 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-ljblm" Dec 05 12:39:57.073865 master-0 kubenswrapper[8731]: I1205 12:39:57.073791 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk"] Dec 05 12:39:57.172889 master-0 kubenswrapper[8731]: I1205 12:39:57.172787 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c69rc\" (UniqueName: \"kubernetes.io/projected/99996137-2621-458b-980d-584b3640d4ad-kube-api-access-c69rc\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:39:57.172889 master-0 kubenswrapper[8731]: I1205 12:39:57.172884 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:39:57.173359 master-0 kubenswrapper[8731]: I1205 12:39:57.173058 8731 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-tls\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:39:57.173359 master-0 kubenswrapper[8731]: I1205 12:39:57.173161 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99996137-2621-458b-980d-584b3640d4ad-metrics-client-ca\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:39:57.275046 master-0 kubenswrapper[8731]: I1205 12:39:57.274961 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c69rc\" (UniqueName: \"kubernetes.io/projected/99996137-2621-458b-980d-584b3640d4ad-kube-api-access-c69rc\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:39:57.275330 master-0 kubenswrapper[8731]: I1205 12:39:57.275249 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:39:57.275417 master-0 kubenswrapper[8731]: I1205 12:39:57.275364 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-tls\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:39:57.275481 master-0 kubenswrapper[8731]: I1205 12:39:57.275465 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99996137-2621-458b-980d-584b3640d4ad-metrics-client-ca\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:39:57.276540 master-0 kubenswrapper[8731]: I1205 12:39:57.276495 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99996137-2621-458b-980d-584b3640d4ad-metrics-client-ca\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:39:57.281994 master-0 kubenswrapper[8731]: I1205 12:39:57.281942 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-tls\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:39:57.287547 master-0 kubenswrapper[8731]: I1205 
12:39:57.287472 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:39:57.296124 master-0 kubenswrapper[8731]: I1205 12:39:57.296086 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c69rc\" (UniqueName: \"kubernetes.io/projected/99996137-2621-458b-980d-584b3640d4ad-kube-api-access-c69rc\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:39:57.385552 master-0 kubenswrapper[8731]: I1205 12:39:57.385431 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:39:57.584560 master-0 kubenswrapper[8731]: I1205 12:39:57.584494 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:57.584560 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:57.584560 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:57.584560 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:57.585311 master-0 kubenswrapper[8731]: I1205 12:39:57.584591 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:57.826934 master-0 kubenswrapper[8731]: I1205 12:39:57.826883 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk"] Dec 05 12:39:57.838714 master-0 kubenswrapper[8731]: W1205 12:39:57.838632 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99996137_2621_458b_980d_584b3640d4ad.slice/crio-570d4cae37b4f398ab8be13ab3899c325813f0073ace4d7fbe1d38d0fbd654b9 WatchSource:0}: Error finding container 570d4cae37b4f398ab8be13ab3899c325813f0073ace4d7fbe1d38d0fbd654b9: Status 404 returned error can't find the container with id 570d4cae37b4f398ab8be13ab3899c325813f0073ace4d7fbe1d38d0fbd654b9 Dec 05 12:39:58.399975 master-0 kubenswrapper[8731]: I1205 12:39:58.399897 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" event={"ID":"99996137-2621-458b-980d-584b3640d4ad","Type":"ContainerStarted","Data":"570d4cae37b4f398ab8be13ab3899c325813f0073ace4d7fbe1d38d0fbd654b9"} Dec 05 12:39:58.584269 master-0 kubenswrapper[8731]: I1205 12:39:58.584209 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:58.584269 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:58.584269 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 
12:39:58.584269 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:58.584586 master-0 kubenswrapper[8731]: I1205 12:39:58.584287 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:39:59.409421 master-0 kubenswrapper[8731]: I1205 12:39:59.409291 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" event={"ID":"99996137-2621-458b-980d-584b3640d4ad","Type":"ContainerStarted","Data":"4d39f87ec0ab3e7d43386497849fc0b62dfc1564ab50782064167f0cb951ca1d"} Dec 05 12:39:59.409421 master-0 kubenswrapper[8731]: I1205 12:39:59.409356 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" event={"ID":"99996137-2621-458b-980d-584b3640d4ad","Type":"ContainerStarted","Data":"d8e8a1073abbdf051f404a2a4f1613aeacac287ee90a5af14a8006b5d070a015"} Dec 05 12:39:59.427644 master-0 kubenswrapper[8731]: I1205 12:39:59.427555 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" podStartSLOduration=33.126347782 podStartE2EDuration="34.427535726s" podCreationTimestamp="2025-12-05 12:39:25 +0000 UTC" firstStartedPulling="2025-12-05 12:39:57.840703194 +0000 UTC m=+496.144687371" lastFinishedPulling="2025-12-05 12:39:59.141891148 +0000 UTC m=+497.445875315" observedRunningTime="2025-12-05 12:39:59.424592576 +0000 UTC m=+497.728576743" watchObservedRunningTime="2025-12-05 12:39:59.427535726 +0000 UTC m=+497.731519893" Dec 05 12:39:59.585008 master-0 kubenswrapper[8731]: I1205 12:39:59.584928 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:39:59.585008 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:39:59.585008 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:39:59.585008 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:39:59.585008 master-0 kubenswrapper[8731]: I1205 12:39:59.584998 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:00.584021 master-0 kubenswrapper[8731]: I1205 12:40:00.583973 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:00.584021 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:00.584021 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:00.584021 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:00.584434 master-0 kubenswrapper[8731]: I1205 12:40:00.584364 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:01.585915 master-0 
kubenswrapper[8731]: I1205 12:40:01.585851 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:01.585915 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:01.585915 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:01.585915 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:01.586624 master-0 kubenswrapper[8731]: I1205 12:40:01.585925 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:01.659598 master-0 kubenswrapper[8731]: I1205 12:40:01.659495 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z"] Dec 05 12:40:01.661131 master-0 kubenswrapper[8731]: I1205 12:40:01.661091 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:40:01.663375 master-0 kubenswrapper[8731]: I1205 12:40:01.663334 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 05 12:40:01.663445 master-0 kubenswrapper[8731]: I1205 12:40:01.663388 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 05 12:40:01.663445 master-0 kubenswrapper[8731]: I1205 12:40:01.663433 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-bd2pn" Dec 05 12:40:01.678268 master-0 kubenswrapper[8731]: I1205 12:40:01.678217 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z"] Dec 05 12:40:01.693213 master-0 kubenswrapper[8731]: I1205 12:40:01.692036 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-z2nmc"] Dec 05 12:40:01.693467 master-0 kubenswrapper[8731]: I1205 12:40:01.693255 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.697204 master-0 kubenswrapper[8731]: I1205 12:40:01.696485 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-4cwgg" Dec 05 12:40:01.697204 master-0 kubenswrapper[8731]: I1205 12:40:01.696672 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 05 12:40:01.697204 master-0 kubenswrapper[8731]: I1205 12:40:01.696687 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 05 12:40:01.723342 master-0 kubenswrapper[8731]: I1205 12:40:01.722240 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-5857974f64-8p9n7"] Dec 05 12:40:01.727210 master-0 kubenswrapper[8731]: I1205 12:40:01.723741 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:40:01.727210 master-0 kubenswrapper[8731]: I1205 12:40:01.726279 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-4gpnc" Dec 05 12:40:01.727210 master-0 kubenswrapper[8731]: I1205 12:40:01.726473 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 05 12:40:01.727210 master-0 kubenswrapper[8731]: I1205 12:40:01.726607 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 05 12:40:01.727210 master-0 kubenswrapper[8731]: I1205 12:40:01.726953 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.735638 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bjs8\" (UniqueName: \"kubernetes.io/projected/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-api-access-4bjs8\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.735691 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqvfm\" (UniqueName: \"kubernetes.io/projected/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-kube-api-access-nqvfm\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.735726 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-wtmp\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.735748 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-root\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.735773 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.735796 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-textfile\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " 
pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.735823 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-sys\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.736109 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.736180 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-tls\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.736218 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.736242 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-volume-directive-shadow\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.736273 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60327040-f782-4cda-a32d-52a4f183073c-metrics-client-ca\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.736323 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp957\" (UniqueName: \"kubernetes.io/projected/60327040-f782-4cda-a32d-52a4f183073c-kube-api-access-zp957\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.736349 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-custom-resource-state-configmap\") pod 
\"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.736373 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.736413 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-metrics-client-ca\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.736439 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-tls\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:40:01.740267 master-0 kubenswrapper[8731]: I1205 12:40:01.736464 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-metrics-client-ca\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:40:01.750242 master-0 kubenswrapper[8731]: I1205 12:40:01.749647 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-5857974f64-8p9n7"] Dec 05 12:40:01.838006 master-0 kubenswrapper[8731]: I1205 12:40:01.837850 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-wtmp\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.838006 master-0 kubenswrapper[8731]: I1205 12:40:01.837914 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-root\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.838006 master-0 kubenswrapper[8731]: I1205 12:40:01.837949 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:40:01.838006 master-0 kubenswrapper[8731]: I1205 12:40:01.837972 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-textfile\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.838399 master-0 kubenswrapper[8731]: I1205 12:40:01.838138 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-sys\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.838399 master-0 kubenswrapper[8731]: I1205 12:40:01.838257 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-wtmp\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.838399 master-0 kubenswrapper[8731]: I1205 12:40:01.838300 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-sys\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.838570 master-0 kubenswrapper[8731]: I1205 12:40:01.838451 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-root\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.838620 master-0 kubenswrapper[8731]: I1205 12:40:01.838565 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:40:01.838620 master-0 kubenswrapper[8731]: I1205 12:40:01.838608 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-tls\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: E1205 12:40:01.838573 8731 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: I1205 12:40:01.838631 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: E1205 12:40:01.838716 8731 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-tls podName:5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76 nodeName:}" failed. No retries permitted until 2025-12-05 12:40:02.338691902 +0000 UTC m=+500.642676059 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-tls") pod "openshift-state-metrics-5974b6b869-w9l2z" (UID: "5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76") : secret "openshift-state-metrics-tls" not found Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: E1205 12:40:01.838792 8731 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: I1205 12:40:01.838802 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-volume-directive-shadow\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: E1205 12:40:01.838836 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-tls podName:60327040-f782-4cda-a32d-52a4f183073c nodeName:}" failed. No retries permitted until 2025-12-05 12:40:02.338824316 +0000 UTC m=+500.642808483 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-tls") pod "node-exporter-z2nmc" (UID: "60327040-f782-4cda-a32d-52a4f183073c") : secret "node-exporter-tls" not found Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: I1205 12:40:01.838872 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60327040-f782-4cda-a32d-52a4f183073c-metrics-client-ca\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: I1205 12:40:01.838947 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-textfile\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: I1205 12:40:01.839028 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp957\" (UniqueName: \"kubernetes.io/projected/60327040-f782-4cda-a32d-52a4f183073c-kube-api-access-zp957\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: I1205 12:40:01.839061 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 
Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: I1205 12:40:01.839087 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7"
Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: I1205 12:40:01.839115 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-metrics-client-ca\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7"
Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: I1205 12:40:01.839284 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-metrics-client-ca\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z"
Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: I1205 12:40:01.839337 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-tls\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7"
Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: I1205 12:40:01.839400 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bjs8\" (UniqueName: \"kubernetes.io/projected/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-api-access-4bjs8\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7"
Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: I1205 12:40:01.839426 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqvfm\" (UniqueName: \"kubernetes.io/projected/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-kube-api-access-nqvfm\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z"
Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: I1205 12:40:01.839533 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-volume-directive-shadow\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7"
Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: E1205 12:40:01.839661 8731 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: E1205 12:40:01.839736 8731 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-tls 
podName:4e9ba71a-d1b5-4986-babe-2c15c19f9cc2 nodeName:}" failed. No retries permitted until 2025-12-05 12:40:02.339709629 +0000 UTC m=+500.643693816 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-tls") pod "kube-state-metrics-5857974f64-8p9n7" (UID: "4e9ba71a-d1b5-4986-babe-2c15c19f9cc2") : secret "kube-state-metrics-tls" not found Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: I1205 12:40:01.840002 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60327040-f782-4cda-a32d-52a4f183073c-metrics-client-ca\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: I1205 12:40:01.840032 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-metrics-client-ca\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:40:01.840260 master-0 kubenswrapper[8731]: I1205 12:40:01.840064 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:40:01.841775 master-0 kubenswrapper[8731]: I1205 12:40:01.840321 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-metrics-client-ca\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:40:01.842934 master-0 kubenswrapper[8731]: I1205 12:40:01.842888 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:40:01.843996 master-0 kubenswrapper[8731]: I1205 12:40:01.843955 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:40:01.848519 master-0 kubenswrapper[8731]: I1205 12:40:01.848484 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " 
pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:02.041464 master-0 kubenswrapper[8731]: I1205 12:40:02.041404 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bjs8\" (UniqueName: \"kubernetes.io/projected/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-api-access-4bjs8\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:40:02.047201 master-0 kubenswrapper[8731]: I1205 12:40:02.047132 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqvfm\" (UniqueName: \"kubernetes.io/projected/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-kube-api-access-nqvfm\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:40:02.052091 master-0 kubenswrapper[8731]: I1205 12:40:02.052034 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp957\" (UniqueName: \"kubernetes.io/projected/60327040-f782-4cda-a32d-52a4f183073c-kube-api-access-zp957\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:02.354685 master-0 kubenswrapper[8731]: I1205 12:40:02.354643 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-tls\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:40:02.354803 master-0 kubenswrapper[8731]: I1205 12:40:02.354779 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:40:02.354904 master-0 kubenswrapper[8731]: I1205 12:40:02.354869 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-tls\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:02.358421 master-0 kubenswrapper[8731]: I1205 12:40:02.358384 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-tls\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:40:02.358515 master-0 kubenswrapper[8731]: I1205 12:40:02.358494 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-tls\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:02.358671 master-0 kubenswrapper[8731]: I1205 12:40:02.358611 8731 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:40:02.580086 master-0 kubenswrapper[8731]: I1205 12:40:02.580010 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:40:02.584416 master-0 kubenswrapper[8731]: I1205 12:40:02.584382 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:02.584416 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:02.584416 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:02.584416 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:02.584596 master-0 kubenswrapper[8731]: I1205 12:40:02.584445 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:02.627131 master-0 kubenswrapper[8731]: I1205 12:40:02.627070 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:40:02.657697 master-0 kubenswrapper[8731]: I1205 12:40:02.657659 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:40:03.021210 master-0 kubenswrapper[8731]: I1205 12:40:03.021137 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z"] Dec 05 12:40:03.123020 master-0 kubenswrapper[8731]: I1205 12:40:03.121814 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-5857974f64-8p9n7"] Dec 05 12:40:03.436648 master-0 kubenswrapper[8731]: I1205 12:40:03.436537 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z2nmc" event={"ID":"60327040-f782-4cda-a32d-52a4f183073c","Type":"ContainerStarted","Data":"fa27a4561538d102c835ff1b231e3510011f63fe691f54410ca3547822dc8742"} Dec 05 12:40:03.440412 master-0 kubenswrapper[8731]: I1205 12:40:03.440356 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" event={"ID":"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76","Type":"ContainerStarted","Data":"3bd2ebecae58df5657c5f3c9fc768f7de5c16550901b835bef03d24d93582761"} Dec 05 12:40:03.440516 master-0 kubenswrapper[8731]: I1205 12:40:03.440427 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" event={"ID":"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76","Type":"ContainerStarted","Data":"6e36fea081a65f76f7b44518c2dc8f9952033f7a8d733e7f0dc464daba9c2867"} Dec 05 12:40:03.440516 master-0 kubenswrapper[8731]: I1205 12:40:03.440444 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" 
event={"ID":"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76","Type":"ContainerStarted","Data":"86aa525c2c153f5cbd8c5b3603c3c0fdcde107672a7bd7aeacc117267683bb33"} Dec 05 12:40:03.441716 master-0 kubenswrapper[8731]: I1205 12:40:03.441659 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" event={"ID":"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2","Type":"ContainerStarted","Data":"fafe50d6690c2fbac658b4db9e7e7d0a871a9941f8ee2fd5f2fce340df7fd5f6"} Dec 05 12:40:03.585272 master-0 kubenswrapper[8731]: I1205 12:40:03.585126 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:03.585272 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:03.585272 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:03.585272 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:03.585272 master-0 kubenswrapper[8731]: I1205 12:40:03.585248 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:04.585879 master-0 kubenswrapper[8731]: I1205 12:40:04.585761 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:04.585879 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:04.585879 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:04.585879 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:04.585879 master-0 kubenswrapper[8731]: I1205 12:40:04.585854 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:05.456307 master-0 kubenswrapper[8731]: I1205 12:40:05.456232 8731 generic.go:334] "Generic (PLEG): container finished" podID="60327040-f782-4cda-a32d-52a4f183073c" containerID="5021d0ebd02a2ebd7ed1f4a980629b114fcca13491901c53a97391580abdd083" exitCode=0 Dec 05 12:40:05.456760 master-0 kubenswrapper[8731]: I1205 12:40:05.456322 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z2nmc" event={"ID":"60327040-f782-4cda-a32d-52a4f183073c","Type":"ContainerDied","Data":"5021d0ebd02a2ebd7ed1f4a980629b114fcca13491901c53a97391580abdd083"} Dec 05 12:40:05.458905 master-0 kubenswrapper[8731]: I1205 12:40:05.458813 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" event={"ID":"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76","Type":"ContainerStarted","Data":"26d489dc1c2c45db44d32cda974f33865505518c1962dcd743305c17c6b0f5bb"} Dec 05 12:40:05.461937 master-0 kubenswrapper[8731]: I1205 12:40:05.461865 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" 
event={"ID":"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2","Type":"ContainerStarted","Data":"acc93650e1b2b844988ef7bf696d586f1fa71b30b85d3a240aab334218886cb9"} Dec 05 12:40:05.461937 master-0 kubenswrapper[8731]: I1205 12:40:05.461926 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" event={"ID":"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2","Type":"ContainerStarted","Data":"02cbd5be726f383c3fff717aa896f2f5f9edd3ef8d5f5444366eb4982a31e95b"} Dec 05 12:40:05.461937 master-0 kubenswrapper[8731]: I1205 12:40:05.461940 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" event={"ID":"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2","Type":"ContainerStarted","Data":"6156788076b5ad36c99009735d59fdd236497a34b3d72e5f7f9563e6ec82cdb6"} Dec 05 12:40:05.495050 master-0 kubenswrapper[8731]: I1205 12:40:05.494973 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" podStartSLOduration=2.749529189 podStartE2EDuration="4.494951461s" podCreationTimestamp="2025-12-05 12:40:01 +0000 UTC" firstStartedPulling="2025-12-05 12:40:03.310661268 +0000 UTC m=+501.614645435" lastFinishedPulling="2025-12-05 12:40:05.05608354 +0000 UTC m=+503.360067707" observedRunningTime="2025-12-05 12:40:05.493323396 +0000 UTC m=+503.797307573" watchObservedRunningTime="2025-12-05 12:40:05.494951461 +0000 UTC m=+503.798935638" Dec 05 12:40:05.519204 master-0 kubenswrapper[8731]: I1205 12:40:05.518937 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" podStartSLOduration=3.3094905470000002 podStartE2EDuration="4.518864434s" podCreationTimestamp="2025-12-05 12:40:01 +0000 UTC" firstStartedPulling="2025-12-05 12:40:03.141174041 +0000 UTC m=+501.445158208" lastFinishedPulling="2025-12-05 12:40:04.350547928 +0000 UTC m=+502.654532095" observedRunningTime="2025-12-05 12:40:05.517470496 +0000 UTC m=+503.821454673" watchObservedRunningTime="2025-12-05 12:40:05.518864434 +0000 UTC m=+503.822848611" Dec 05 12:40:05.584480 master-0 kubenswrapper[8731]: I1205 12:40:05.584380 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:05.584480 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:05.584480 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:05.584480 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:05.584480 master-0 kubenswrapper[8731]: I1205 12:40:05.584463 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:06.476528 master-0 kubenswrapper[8731]: I1205 12:40:06.476447 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z2nmc" event={"ID":"60327040-f782-4cda-a32d-52a4f183073c","Type":"ContainerStarted","Data":"cdeb1ea490ea5701b2a95b0146cfc27c895466411f8fb26720209d1edb7876cb"} Dec 05 12:40:06.476528 master-0 kubenswrapper[8731]: I1205 12:40:06.476523 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z2nmc" 
event={"ID":"60327040-f782-4cda-a32d-52a4f183073c","Type":"ContainerStarted","Data":"cfb51eb4003c8f464c1a8f16b5c32a07dfa0f9b4a935f9263c448d3754ceed40"} Dec 05 12:40:06.503337 master-0 kubenswrapper[8731]: I1205 12:40:06.502925 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-z2nmc" podStartSLOduration=3.815327306 podStartE2EDuration="5.502899278s" podCreationTimestamp="2025-12-05 12:40:01 +0000 UTC" firstStartedPulling="2025-12-05 12:40:02.659557562 +0000 UTC m=+500.963541729" lastFinishedPulling="2025-12-05 12:40:04.347129534 +0000 UTC m=+502.651113701" observedRunningTime="2025-12-05 12:40:06.500161724 +0000 UTC m=+504.804145891" watchObservedRunningTime="2025-12-05 12:40:06.502899278 +0000 UTC m=+504.806883455" Dec 05 12:40:06.585456 master-0 kubenswrapper[8731]: I1205 12:40:06.585393 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:06.585456 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:06.585456 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:06.585456 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:06.586389 master-0 kubenswrapper[8731]: I1205 12:40:06.585474 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:07.584995 master-0 kubenswrapper[8731]: I1205 12:40:07.584884 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:07.584995 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:07.584995 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:07.584995 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:07.586320 master-0 kubenswrapper[8731]: I1205 12:40:07.585030 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:07.687996 master-0 kubenswrapper[8731]: I1205 12:40:07.687913 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-54c5748c8c-kqs7s"] Dec 05 12:40:07.688841 master-0 kubenswrapper[8731]: I1205 12:40:07.688802 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.692358 master-0 kubenswrapper[8731]: I1205 12:40:07.692286 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 05 12:40:07.692528 master-0 kubenswrapper[8731]: I1205 12:40:07.692362 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 05 12:40:07.692621 master-0 kubenswrapper[8731]: I1205 12:40:07.692506 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-sxj7j" Dec 05 12:40:07.692621 master-0 kubenswrapper[8731]: I1205 12:40:07.692587 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-e63soeg91on8p" Dec 05 12:40:07.692725 master-0 kubenswrapper[8731]: I1205 12:40:07.692640 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 05 12:40:07.693561 master-0 kubenswrapper[8731]: I1205 12:40:07.693456 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 05 12:40:07.707316 master-0 kubenswrapper[8731]: I1205 12:40:07.707163 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-54c5748c8c-kqs7s"] Dec 05 12:40:07.744146 master-0 kubenswrapper[8731]: I1205 12:40:07.744066 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.744146 master-0 kubenswrapper[8731]: I1205 12:40:07.744124 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnwdh\" (UniqueName: \"kubernetes.io/projected/a5338041-f213-46ef-9d81-248567ba958d-kube-api-access-bnwdh\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.744146 master-0 kubenswrapper[8731]: I1205 12:40:07.744164 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a5338041-f213-46ef-9d81-248567ba958d-audit-log\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.744547 master-0 kubenswrapper[8731]: I1205 12:40:07.744357 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-metrics-server-audit-profiles\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.744547 master-0 kubenswrapper[8731]: I1205 12:40:07.744409 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-server-tls\") pod 
\"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.744547 master-0 kubenswrapper[8731]: I1205 12:40:07.744482 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.744648 master-0 kubenswrapper[8731]: I1205 12:40:07.744547 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-client-certs\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.846148 master-0 kubenswrapper[8731]: I1205 12:40:07.845954 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.846475 master-0 kubenswrapper[8731]: I1205 12:40:07.846306 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnwdh\" (UniqueName: \"kubernetes.io/projected/a5338041-f213-46ef-9d81-248567ba958d-kube-api-access-bnwdh\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.846475 master-0 kubenswrapper[8731]: I1205 12:40:07.846399 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a5338041-f213-46ef-9d81-248567ba958d-audit-log\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.846611 master-0 kubenswrapper[8731]: I1205 12:40:07.846566 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-metrics-server-audit-profiles\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.846611 master-0 kubenswrapper[8731]: I1205 12:40:07.846605 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-server-tls\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.846715 master-0 kubenswrapper[8731]: I1205 12:40:07.846691 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-configmap-kubelet-serving-ca-bundle\") pod 
\"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.846780 master-0 kubenswrapper[8731]: I1205 12:40:07.846762 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-client-certs\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.848255 master-0 kubenswrapper[8731]: I1205 12:40:07.847115 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a5338041-f213-46ef-9d81-248567ba958d-audit-log\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.848255 master-0 kubenswrapper[8731]: I1205 12:40:07.848087 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.848672 master-0 kubenswrapper[8731]: I1205 12:40:07.848618 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-metrics-server-audit-profiles\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.849726 master-0 kubenswrapper[8731]: I1205 12:40:07.849669 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.850950 master-0 kubenswrapper[8731]: I1205 12:40:07.850878 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-client-certs\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.851128 master-0 kubenswrapper[8731]: I1205 12:40:07.851074 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-server-tls\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:07.864417 master-0 kubenswrapper[8731]: I1205 12:40:07.864316 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnwdh\" (UniqueName: \"kubernetes.io/projected/a5338041-f213-46ef-9d81-248567ba958d-kube-api-access-bnwdh\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " 
pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:08.018453 master-0 kubenswrapper[8731]: I1205 12:40:08.017746 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:08.431424 master-0 kubenswrapper[8731]: I1205 12:40:08.431367 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-54c5748c8c-kqs7s"] Dec 05 12:40:08.436829 master-0 kubenswrapper[8731]: W1205 12:40:08.436783 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5338041_f213_46ef_9d81_248567ba958d.slice/crio-1234ab8fb98aae2372aaa8236a21f36a20e417c28feeae32f634a7022c473171 WatchSource:0}: Error finding container 1234ab8fb98aae2372aaa8236a21f36a20e417c28feeae32f634a7022c473171: Status 404 returned error can't find the container with id 1234ab8fb98aae2372aaa8236a21f36a20e417c28feeae32f634a7022c473171 Dec 05 12:40:08.489167 master-0 kubenswrapper[8731]: I1205 12:40:08.489066 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" event={"ID":"a5338041-f213-46ef-9d81-248567ba958d","Type":"ContainerStarted","Data":"1234ab8fb98aae2372aaa8236a21f36a20e417c28feeae32f634a7022c473171"} Dec 05 12:40:08.584764 master-0 kubenswrapper[8731]: I1205 12:40:08.584687 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:08.584764 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:08.584764 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:08.584764 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:08.585710 master-0 kubenswrapper[8731]: I1205 12:40:08.584783 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:09.585604 master-0 kubenswrapper[8731]: I1205 12:40:09.585505 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:09.585604 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:09.585604 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:09.585604 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:09.586285 master-0 kubenswrapper[8731]: I1205 12:40:09.585610 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:10.584586 master-0 kubenswrapper[8731]: I1205 12:40:10.584506 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:10.584586 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 
12:40:10.584586 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:10.584586 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:10.584586 master-0 kubenswrapper[8731]: I1205 12:40:10.584588 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:11.526275 master-0 kubenswrapper[8731]: I1205 12:40:11.526165 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" event={"ID":"a5338041-f213-46ef-9d81-248567ba958d","Type":"ContainerStarted","Data":"8f4c10e53fa9bdea151c26cb8da907a4175dbcda2ac105b3ac1ba5c0a0254853"} Dec 05 12:40:11.553371 master-0 kubenswrapper[8731]: I1205 12:40:11.553268 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" podStartSLOduration=1.920128759 podStartE2EDuration="4.553244417s" podCreationTimestamp="2025-12-05 12:40:07 +0000 UTC" firstStartedPulling="2025-12-05 12:40:08.440517227 +0000 UTC m=+506.744501394" lastFinishedPulling="2025-12-05 12:40:11.073632865 +0000 UTC m=+509.377617052" observedRunningTime="2025-12-05 12:40:11.551326063 +0000 UTC m=+509.855310230" watchObservedRunningTime="2025-12-05 12:40:11.553244417 +0000 UTC m=+509.857228604" Dec 05 12:40:11.585696 master-0 kubenswrapper[8731]: I1205 12:40:11.585628 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:11.585696 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:11.585696 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:11.585696 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:11.586259 master-0 kubenswrapper[8731]: I1205 12:40:11.585711 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:12.585010 master-0 kubenswrapper[8731]: I1205 12:40:12.584932 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:12.585010 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:12.585010 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:12.585010 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:12.585671 master-0 kubenswrapper[8731]: I1205 12:40:12.585016 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:13.585151 master-0 kubenswrapper[8731]: I1205 12:40:13.585051 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Dec 05 12:40:13.585151 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:13.585151 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:13.585151 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:13.586441 master-0 kubenswrapper[8731]: I1205 12:40:13.585167 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:14.584871 master-0 kubenswrapper[8731]: I1205 12:40:14.584792 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:14.584871 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:14.584871 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:14.584871 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:14.585552 master-0 kubenswrapper[8731]: I1205 12:40:14.584879 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:15.585719 master-0 kubenswrapper[8731]: I1205 12:40:15.585651 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:15.585719 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:15.585719 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:15.585719 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:15.586769 master-0 kubenswrapper[8731]: I1205 12:40:15.585745 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:16.585428 master-0 kubenswrapper[8731]: I1205 12:40:16.585282 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:16.585428 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:16.585428 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:16.585428 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:16.586250 master-0 kubenswrapper[8731]: I1205 12:40:16.585464 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:17.585344 master-0 kubenswrapper[8731]: I1205 12:40:17.585262 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Dec 05 12:40:17.585344 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:17.585344 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:17.585344 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:17.585753 master-0 kubenswrapper[8731]: I1205 12:40:17.585350 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:18.584311 master-0 kubenswrapper[8731]: I1205 12:40:18.584241 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:18.584311 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:18.584311 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:18.584311 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:18.584863 master-0 kubenswrapper[8731]: I1205 12:40:18.584338 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:19.584738 master-0 kubenswrapper[8731]: I1205 12:40:19.584651 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:19.584738 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:19.584738 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:19.584738 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:19.586034 master-0 kubenswrapper[8731]: I1205 12:40:19.584746 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:20.585611 master-0 kubenswrapper[8731]: I1205 12:40:20.585478 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:20.585611 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:20.585611 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:20.585611 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:20.586290 master-0 kubenswrapper[8731]: I1205 12:40:20.585666 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:21.584942 master-0 kubenswrapper[8731]: I1205 12:40:21.584867 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http 
failed: reason withheld Dec 05 12:40:21.584942 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:21.584942 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:21.584942 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:21.585468 master-0 kubenswrapper[8731]: I1205 12:40:21.584951 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:22.584571 master-0 kubenswrapper[8731]: I1205 12:40:22.584489 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:22.584571 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:22.584571 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:22.584571 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:22.585311 master-0 kubenswrapper[8731]: I1205 12:40:22.584589 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:23.585349 master-0 kubenswrapper[8731]: I1205 12:40:23.585257 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:23.585349 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:23.585349 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:23.585349 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:23.585923 master-0 kubenswrapper[8731]: I1205 12:40:23.585372 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:24.584923 master-0 kubenswrapper[8731]: I1205 12:40:24.584789 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:24.584923 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:24.584923 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:24.584923 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:24.584923 master-0 kubenswrapper[8731]: I1205 12:40:24.584901 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:25.584926 master-0 kubenswrapper[8731]: I1205 12:40:25.584829 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:25.584926 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:25.584926 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:25.584926 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:25.584926 master-0 kubenswrapper[8731]: I1205 12:40:25.584926 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:26.585965 master-0 kubenswrapper[8731]: I1205 12:40:26.585838 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:26.585965 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:26.585965 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:26.585965 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:26.585965 master-0 kubenswrapper[8731]: I1205 12:40:26.585937 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:27.585707 master-0 kubenswrapper[8731]: I1205 12:40:27.585635 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:27.585707 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:27.585707 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:27.585707 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:27.586301 master-0 kubenswrapper[8731]: I1205 12:40:27.586255 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:28.019080 master-0 kubenswrapper[8731]: I1205 12:40:28.018998 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:28.019080 master-0 kubenswrapper[8731]: I1205 12:40:28.019092 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:28.586641 master-0 kubenswrapper[8731]: I1205 12:40:28.586537 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:28.586641 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:28.586641 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:28.586641 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:28.587094 master-0 kubenswrapper[8731]: I1205 12:40:28.586665 8731 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:29.585844 master-0 kubenswrapper[8731]: I1205 12:40:29.585729 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:29.585844 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:29.585844 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:29.585844 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:29.585844 master-0 kubenswrapper[8731]: I1205 12:40:29.585824 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:30.585921 master-0 kubenswrapper[8731]: I1205 12:40:30.585799 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:30.585921 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:30.585921 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:30.585921 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:30.585921 master-0 kubenswrapper[8731]: I1205 12:40:30.585913 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:31.585574 master-0 kubenswrapper[8731]: I1205 12:40:31.585475 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:31.585574 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:31.585574 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:31.585574 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:31.586590 master-0 kubenswrapper[8731]: I1205 12:40:31.585574 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:32.584114 master-0 kubenswrapper[8731]: I1205 12:40:32.584009 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:32.584114 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:32.584114 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:32.584114 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:32.584508 master-0 kubenswrapper[8731]: I1205 12:40:32.584125 8731 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:33.585464 master-0 kubenswrapper[8731]: I1205 12:40:33.585356 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:33.585464 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:33.585464 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:33.585464 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:33.585464 master-0 kubenswrapper[8731]: I1205 12:40:33.585452 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:34.584483 master-0 kubenswrapper[8731]: I1205 12:40:34.584409 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:34.584483 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:34.584483 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:34.584483 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:34.584778 master-0 kubenswrapper[8731]: I1205 12:40:34.584515 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:35.584456 master-0 kubenswrapper[8731]: I1205 12:40:35.584382 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:35.584456 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:35.584456 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:35.584456 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:35.585331 master-0 kubenswrapper[8731]: I1205 12:40:35.585291 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:36.585575 master-0 kubenswrapper[8731]: I1205 12:40:36.585476 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:36.585575 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:36.585575 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:36.585575 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:36.586763 master-0 kubenswrapper[8731]: I1205 12:40:36.585581 8731 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:37.584308 master-0 kubenswrapper[8731]: I1205 12:40:37.584265 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:37.584308 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:37.584308 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:37.584308 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:37.584784 master-0 kubenswrapper[8731]: I1205 12:40:37.584752 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:38.585624 master-0 kubenswrapper[8731]: I1205 12:40:38.585528 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:38.585624 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:38.585624 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:38.585624 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:38.586398 master-0 kubenswrapper[8731]: I1205 12:40:38.585673 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:39.585016 master-0 kubenswrapper[8731]: I1205 12:40:39.584920 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:39.585016 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:39.585016 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:39.585016 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:39.585304 master-0 kubenswrapper[8731]: I1205 12:40:39.585033 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:40.584410 master-0 kubenswrapper[8731]: I1205 12:40:40.584347 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:40.584410 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:40.584410 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:40.584410 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:40.585030 master-0 kubenswrapper[8731]: I1205 12:40:40.584443 8731 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:41.585240 master-0 kubenswrapper[8731]: I1205 12:40:41.585171 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:41.585240 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:41.585240 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:41.585240 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:41.585875 master-0 kubenswrapper[8731]: I1205 12:40:41.585254 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:42.585063 master-0 kubenswrapper[8731]: I1205 12:40:42.584964 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:42.585063 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:42.585063 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:42.585063 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:42.586317 master-0 kubenswrapper[8731]: I1205 12:40:42.585087 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:43.585215 master-0 kubenswrapper[8731]: I1205 12:40:43.585117 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:43.585215 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:43.585215 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:43.585215 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:43.586152 master-0 kubenswrapper[8731]: I1205 12:40:43.585252 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:44.584963 master-0 kubenswrapper[8731]: I1205 12:40:44.584905 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:44.584963 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:44.584963 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:44.584963 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:44.585537 master-0 kubenswrapper[8731]: I1205 12:40:44.585496 8731 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:45.584277 master-0 kubenswrapper[8731]: I1205 12:40:45.584128 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:45.584277 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:45.584277 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:45.584277 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:45.584277 master-0 kubenswrapper[8731]: I1205 12:40:45.584274 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:46.584872 master-0 kubenswrapper[8731]: I1205 12:40:46.584799 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:46.584872 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:46.584872 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:46.584872 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:46.584872 master-0 kubenswrapper[8731]: I1205 12:40:46.584865 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:47.584159 master-0 kubenswrapper[8731]: I1205 12:40:47.584102 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:47.584159 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:47.584159 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:47.584159 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:47.584465 master-0 kubenswrapper[8731]: I1205 12:40:47.584173 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:48.024637 master-0 kubenswrapper[8731]: I1205 12:40:48.024552 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:48.032450 master-0 kubenswrapper[8731]: I1205 12:40:48.032377 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:40:48.585080 master-0 kubenswrapper[8731]: I1205 12:40:48.584974 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:48.585080 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:48.585080 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:48.585080 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:48.585080 master-0 kubenswrapper[8731]: I1205 12:40:48.585073 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:49.584606 master-0 kubenswrapper[8731]: I1205 12:40:49.584515 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:49.584606 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:49.584606 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:49.584606 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:49.584606 master-0 kubenswrapper[8731]: I1205 12:40:49.584611 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:50.585480 master-0 kubenswrapper[8731]: I1205 12:40:50.585386 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:50.585480 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:50.585480 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:50.585480 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:50.586774 master-0 kubenswrapper[8731]: I1205 12:40:50.585525 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:51.585133 master-0 kubenswrapper[8731]: I1205 12:40:51.585067 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:51.585133 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:51.585133 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:51.585133 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:51.585452 master-0 kubenswrapper[8731]: I1205 12:40:51.585176 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:52.584016 master-0 kubenswrapper[8731]: I1205 12:40:52.583929 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:52.584016 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:52.584016 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:52.584016 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:52.584622 master-0 kubenswrapper[8731]: I1205 12:40:52.584030 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:53.585659 master-0 kubenswrapper[8731]: I1205 12:40:53.585577 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:53.585659 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:53.585659 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:53.585659 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:53.586493 master-0 kubenswrapper[8731]: I1205 12:40:53.585670 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:54.586283 master-0 kubenswrapper[8731]: I1205 12:40:54.586215 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:54.586283 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:54.586283 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:54.586283 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:54.586978 master-0 kubenswrapper[8731]: I1205 12:40:54.586307 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:55.585253 master-0 kubenswrapper[8731]: I1205 12:40:55.585100 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:40:55.585253 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:40:55.585253 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:40:55.585253 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:40:55.585253 master-0 kubenswrapper[8731]: I1205 12:40:55.585248 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:40:56.584910 master-0 kubenswrapper[8731]: I1205 12:40:56.584829 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 12:40:56.584910 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld
Dec 05 12:40:56.584910 master-0 kubenswrapper[8731]: [+]process-running ok
Dec 05 12:40:56.584910 master-0 kubenswrapper[8731]: healthz check failed
Dec 05 12:40:56.585852 master-0 kubenswrapper[8731]: I1205 12:40:56.584926 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 12:40:57.584813 master-0 kubenswrapper[8731]: I1205 12:40:57.584740 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 12:40:57.584813 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld
Dec 05 12:40:57.584813 master-0 kubenswrapper[8731]: [+]process-running ok
Dec 05 12:40:57.584813 master-0 kubenswrapper[8731]: healthz check failed
Dec 05 12:40:57.585449 master-0 kubenswrapper[8731]: I1205 12:40:57.584829 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 12:40:57.585449 master-0 kubenswrapper[8731]: I1205 12:40:57.584888 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-dzlmb"
Dec 05 12:40:57.585576 master-0 kubenswrapper[8731]: I1205 12:40:57.585540 8731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"8fdea4402ae8cab53b0ad7f0ecba9b1899f62586699c403d4a3f309c69f3a64e"} pod="openshift-ingress/router-default-5465c8b4db-dzlmb" containerMessage="Container router failed startup probe, will be restarted"
Dec 05 12:40:57.585620 master-0 kubenswrapper[8731]: I1205 12:40:57.585588 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" containerID="cri-o://8fdea4402ae8cab53b0ad7f0ecba9b1899f62586699c403d4a3f309c69f3a64e" gracePeriod=3600
Dec 05 12:41:44.226549 master-0 kubenswrapper[8731]: I1205 12:41:44.226444 8731 generic.go:334] "Generic (PLEG): container finished" podID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerID="8fdea4402ae8cab53b0ad7f0ecba9b1899f62586699c403d4a3f309c69f3a64e" exitCode=0
Dec 05 12:41:44.226549 master-0 kubenswrapper[8731]: I1205 12:41:44.226543 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" event={"ID":"20a72c8b-0f12-446b-8a42-53d98864c8f8","Type":"ContainerDied","Data":"8fdea4402ae8cab53b0ad7f0ecba9b1899f62586699c403d4a3f309c69f3a64e"}
Dec 05 12:41:44.227346 master-0 kubenswrapper[8731]: I1205 12:41:44.226583 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" event={"ID":"20a72c8b-0f12-446b-8a42-53d98864c8f8","Type":"ContainerStarted","Data":"8fbcd680e597e847a58340dd3596ea3cc035b2de307cd72ebb1304a012ac892d"}
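[Editor's note, not part of the journal: the entries above show the kubelet declaring the router startup probe unhealthy after repeated HTTP 500 responses and restarting the container. As a minimal sketch under assumptions, the Go program below replays the probe's GET by hand so the failing healthz checks ("[-]backend-http", "[-]has-synced") can be inspected directly. The URL (port 1936, path /healthz/ready) is an assumption about the router-default startupProbe and should be read from the Deployment spec if it differs.]

// probereplay.go - illustrative only; replays the HTTP GET a startup probe
// performs and prints the status code and healthz-style body seen in the log.
package main

import (
	"fmt"
	"io"
	"net/http"
	"os"
	"time"
)

func main() {
	// Assumed probe endpoint for the router pod; verify against the
	// startupProbe in the router-default Deployment.
	url := "http://localhost:1936/healthz/ready"
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		fmt.Fprintln(os.Stderr, "probe error:", err)
		os.Exit(1)
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	// The kubelet counts any status outside 200-399 as a probe failure,
	// which is why "statuscode: 500" is logged as probeResult="failure".
	fmt.Println("status:", resp.StatusCode)
	fmt.Println(string(body))
}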
Dec 05 12:41:44.231602 master-0 kubenswrapper[8731]: I1205 12:41:44.231559 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/2.log"
Dec 05 12:41:44.240019 master-0 kubenswrapper[8731]: I1205 12:41:44.239940 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/1.log"
Dec 05 12:41:44.241251 master-0 kubenswrapper[8731]: I1205 12:41:44.241103 8731 generic.go:334] "Generic (PLEG): container finished" podID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" containerID="6f8055fdf3cea411e4a76860001f402a5742a0c41d34f8fa2265a84c73970742" exitCode=1
Dec 05 12:41:44.241251 master-0 kubenswrapper[8731]: I1205 12:41:44.241162 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" event={"ID":"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7","Type":"ContainerDied","Data":"6f8055fdf3cea411e4a76860001f402a5742a0c41d34f8fa2265a84c73970742"}
Dec 05 12:41:44.241588 master-0 kubenswrapper[8731]: I1205 12:41:44.241319 8731 scope.go:117] "RemoveContainer" containerID="35aadb9bbeac01f2246017a0a2cf81423a3d53a1924e285f6675485164555604"
Dec 05 12:41:44.242037 master-0 kubenswrapper[8731]: I1205 12:41:44.242002 8731 scope.go:117] "RemoveContainer" containerID="6f8055fdf3cea411e4a76860001f402a5742a0c41d34f8fa2265a84c73970742"
Dec 05 12:41:44.242418 master-0 kubenswrapper[8731]: E1205 12:41:44.242385 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-7xrk6_openshift-ingress-operator(a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" podUID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7"
Dec 05 12:41:44.582473 master-0 kubenswrapper[8731]: I1205 12:41:44.582277 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-dzlmb"
Dec 05 12:41:44.586328 master-0 kubenswrapper[8731]: I1205 12:41:44.586229 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 12:41:44.586328 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld
Dec 05 12:41:44.586328 master-0 kubenswrapper[8731]: [+]process-running ok
Dec 05 12:41:44.586328 master-0 kubenswrapper[8731]: healthz check failed
Dec 05 12:41:44.586605 master-0 kubenswrapper[8731]: I1205 12:41:44.586373 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 12:41:45.249986 master-0 kubenswrapper[8731]: I1205 12:41:45.249899 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/2.log"
Dec 05 12:41:45.586306 master-0 kubenswrapper[8731]: I1205 12:41:45.585481 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:41:45.586306 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:41:45.586306 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:41:45.586306 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:41:45.586306 master-0 kubenswrapper[8731]: I1205 12:41:45.585596 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:41:46.585637 master-0 kubenswrapper[8731]: I1205 12:41:46.585527 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:41:46.585637 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:41:46.585637 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:41:46.585637 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:41:46.585637 master-0 kubenswrapper[8731]: I1205 12:41:46.585623 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:41:47.585255 master-0 kubenswrapper[8731]: I1205 12:41:47.585147 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:41:47.585255 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:41:47.585255 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:41:47.585255 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:41:47.586331 master-0 kubenswrapper[8731]: I1205 12:41:47.585276 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:41:48.585211 master-0 kubenswrapper[8731]: I1205 12:41:48.585121 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:41:48.585211 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:41:48.585211 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:41:48.585211 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:41:48.586209 master-0 kubenswrapper[8731]: I1205 12:41:48.585224 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:41:49.585079 master-0 kubenswrapper[8731]: I1205 12:41:49.584970 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:41:49.585079 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:41:49.585079 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:41:49.585079 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:41:49.585079 master-0 kubenswrapper[8731]: I1205 12:41:49.585062 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:41:50.585096 master-0 kubenswrapper[8731]: I1205 12:41:50.584991 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:41:50.585096 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:41:50.585096 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:41:50.585096 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:41:50.585096 master-0 kubenswrapper[8731]: I1205 12:41:50.585084 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:41:51.585955 master-0 kubenswrapper[8731]: I1205 12:41:51.585883 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:41:51.585955 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:41:51.585955 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:41:51.585955 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:41:51.586554 master-0 kubenswrapper[8731]: I1205 12:41:51.585984 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:41:52.586701 master-0 kubenswrapper[8731]: I1205 12:41:52.585565 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:41:52.586701 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:41:52.586701 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:41:52.586701 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:41:52.586701 master-0 kubenswrapper[8731]: I1205 12:41:52.585655 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:41:53.581991 master-0 kubenswrapper[8731]: I1205 12:41:53.581877 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:41:53.584731 master-0 kubenswrapper[8731]: I1205 12:41:53.584688 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:41:53.584731 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:41:53.584731 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:41:53.584731 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:41:53.585048 master-0 kubenswrapper[8731]: I1205 12:41:53.584750 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:41:54.585406 master-0 kubenswrapper[8731]: I1205 12:41:54.585318 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:41:54.585406 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:41:54.585406 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:41:54.585406 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:41:54.586604 master-0 kubenswrapper[8731]: I1205 12:41:54.585411 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:41:54.934888 master-0 kubenswrapper[8731]: I1205 12:41:54.934725 8731 scope.go:117] "RemoveContainer" containerID="6f8055fdf3cea411e4a76860001f402a5742a0c41d34f8fa2265a84c73970742" Dec 05 12:41:54.935312 master-0 kubenswrapper[8731]: E1205 12:41:54.935257 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-7xrk6_openshift-ingress-operator(a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" podUID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" Dec 05 12:41:55.585549 master-0 kubenswrapper[8731]: I1205 12:41:55.585472 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:41:55.585549 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:41:55.585549 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:41:55.585549 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:41:55.586406 master-0 kubenswrapper[8731]: I1205 12:41:55.585550 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:41:56.585752 master-0 kubenswrapper[8731]: I1205 12:41:56.585629 8731 patch_prober.go:28] interesting 
pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:41:56.585752 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:41:56.585752 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:41:56.585752 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:41:56.586776 master-0 kubenswrapper[8731]: I1205 12:41:56.585768 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:41:57.585419 master-0 kubenswrapper[8731]: I1205 12:41:57.585336 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:41:57.585419 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:41:57.585419 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:41:57.585419 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:41:57.589780 master-0 kubenswrapper[8731]: I1205 12:41:57.585435 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:41:58.585520 master-0 kubenswrapper[8731]: I1205 12:41:58.585442 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:41:58.585520 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:41:58.585520 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:41:58.585520 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:41:58.586072 master-0 kubenswrapper[8731]: I1205 12:41:58.585540 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:41:59.584822 master-0 kubenswrapper[8731]: I1205 12:41:59.584722 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:41:59.584822 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:41:59.584822 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:41:59.584822 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:41:59.585568 master-0 kubenswrapper[8731]: I1205 12:41:59.584842 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:00.585067 master-0 kubenswrapper[8731]: I1205 12:42:00.584961 8731 patch_prober.go:28] 
interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:00.585067 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:00.585067 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:00.585067 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:00.585067 master-0 kubenswrapper[8731]: I1205 12:42:00.585059 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:01.585711 master-0 kubenswrapper[8731]: I1205 12:42:01.585588 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:01.585711 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:01.585711 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:01.585711 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:01.586997 master-0 kubenswrapper[8731]: I1205 12:42:01.585710 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:02.585630 master-0 kubenswrapper[8731]: I1205 12:42:02.585524 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:02.585630 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:02.585630 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:02.585630 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:02.585630 master-0 kubenswrapper[8731]: I1205 12:42:02.585618 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:03.586403 master-0 kubenswrapper[8731]: I1205 12:42:03.586075 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:03.586403 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:03.586403 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:03.586403 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:03.586403 master-0 kubenswrapper[8731]: I1205 12:42:03.586252 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:04.584726 master-0 kubenswrapper[8731]: I1205 12:42:04.584646 8731 
patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 12:42:04.584726 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld
Dec 05 12:42:04.584726 master-0 kubenswrapper[8731]: [+]process-running ok
Dec 05 12:42:04.584726 master-0 kubenswrapper[8731]: healthz check failed
Dec 05 12:42:04.585093 master-0 kubenswrapper[8731]: I1205 12:42:04.584732 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 12:42:05.584496 master-0 kubenswrapper[8731]: I1205 12:42:05.584406 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 12:42:05.584496 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld
Dec 05 12:42:05.584496 master-0 kubenswrapper[8731]: [+]process-running ok
Dec 05 12:42:05.584496 master-0 kubenswrapper[8731]: healthz check failed
Dec 05 12:42:05.584496 master-0 kubenswrapper[8731]: I1205 12:42:05.584486 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 12:42:06.585108 master-0 kubenswrapper[8731]: I1205 12:42:06.584992 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 12:42:06.585108 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld
Dec 05 12:42:06.585108 master-0 kubenswrapper[8731]: [+]process-running ok
Dec 05 12:42:06.585108 master-0 kubenswrapper[8731]: healthz check failed
Dec 05 12:42:06.585108 master-0 kubenswrapper[8731]: I1205 12:42:06.585071 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 12:42:07.584782 master-0 kubenswrapper[8731]: I1205 12:42:07.584695 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 12:42:07.584782 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld
Dec 05 12:42:07.584782 master-0 kubenswrapper[8731]: [+]process-running ok
Dec 05 12:42:07.584782 master-0 kubenswrapper[8731]: healthz check failed
Dec 05 12:42:07.584782 master-0 kubenswrapper[8731]: I1205 12:42:07.584765 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
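[Editor's note, not part of the journal: the "back-off 20s" errors for ingress-operator above come from the kubelet's crash-loop back-off; the restart finally proceeds at 12:42:07-08 below once the delay has elapsed. As a hedged illustration of the documented default policy (10s initial delay, doubling per consecutive failure, capped at five minutes), not the kubelet's actual implementation, the Go sketch below prints the delay sequence; the observed 20s is consistent with a second consecutive failure under this policy.]

// backoff.go - sketch of the documented crash-loop back-off progression.
package main

import (
	"fmt"
	"time"
)

// crashLoopDelay returns the restart delay after `restarts` consecutive
// failures, following the documented defaults (10s doubling, 5m cap).
func crashLoopDelay(restarts int) time.Duration {
	delay := 10 * time.Second
	for i := 1; i < restarts; i++ {
		delay *= 2
		if delay >= 5*time.Minute {
			return 5 * time.Minute
		}
	}
	return delay
}

func main() {
	for r := 1; r <= 6; r++ {
		// Prints 10s, 20s, 40s, 1m20s, 2m40s, 5m0s.
		fmt.Printf("failure #%d -> back-off %s\n", r, crashLoopDelay(r))
	}
}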
Dec 05 12:42:07.935525 master-0 kubenswrapper[8731]: I1205 12:42:07.935217 8731 scope.go:117] "RemoveContainer" containerID="6f8055fdf3cea411e4a76860001f402a5742a0c41d34f8fa2265a84c73970742"
Dec 05 12:42:08.417711 master-0 kubenswrapper[8731]: I1205 12:42:08.417624 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/2.log"
Dec 05 12:42:08.418316 master-0 kubenswrapper[8731]: I1205 12:42:08.418275 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" event={"ID":"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7","Type":"ContainerStarted","Data":"9a04a03647acf84231bb505b6acd2c588670eb0bd70e0221386d9b53a3261e61"}
Dec 05 12:42:08.584719 master-0 kubenswrapper[8731]: I1205 12:42:08.584574 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 12:42:08.584719 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld
Dec 05 12:42:08.584719 master-0 kubenswrapper[8731]: [+]process-running ok
Dec 05 12:42:08.584719 master-0 kubenswrapper[8731]: healthz check failed
Dec 05 12:42:08.584719 master-0 kubenswrapper[8731]: I1205 12:42:08.584661 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 12:42:09.584419 master-0 kubenswrapper[8731]: I1205 12:42:09.584160 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 12:42:09.584419 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld
Dec 05 12:42:09.584419 master-0 kubenswrapper[8731]: [+]process-running ok
Dec 05 12:42:09.584419 master-0 kubenswrapper[8731]: healthz check failed
Dec 05 12:42:09.585535 master-0 kubenswrapper[8731]: I1205 12:42:09.584449 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 12:42:10.585230 master-0 kubenswrapper[8731]: I1205 12:42:10.585074 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 05 12:42:10.585230 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld
Dec 05 12:42:10.585230 master-0 kubenswrapper[8731]: [+]process-running ok
Dec 05 12:42:10.585230 master-0 kubenswrapper[8731]: healthz check failed
Dec 05 12:42:10.585230 master-0 kubenswrapper[8731]: I1205 12:42:10.585168 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 05 12:42:11.585696 master-0 kubenswrapper[8731]: I1205 12:42:11.585305 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:11.585696 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:11.585696 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:11.585696 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:11.585696 master-0 kubenswrapper[8731]: I1205 12:42:11.585381 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:12.585209 master-0 kubenswrapper[8731]: I1205 12:42:12.585118 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:12.585209 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:12.585209 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:12.585209 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:12.585625 master-0 kubenswrapper[8731]: I1205 12:42:12.585261 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:13.584980 master-0 kubenswrapper[8731]: I1205 12:42:13.584885 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:13.584980 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:13.584980 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:13.584980 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:13.585841 master-0 kubenswrapper[8731]: I1205 12:42:13.585260 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:14.586703 master-0 kubenswrapper[8731]: I1205 12:42:14.585409 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:14.586703 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:14.586703 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:14.586703 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:14.587868 master-0 kubenswrapper[8731]: I1205 12:42:14.586704 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:15.585210 master-0 kubenswrapper[8731]: I1205 12:42:15.585114 8731 patch_prober.go:28] interesting 
pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:15.585210 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:15.585210 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:15.585210 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:15.585671 master-0 kubenswrapper[8731]: I1205 12:42:15.585247 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:16.585768 master-0 kubenswrapper[8731]: I1205 12:42:16.585686 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:16.585768 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:16.585768 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:16.585768 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:16.586769 master-0 kubenswrapper[8731]: I1205 12:42:16.585790 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:17.584881 master-0 kubenswrapper[8731]: I1205 12:42:17.584785 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:17.584881 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:17.584881 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:17.584881 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:17.585225 master-0 kubenswrapper[8731]: I1205 12:42:17.584915 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:18.585184 master-0 kubenswrapper[8731]: I1205 12:42:18.585069 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:18.585184 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:18.585184 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:18.585184 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:18.586498 master-0 kubenswrapper[8731]: I1205 12:42:18.585259 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:19.584602 master-0 kubenswrapper[8731]: I1205 12:42:19.584534 8731 patch_prober.go:28] 
interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:19.584602 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:19.584602 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:19.584602 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:19.585128 master-0 kubenswrapper[8731]: I1205 12:42:19.584618 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:20.584958 master-0 kubenswrapper[8731]: I1205 12:42:20.584870 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:20.584958 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:20.584958 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:20.584958 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:20.585932 master-0 kubenswrapper[8731]: I1205 12:42:20.584976 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:21.584989 master-0 kubenswrapper[8731]: I1205 12:42:21.584889 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:21.584989 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:21.584989 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:21.584989 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:21.584989 master-0 kubenswrapper[8731]: I1205 12:42:21.584957 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:22.585673 master-0 kubenswrapper[8731]: I1205 12:42:22.585600 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:22.585673 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:22.585673 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:22.585673 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:22.586425 master-0 kubenswrapper[8731]: I1205 12:42:22.585715 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:23.585325 master-0 kubenswrapper[8731]: I1205 12:42:23.585239 8731 
patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:23.585325 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:23.585325 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:23.585325 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:23.585690 master-0 kubenswrapper[8731]: I1205 12:42:23.585425 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:24.585602 master-0 kubenswrapper[8731]: I1205 12:42:24.585514 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:24.585602 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:24.585602 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:24.585602 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:24.586799 master-0 kubenswrapper[8731]: I1205 12:42:24.585629 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:25.584839 master-0 kubenswrapper[8731]: I1205 12:42:25.584760 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:25.584839 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:25.584839 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:25.584839 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:25.585292 master-0 kubenswrapper[8731]: I1205 12:42:25.584850 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:26.585543 master-0 kubenswrapper[8731]: I1205 12:42:26.585433 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:26.585543 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:26.585543 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:26.585543 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:26.586650 master-0 kubenswrapper[8731]: I1205 12:42:26.585549 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:27.585464 master-0 kubenswrapper[8731]: I1205 12:42:27.585370 
8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:27.585464 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:27.585464 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:27.585464 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:27.585464 master-0 kubenswrapper[8731]: I1205 12:42:27.585448 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:28.587359 master-0 kubenswrapper[8731]: I1205 12:42:28.587262 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:28.587359 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:28.587359 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:28.587359 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:28.588053 master-0 kubenswrapper[8731]: I1205 12:42:28.587385 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:29.584628 master-0 kubenswrapper[8731]: I1205 12:42:29.584552 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:29.584628 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:29.584628 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:29.584628 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:29.584628 master-0 kubenswrapper[8731]: I1205 12:42:29.584615 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:30.584481 master-0 kubenswrapper[8731]: I1205 12:42:30.584409 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:30.584481 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:30.584481 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:30.584481 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:30.585359 master-0 kubenswrapper[8731]: I1205 12:42:30.584495 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:31.584889 master-0 kubenswrapper[8731]: I1205 
12:42:31.584806 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:31.584889 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:31.584889 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:31.584889 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:31.584889 master-0 kubenswrapper[8731]: I1205 12:42:31.584883 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:32.584974 master-0 kubenswrapper[8731]: I1205 12:42:32.584900 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:32.584974 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:32.584974 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:32.584974 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:32.586306 master-0 kubenswrapper[8731]: I1205 12:42:32.584995 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:33.585226 master-0 kubenswrapper[8731]: I1205 12:42:33.585127 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:33.585226 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:33.585226 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:33.585226 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:33.586214 master-0 kubenswrapper[8731]: I1205 12:42:33.585250 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:34.586030 master-0 kubenswrapper[8731]: I1205 12:42:34.585927 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:34.586030 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:34.586030 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:34.586030 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:34.587128 master-0 kubenswrapper[8731]: I1205 12:42:34.586051 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:35.584785 master-0 kubenswrapper[8731]: 
I1205 12:42:35.584723 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:35.584785 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:35.584785 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:35.584785 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:35.585210 master-0 kubenswrapper[8731]: I1205 12:42:35.585157 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:36.584700 master-0 kubenswrapper[8731]: I1205 12:42:36.584611 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:36.584700 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:36.584700 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:36.584700 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:36.585692 master-0 kubenswrapper[8731]: I1205 12:42:36.584710 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:37.585349 master-0 kubenswrapper[8731]: I1205 12:42:37.585251 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:37.585349 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:37.585349 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:37.585349 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:37.586371 master-0 kubenswrapper[8731]: I1205 12:42:37.585377 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:38.585116 master-0 kubenswrapper[8731]: I1205 12:42:38.585031 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:38.585116 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:38.585116 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:38.585116 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:38.586129 master-0 kubenswrapper[8731]: I1205 12:42:38.585142 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:39.586096 master-0 
kubenswrapper[8731]: I1205 12:42:39.585994 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:39.586096 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:39.586096 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:39.586096 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:39.586924 master-0 kubenswrapper[8731]: I1205 12:42:39.586122 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:40.584988 master-0 kubenswrapper[8731]: I1205 12:42:40.584871 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:40.584988 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:40.584988 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:40.584988 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:40.584988 master-0 kubenswrapper[8731]: I1205 12:42:40.584950 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:41.585605 master-0 kubenswrapper[8731]: I1205 12:42:41.585512 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:41.585605 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:41.585605 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:41.585605 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:41.586875 master-0 kubenswrapper[8731]: I1205 12:42:41.586807 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:42.585053 master-0 kubenswrapper[8731]: I1205 12:42:42.584945 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:42.585053 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:42.585053 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:42.585053 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:42.585053 master-0 kubenswrapper[8731]: I1205 12:42:42.585039 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:43.585732 
master-0 kubenswrapper[8731]: I1205 12:42:43.585634 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:43.585732 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:43.585732 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:43.585732 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:43.586334 master-0 kubenswrapper[8731]: I1205 12:42:43.585751 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:44.585807 master-0 kubenswrapper[8731]: I1205 12:42:44.585686 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:44.585807 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:44.585807 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:44.585807 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:44.586607 master-0 kubenswrapper[8731]: I1205 12:42:44.585879 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:45.585601 master-0 kubenswrapper[8731]: I1205 12:42:45.585517 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:45.585601 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:45.585601 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:45.585601 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:45.586440 master-0 kubenswrapper[8731]: I1205 12:42:45.585611 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:46.584752 master-0 kubenswrapper[8731]: I1205 12:42:46.584637 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:46.584752 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:46.584752 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:46.584752 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:46.584752 master-0 kubenswrapper[8731]: I1205 12:42:46.584748 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 
12:42:47.585201 master-0 kubenswrapper[8731]: I1205 12:42:47.585084 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:47.585201 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:47.585201 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:47.585201 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:47.585990 master-0 kubenswrapper[8731]: I1205 12:42:47.585215 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:48.584734 master-0 kubenswrapper[8731]: I1205 12:42:48.584649 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:48.584734 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:48.584734 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:48.584734 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:48.585211 master-0 kubenswrapper[8731]: I1205 12:42:48.584795 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:49.586953 master-0 kubenswrapper[8731]: I1205 12:42:49.586856 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:49.586953 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:49.586953 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:49.586953 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:49.587825 master-0 kubenswrapper[8731]: I1205 12:42:49.586963 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:50.584373 master-0 kubenswrapper[8731]: I1205 12:42:50.584303 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:50.584373 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:50.584373 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:50.584373 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:50.584727 master-0 kubenswrapper[8731]: I1205 12:42:50.584382 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" 
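The entries above and below are the same Startup probe failing once per second: the router's healthz endpoint answers HTTP 500 while its backend-http and has-synced sub-checks are still failing, and each attempt produces one patch_prober.go block plus one prober.go:107 "Probe failed" line. For dumps like this, a small helper that counts the prober.go entries per pod, probe type, and output makes the repetition easy to summarize. The sketch below is a minimal, hypothetical triage aid under the assumption that the journal was saved to a text file (the name kubelet.log and the regex are illustrative, not anything the kubelet ships).

#!/usr/bin/env python3
"""Summarize kubelet "Probe failed" events from a journald text dump.

Minimal sketch; assumes the journal was captured with something like
`journalctl -u kubelet > kubelet.log` (the path is hypothetical).
"""
import re
from collections import Counter

# Matches prober.go lines such as:
#   ... prober.go:107] "Probe failed" probeType="Startup"
#   pod="openshift-ingress/router-default-..." ...
#   output="HTTP probe failed with statuscode: 500"
PROBE_FAILED = re.compile(
    r'prober\.go:\d+\]\s+"Probe failed"\s+probeType="(?P<type>[^"]+)"\s+'
    r'pod="(?P<pod>[^"]+)".*?output="(?P<output>[^"]*)"'
)

def summarize(path: str) -> Counter:
    """Count probe failures keyed by (pod, probe type, probe output)."""
    counts: Counter = Counter()
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = PROBE_FAILED.search(line)
            if m:
                counts[(m["pod"], m["type"], m["output"])] += 1
    return counts

if __name__ == "__main__":
    for (pod, ptype, output), n in summarize("kubelet.log").most_common():
        print(f"{n:5d}  {ptype:8s}  {pod}  ({output})")

Against an excerpt like this one, the only hot spot it would report is the Startup probe of openshift-ingress/router-default-5465c8b4db-dzlmb failing with "HTTP probe failed with statuscode: 500".
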
Dec 05 12:42:51.585113 master-0 kubenswrapper[8731]: I1205 12:42:51.585035 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:51.585113 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:51.585113 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:51.585113 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:51.585113 master-0 kubenswrapper[8731]: I1205 12:42:51.585119 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:52.585553 master-0 kubenswrapper[8731]: I1205 12:42:52.585449 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:52.585553 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:52.585553 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:52.585553 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:52.587105 master-0 kubenswrapper[8731]: I1205 12:42:52.587038 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:53.585139 master-0 kubenswrapper[8731]: I1205 12:42:53.585049 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:53.585139 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:53.585139 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:53.585139 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:53.585492 master-0 kubenswrapper[8731]: I1205 12:42:53.585154 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:54.584934 master-0 kubenswrapper[8731]: I1205 12:42:54.584851 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:54.584934 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:54.584934 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:54.584934 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:54.585705 master-0 kubenswrapper[8731]: I1205 12:42:54.584947 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Dec 05 12:42:55.329363 master-0 kubenswrapper[8731]: I1205 12:42:55.329280 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-master-0"] Dec 05 12:42:55.331081 master-0 kubenswrapper[8731]: I1205 12:42:55.331059 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Dec 05 12:42:55.334779 master-0 kubenswrapper[8731]: I1205 12:42:55.334693 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-n67k2" Dec 05 12:42:55.334779 master-0 kubenswrapper[8731]: I1205 12:42:55.334749 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Dec 05 12:42:55.343574 master-0 kubenswrapper[8731]: I1205 12:42:55.343522 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Dec 05 12:42:55.480263 master-0 kubenswrapper[8731]: I1205 12:42:55.480156 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c2415969-33ad-418b-9df0-4a6c7bb279db-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"c2415969-33ad-418b-9df0-4a6c7bb279db\") " pod="openshift-etcd/installer-2-master-0" Dec 05 12:42:55.480263 master-0 kubenswrapper[8731]: I1205 12:42:55.480263 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c2415969-33ad-418b-9df0-4a6c7bb279db-kube-api-access\") pod \"installer-2-master-0\" (UID: \"c2415969-33ad-418b-9df0-4a6c7bb279db\") " pod="openshift-etcd/installer-2-master-0" Dec 05 12:42:55.480720 master-0 kubenswrapper[8731]: I1205 12:42:55.480602 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c2415969-33ad-418b-9df0-4a6c7bb279db-var-lock\") pod \"installer-2-master-0\" (UID: \"c2415969-33ad-418b-9df0-4a6c7bb279db\") " pod="openshift-etcd/installer-2-master-0" Dec 05 12:42:55.582365 master-0 kubenswrapper[8731]: I1205 12:42:55.582159 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c2415969-33ad-418b-9df0-4a6c7bb279db-var-lock\") pod \"installer-2-master-0\" (UID: \"c2415969-33ad-418b-9df0-4a6c7bb279db\") " pod="openshift-etcd/installer-2-master-0" Dec 05 12:42:55.582766 master-0 kubenswrapper[8731]: I1205 12:42:55.582739 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c2415969-33ad-418b-9df0-4a6c7bb279db-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"c2415969-33ad-418b-9df0-4a6c7bb279db\") " pod="openshift-etcd/installer-2-master-0" Dec 05 12:42:55.582895 master-0 kubenswrapper[8731]: I1205 12:42:55.582874 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c2415969-33ad-418b-9df0-4a6c7bb279db-kube-api-access\") pod \"installer-2-master-0\" (UID: \"c2415969-33ad-418b-9df0-4a6c7bb279db\") " pod="openshift-etcd/installer-2-master-0" Dec 05 12:42:55.583108 master-0 kubenswrapper[8731]: I1205 12:42:55.582823 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c2415969-33ad-418b-9df0-4a6c7bb279db-kubelet-dir\") pod 
\"installer-2-master-0\" (UID: \"c2415969-33ad-418b-9df0-4a6c7bb279db\") " pod="openshift-etcd/installer-2-master-0" Dec 05 12:42:55.583250 master-0 kubenswrapper[8731]: I1205 12:42:55.582409 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c2415969-33ad-418b-9df0-4a6c7bb279db-var-lock\") pod \"installer-2-master-0\" (UID: \"c2415969-33ad-418b-9df0-4a6c7bb279db\") " pod="openshift-etcd/installer-2-master-0" Dec 05 12:42:55.584927 master-0 kubenswrapper[8731]: I1205 12:42:55.584824 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:55.584927 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:55.584927 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:55.584927 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:55.585836 master-0 kubenswrapper[8731]: I1205 12:42:55.584964 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:55.607161 master-0 kubenswrapper[8731]: I1205 12:42:55.607112 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c2415969-33ad-418b-9df0-4a6c7bb279db-kube-api-access\") pod \"installer-2-master-0\" (UID: \"c2415969-33ad-418b-9df0-4a6c7bb279db\") " pod="openshift-etcd/installer-2-master-0" Dec 05 12:42:55.654528 master-0 kubenswrapper[8731]: I1205 12:42:55.654424 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Dec 05 12:42:56.160898 master-0 kubenswrapper[8731]: I1205 12:42:56.158507 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Dec 05 12:42:56.589260 master-0 kubenswrapper[8731]: I1205 12:42:56.584936 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:56.589260 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:56.589260 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:56.589260 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:56.589260 master-0 kubenswrapper[8731]: I1205 12:42:56.585039 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:56.777839 master-0 kubenswrapper[8731]: I1205 12:42:56.777403 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"c2415969-33ad-418b-9df0-4a6c7bb279db","Type":"ContainerStarted","Data":"cff910884ebcba45ebf5c933f29645e420a54c688eec01b58f8fb05ec723cbe8"} Dec 05 12:42:57.585304 master-0 kubenswrapper[8731]: I1205 12:42:57.585219 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:57.585304 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:57.585304 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:57.585304 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:57.585802 master-0 kubenswrapper[8731]: I1205 12:42:57.585329 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:58.584460 master-0 kubenswrapper[8731]: I1205 12:42:58.584360 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:58.584460 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:58.584460 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:58.584460 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:58.585359 master-0 kubenswrapper[8731]: I1205 12:42:58.584509 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:42:58.798472 master-0 kubenswrapper[8731]: I1205 12:42:58.798392 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" 
event={"ID":"c2415969-33ad-418b-9df0-4a6c7bb279db","Type":"ContainerStarted","Data":"dcccd3d0ecb79fd6fed3cade29b8b0d3ae9e791a686c5e6c4f661b34fd1efb10"} Dec 05 12:42:58.830069 master-0 kubenswrapper[8731]: I1205 12:42:58.829962 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-0" podStartSLOduration=3.8299423299999997 podStartE2EDuration="3.82994233s" podCreationTimestamp="2025-12-05 12:42:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:42:58.827173579 +0000 UTC m=+677.131157746" watchObservedRunningTime="2025-12-05 12:42:58.82994233 +0000 UTC m=+677.133926507" Dec 05 12:42:59.584577 master-0 kubenswrapper[8731]: I1205 12:42:59.584499 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:42:59.584577 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:42:59.584577 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:42:59.584577 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:42:59.585345 master-0 kubenswrapper[8731]: I1205 12:42:59.584587 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:00.585213 master-0 kubenswrapper[8731]: I1205 12:43:00.585131 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:00.585213 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:00.585213 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:00.585213 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:00.585994 master-0 kubenswrapper[8731]: I1205 12:43:00.585219 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:01.585048 master-0 kubenswrapper[8731]: I1205 12:43:01.584982 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:01.585048 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:01.585048 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:01.585048 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:01.585814 master-0 kubenswrapper[8731]: I1205 12:43:01.585085 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:01.617360 master-0 kubenswrapper[8731]: I1205 12:43:01.617272 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-675db9579f-4dcg8"] Dec 05 12:43:01.617690 master-0 kubenswrapper[8731]: I1205 12:43:01.617650 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" podUID="7e562fda-e695-4218-a9cf-4179b8d456db" containerName="controller-manager" containerID="cri-o://226d693739ba7f1f0405d228dca51a2e2771f758fde843b579c82652f63d7ed6" gracePeriod=30 Dec 05 12:43:01.636153 master-0 kubenswrapper[8731]: I1205 12:43:01.636072 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq"] Dec 05 12:43:01.636501 master-0 kubenswrapper[8731]: I1205 12:43:01.636361 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" podUID="3c753373-e1f9-457c-a134-721fce3b1575" containerName="route-controller-manager" containerID="cri-o://762ea77408b7f5a306a93d15bedda329d28149d43e08750a9562ca5f23cd1973" gracePeriod=30 Dec 05 12:43:01.821393 master-0 kubenswrapper[8731]: I1205 12:43:01.821341 8731 generic.go:334] "Generic (PLEG): container finished" podID="3c753373-e1f9-457c-a134-721fce3b1575" containerID="762ea77408b7f5a306a93d15bedda329d28149d43e08750a9562ca5f23cd1973" exitCode=0 Dec 05 12:43:01.821588 master-0 kubenswrapper[8731]: I1205 12:43:01.821405 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" event={"ID":"3c753373-e1f9-457c-a134-721fce3b1575","Type":"ContainerDied","Data":"762ea77408b7f5a306a93d15bedda329d28149d43e08750a9562ca5f23cd1973"} Dec 05 12:43:01.823367 master-0 kubenswrapper[8731]: I1205 12:43:01.823334 8731 generic.go:334] "Generic (PLEG): container finished" podID="7e562fda-e695-4218-a9cf-4179b8d456db" containerID="226d693739ba7f1f0405d228dca51a2e2771f758fde843b579c82652f63d7ed6" exitCode=0 Dec 05 12:43:01.823367 master-0 kubenswrapper[8731]: I1205 12:43:01.823360 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" event={"ID":"7e562fda-e695-4218-a9cf-4179b8d456db","Type":"ContainerDied","Data":"226d693739ba7f1f0405d228dca51a2e2771f758fde843b579c82652f63d7ed6"} Dec 05 12:43:02.135645 master-0 kubenswrapper[8731]: I1205 12:43:02.135261 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:43:02.143114 master-0 kubenswrapper[8731]: I1205 12:43:02.143041 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:43:02.201308 master-0 kubenswrapper[8731]: I1205 12:43:02.199627 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c753373-e1f9-457c-a134-721fce3b1575-client-ca\") pod \"3c753373-e1f9-457c-a134-721fce3b1575\" (UID: \"3c753373-e1f9-457c-a134-721fce3b1575\") " Dec 05 12:43:02.201308 master-0 kubenswrapper[8731]: I1205 12:43:02.199708 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-config\") pod \"7e562fda-e695-4218-a9cf-4179b8d456db\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " Dec 05 12:43:02.201308 master-0 kubenswrapper[8731]: I1205 12:43:02.199735 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c753373-e1f9-457c-a134-721fce3b1575-serving-cert\") pod \"3c753373-e1f9-457c-a134-721fce3b1575\" (UID: \"3c753373-e1f9-457c-a134-721fce3b1575\") " Dec 05 12:43:02.201308 master-0 kubenswrapper[8731]: I1205 12:43:02.199764 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-proxy-ca-bundles\") pod \"7e562fda-e695-4218-a9cf-4179b8d456db\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " Dec 05 12:43:02.201308 master-0 kubenswrapper[8731]: I1205 12:43:02.199846 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-client-ca\") pod \"7e562fda-e695-4218-a9cf-4179b8d456db\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " Dec 05 12:43:02.201308 master-0 kubenswrapper[8731]: I1205 12:43:02.199868 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vvmn\" (UniqueName: \"kubernetes.io/projected/3c753373-e1f9-457c-a134-721fce3b1575-kube-api-access-9vvmn\") pod \"3c753373-e1f9-457c-a134-721fce3b1575\" (UID: \"3c753373-e1f9-457c-a134-721fce3b1575\") " Dec 05 12:43:02.201308 master-0 kubenswrapper[8731]: I1205 12:43:02.199891 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e562fda-e695-4218-a9cf-4179b8d456db-serving-cert\") pod \"7e562fda-e695-4218-a9cf-4179b8d456db\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " Dec 05 12:43:02.201308 master-0 kubenswrapper[8731]: I1205 12:43:02.199943 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bll66\" (UniqueName: \"kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66\") pod \"7e562fda-e695-4218-a9cf-4179b8d456db\" (UID: \"7e562fda-e695-4218-a9cf-4179b8d456db\") " Dec 05 12:43:02.201308 master-0 kubenswrapper[8731]: I1205 12:43:02.199999 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c753373-e1f9-457c-a134-721fce3b1575-config\") pod \"3c753373-e1f9-457c-a134-721fce3b1575\" (UID: \"3c753373-e1f9-457c-a134-721fce3b1575\") " Dec 05 12:43:02.201308 master-0 kubenswrapper[8731]: I1205 12:43:02.200856 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3c753373-e1f9-457c-a134-721fce3b1575-config" (OuterVolumeSpecName: "config") pod "3c753373-e1f9-457c-a134-721fce3b1575" (UID: "3c753373-e1f9-457c-a134-721fce3b1575"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:43:02.201808 master-0 kubenswrapper[8731]: I1205 12:43:02.201372 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c753373-e1f9-457c-a134-721fce3b1575-client-ca" (OuterVolumeSpecName: "client-ca") pod "3c753373-e1f9-457c-a134-721fce3b1575" (UID: "3c753373-e1f9-457c-a134-721fce3b1575"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:43:02.202054 master-0 kubenswrapper[8731]: I1205 12:43:02.201858 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-client-ca" (OuterVolumeSpecName: "client-ca") pod "7e562fda-e695-4218-a9cf-4179b8d456db" (UID: "7e562fda-e695-4218-a9cf-4179b8d456db"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:43:02.202054 master-0 kubenswrapper[8731]: I1205 12:43:02.201893 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-config" (OuterVolumeSpecName: "config") pod "7e562fda-e695-4218-a9cf-4179b8d456db" (UID: "7e562fda-e695-4218-a9cf-4179b8d456db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:43:02.202283 master-0 kubenswrapper[8731]: I1205 12:43:02.202202 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7e562fda-e695-4218-a9cf-4179b8d456db" (UID: "7e562fda-e695-4218-a9cf-4179b8d456db"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:43:02.208437 master-0 kubenswrapper[8731]: I1205 12:43:02.208393 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c753373-e1f9-457c-a134-721fce3b1575-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3c753373-e1f9-457c-a134-721fce3b1575" (UID: "3c753373-e1f9-457c-a134-721fce3b1575"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:43:02.210046 master-0 kubenswrapper[8731]: I1205 12:43:02.209994 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e562fda-e695-4218-a9cf-4179b8d456db-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7e562fda-e695-4218-a9cf-4179b8d456db" (UID: "7e562fda-e695-4218-a9cf-4179b8d456db"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:43:02.217832 master-0 kubenswrapper[8731]: I1205 12:43:02.217756 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c753373-e1f9-457c-a134-721fce3b1575-kube-api-access-9vvmn" (OuterVolumeSpecName: "kube-api-access-9vvmn") pod "3c753373-e1f9-457c-a134-721fce3b1575" (UID: "3c753373-e1f9-457c-a134-721fce3b1575"). InnerVolumeSpecName "kube-api-access-9vvmn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:43:02.218063 master-0 kubenswrapper[8731]: I1205 12:43:02.217920 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66" (OuterVolumeSpecName: "kube-api-access-bll66") pod "7e562fda-e695-4218-a9cf-4179b8d456db" (UID: "7e562fda-e695-4218-a9cf-4179b8d456db"). InnerVolumeSpecName "kube-api-access-bll66". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:43:02.302034 master-0 kubenswrapper[8731]: I1205 12:43:02.301947 8731 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c753373-e1f9-457c-a134-721fce3b1575-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:02.302034 master-0 kubenswrapper[8731]: I1205 12:43:02.302004 8731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:02.302034 master-0 kubenswrapper[8731]: I1205 12:43:02.302017 8731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c753373-e1f9-457c-a134-721fce3b1575-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:02.302034 master-0 kubenswrapper[8731]: I1205 12:43:02.302060 8731 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:02.302547 master-0 kubenswrapper[8731]: I1205 12:43:02.302076 8731 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e562fda-e695-4218-a9cf-4179b8d456db-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:02.302547 master-0 kubenswrapper[8731]: I1205 12:43:02.302090 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vvmn\" (UniqueName: \"kubernetes.io/projected/3c753373-e1f9-457c-a134-721fce3b1575-kube-api-access-9vvmn\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:02.302547 master-0 kubenswrapper[8731]: I1205 12:43:02.302124 8731 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e562fda-e695-4218-a9cf-4179b8d456db-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:02.302547 master-0 kubenswrapper[8731]: I1205 12:43:02.302138 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bll66\" (UniqueName: \"kubernetes.io/projected/7e562fda-e695-4218-a9cf-4179b8d456db-kube-api-access-bll66\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:02.302547 master-0 kubenswrapper[8731]: I1205 12:43:02.302150 8731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c753373-e1f9-457c-a134-721fce3b1575-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:02.584634 master-0 kubenswrapper[8731]: I1205 12:43:02.584562 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:02.584634 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:02.584634 master-0 kubenswrapper[8731]: [+]process-running ok 
Dec 05 12:43:02.584634 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:02.585324 master-0 kubenswrapper[8731]: I1205 12:43:02.584660 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:02.839453 master-0 kubenswrapper[8731]: I1205 12:43:02.839395 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" event={"ID":"3c753373-e1f9-457c-a134-721fce3b1575","Type":"ContainerDied","Data":"78f3bd1c55cef923965fac9726d2f9b634cbb09d4860b2d5a0f0d35bb16ca8fb"} Dec 05 12:43:02.839453 master-0 kubenswrapper[8731]: I1205 12:43:02.839465 8731 scope.go:117] "RemoveContainer" containerID="762ea77408b7f5a306a93d15bedda329d28149d43e08750a9562ca5f23cd1973" Dec 05 12:43:02.839722 master-0 kubenswrapper[8731]: I1205 12:43:02.839504 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq" Dec 05 12:43:02.842855 master-0 kubenswrapper[8731]: I1205 12:43:02.842796 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" event={"ID":"7e562fda-e695-4218-a9cf-4179b8d456db","Type":"ContainerDied","Data":"46c71a14a0f9590da88fc8567ffce1570ccabc57f819c41e45925415e66120f4"} Dec 05 12:43:02.842977 master-0 kubenswrapper[8731]: I1205 12:43:02.842893 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-675db9579f-4dcg8" Dec 05 12:43:02.862130 master-0 kubenswrapper[8731]: I1205 12:43:02.862082 8731 scope.go:117] "RemoveContainer" containerID="226d693739ba7f1f0405d228dca51a2e2771f758fde843b579c82652f63d7ed6" Dec 05 12:43:02.887252 master-0 kubenswrapper[8731]: I1205 12:43:02.887167 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-675db9579f-4dcg8"] Dec 05 12:43:02.891089 master-0 kubenswrapper[8731]: I1205 12:43:02.891048 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-675db9579f-4dcg8"] Dec 05 12:43:02.903949 master-0 kubenswrapper[8731]: I1205 12:43:02.903889 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq"] Dec 05 12:43:02.915634 master-0 kubenswrapper[8731]: I1205 12:43:02.915534 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b48f6bd98-4npsq"] Dec 05 12:43:03.219154 master-0 kubenswrapper[8731]: I1205 12:43:03.218857 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw"] Dec 05 12:43:03.219673 master-0 kubenswrapper[8731]: E1205 12:43:03.219628 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c753373-e1f9-457c-a134-721fce3b1575" containerName="route-controller-manager" Dec 05 12:43:03.219673 master-0 kubenswrapper[8731]: I1205 12:43:03.219662 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c753373-e1f9-457c-a134-721fce3b1575" containerName="route-controller-manager" Dec 05 12:43:03.219860 master-0 kubenswrapper[8731]: E1205 12:43:03.219703 8731 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7e562fda-e695-4218-a9cf-4179b8d456db" containerName="controller-manager" Dec 05 12:43:03.219860 master-0 kubenswrapper[8731]: I1205 12:43:03.219716 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e562fda-e695-4218-a9cf-4179b8d456db" containerName="controller-manager" Dec 05 12:43:03.220080 master-0 kubenswrapper[8731]: I1205 12:43:03.219926 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c753373-e1f9-457c-a134-721fce3b1575" containerName="route-controller-manager" Dec 05 12:43:03.220080 master-0 kubenswrapper[8731]: I1205 12:43:03.219963 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e562fda-e695-4218-a9cf-4179b8d456db" containerName="controller-manager" Dec 05 12:43:03.220778 master-0 kubenswrapper[8731]: I1205 12:43:03.220738 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:03.222937 master-0 kubenswrapper[8731]: I1205 12:43:03.222876 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 12:43:03.223925 master-0 kubenswrapper[8731]: I1205 12:43:03.223873 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 12:43:03.224474 master-0 kubenswrapper[8731]: I1205 12:43:03.224426 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx"] Dec 05 12:43:03.225773 master-0 kubenswrapper[8731]: I1205 12:43:03.225712 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:43:03.227908 master-0 kubenswrapper[8731]: I1205 12:43:03.227862 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 12:43:03.227908 master-0 kubenswrapper[8731]: I1205 12:43:03.227891 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 12:43:03.228050 master-0 kubenswrapper[8731]: I1205 12:43:03.227930 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 12:43:03.232656 master-0 kubenswrapper[8731]: I1205 12:43:03.232579 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 12:43:03.232656 master-0 kubenswrapper[8731]: I1205 12:43:03.232637 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw"] Dec 05 12:43:03.232874 master-0 kubenswrapper[8731]: I1205 12:43:03.232756 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-z9sgn" Dec 05 12:43:03.232874 master-0 kubenswrapper[8731]: I1205 12:43:03.232764 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 12:43:03.233217 master-0 kubenswrapper[8731]: I1205 12:43:03.233157 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 12:43:03.233310 master-0 kubenswrapper[8731]: I1205 12:43:03.233169 8731 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-fqrhd" Dec 05 12:43:03.233310 master-0 kubenswrapper[8731]: I1205 12:43:03.233249 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 12:43:03.233458 master-0 kubenswrapper[8731]: I1205 12:43:03.233305 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 12:43:03.236929 master-0 kubenswrapper[8731]: I1205 12:43:03.236884 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 12:43:03.238076 master-0 kubenswrapper[8731]: I1205 12:43:03.238037 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx"] Dec 05 12:43:03.316630 master-0 kubenswrapper[8731]: I1205 12:43:03.316545 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-proxy-ca-bundles\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:03.316630 master-0 kubenswrapper[8731]: I1205 12:43:03.316619 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-client-ca\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:43:03.317049 master-0 kubenswrapper[8731]: I1205 12:43:03.316674 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c39d2089-d3bf-4556-b6ef-c362a08c21a2-serving-cert\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:03.317049 master-0 kubenswrapper[8731]: I1205 12:43:03.316808 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e943438b-1de8-435c-8a19-accd6a6292a4-serving-cert\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:43:03.317224 master-0 kubenswrapper[8731]: I1205 12:43:03.317195 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-config\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:43:03.317275 master-0 kubenswrapper[8731]: I1205 12:43:03.317232 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfknz\" (UniqueName: \"kubernetes.io/projected/e943438b-1de8-435c-8a19-accd6a6292a4-kube-api-access-lfknz\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: 
\"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:43:03.317312 master-0 kubenswrapper[8731]: I1205 12:43:03.317299 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-client-ca\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:03.317422 master-0 kubenswrapper[8731]: I1205 12:43:03.317383 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr9jd\" (UniqueName: \"kubernetes.io/projected/c39d2089-d3bf-4556-b6ef-c362a08c21a2-kube-api-access-mr9jd\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:03.317462 master-0 kubenswrapper[8731]: I1205 12:43:03.317426 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-config\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:03.419193 master-0 kubenswrapper[8731]: I1205 12:43:03.419088 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-config\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:43:03.419193 master-0 kubenswrapper[8731]: I1205 12:43:03.419162 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfknz\" (UniqueName: \"kubernetes.io/projected/e943438b-1de8-435c-8a19-accd6a6292a4-kube-api-access-lfknz\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:43:03.419555 master-0 kubenswrapper[8731]: I1205 12:43:03.419485 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-client-ca\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:03.419712 master-0 kubenswrapper[8731]: I1205 12:43:03.419673 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr9jd\" (UniqueName: \"kubernetes.io/projected/c39d2089-d3bf-4556-b6ef-c362a08c21a2-kube-api-access-mr9jd\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:03.420005 master-0 kubenswrapper[8731]: I1205 12:43:03.419945 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-config\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: 
\"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:03.420064 master-0 kubenswrapper[8731]: I1205 12:43:03.420038 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-proxy-ca-bundles\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:03.420112 master-0 kubenswrapper[8731]: I1205 12:43:03.420090 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-client-ca\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:43:03.420250 master-0 kubenswrapper[8731]: I1205 12:43:03.420221 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c39d2089-d3bf-4556-b6ef-c362a08c21a2-serving-cert\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:03.420332 master-0 kubenswrapper[8731]: I1205 12:43:03.420304 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e943438b-1de8-435c-8a19-accd6a6292a4-serving-cert\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:43:03.420799 master-0 kubenswrapper[8731]: I1205 12:43:03.420757 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-config\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:43:03.421322 master-0 kubenswrapper[8731]: I1205 12:43:03.421269 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-client-ca\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:43:03.421322 master-0 kubenswrapper[8731]: I1205 12:43:03.421291 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-client-ca\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:03.422073 master-0 kubenswrapper[8731]: I1205 12:43:03.422008 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-config\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 
12:43:03.422365 master-0 kubenswrapper[8731]: I1205 12:43:03.422159 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-proxy-ca-bundles\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:03.424381 master-0 kubenswrapper[8731]: I1205 12:43:03.424334 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c39d2089-d3bf-4556-b6ef-c362a08c21a2-serving-cert\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:03.425676 master-0 kubenswrapper[8731]: I1205 12:43:03.425631 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e943438b-1de8-435c-8a19-accd6a6292a4-serving-cert\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:43:03.440132 master-0 kubenswrapper[8731]: I1205 12:43:03.439441 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfknz\" (UniqueName: \"kubernetes.io/projected/e943438b-1de8-435c-8a19-accd6a6292a4-kube-api-access-lfknz\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:43:03.440132 master-0 kubenswrapper[8731]: I1205 12:43:03.440046 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr9jd\" (UniqueName: \"kubernetes.io/projected/c39d2089-d3bf-4556-b6ef-c362a08c21a2-kube-api-access-mr9jd\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:03.555085 master-0 kubenswrapper[8731]: I1205 12:43:03.554966 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:03.580925 master-0 kubenswrapper[8731]: I1205 12:43:03.580861 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:43:03.587594 master-0 kubenswrapper[8731]: I1205 12:43:03.587510 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:03.587594 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:03.587594 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:03.587594 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:03.588362 master-0 kubenswrapper[8731]: I1205 12:43:03.587619 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:03.873116 master-0 kubenswrapper[8731]: I1205 12:43:03.871567 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx"] Dec 05 12:43:03.878902 master-0 kubenswrapper[8731]: W1205 12:43:03.878839 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode943438b_1de8_435c_8a19_accd6a6292a4.slice/crio-77da36c6bf5d09d68dbf2de017a655a5a15b25fda32cba3288a3d8b2cc4b44c0 WatchSource:0}: Error finding container 77da36c6bf5d09d68dbf2de017a655a5a15b25fda32cba3288a3d8b2cc4b44c0: Status 404 returned error can't find the container with id 77da36c6bf5d09d68dbf2de017a655a5a15b25fda32cba3288a3d8b2cc4b44c0 Dec 05 12:43:03.954124 master-0 kubenswrapper[8731]: I1205 12:43:03.954080 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c753373-e1f9-457c-a134-721fce3b1575" path="/var/lib/kubelet/pods/3c753373-e1f9-457c-a134-721fce3b1575/volumes" Dec 05 12:43:03.954653 master-0 kubenswrapper[8731]: I1205 12:43:03.954632 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e562fda-e695-4218-a9cf-4179b8d456db" path="/var/lib/kubelet/pods/7e562fda-e695-4218-a9cf-4179b8d456db/volumes" Dec 05 12:43:04.063212 master-0 kubenswrapper[8731]: I1205 12:43:04.062053 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw"] Dec 05 12:43:04.067590 master-0 kubenswrapper[8731]: W1205 12:43:04.067534 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc39d2089_d3bf_4556_b6ef_c362a08c21a2.slice/crio-ae3644549c6caccb0e5b76cf093dd16f97c66829b7bc2c724be0d4328e24c56e WatchSource:0}: Error finding container ae3644549c6caccb0e5b76cf093dd16f97c66829b7bc2c724be0d4328e24c56e: Status 404 returned error can't find the container with id ae3644549c6caccb0e5b76cf093dd16f97c66829b7bc2c724be0d4328e24c56e Dec 05 12:43:04.584812 master-0 kubenswrapper[8731]: I1205 12:43:04.584750 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:04.584812 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:04.584812 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:04.584812 
master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:04.585285 master-0 kubenswrapper[8731]: I1205 12:43:04.585253 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:04.885360 master-0 kubenswrapper[8731]: I1205 12:43:04.885256 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" event={"ID":"c39d2089-d3bf-4556-b6ef-c362a08c21a2","Type":"ContainerStarted","Data":"fa4f02496398ccdc5c55acbb60e75e3c69d9850820e087e65cbe9d00bf63d07e"} Dec 05 12:43:04.885869 master-0 kubenswrapper[8731]: I1205 12:43:04.885852 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:04.885933 master-0 kubenswrapper[8731]: I1205 12:43:04.885921 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" event={"ID":"c39d2089-d3bf-4556-b6ef-c362a08c21a2","Type":"ContainerStarted","Data":"ae3644549c6caccb0e5b76cf093dd16f97c66829b7bc2c724be0d4328e24c56e"} Dec 05 12:43:04.887638 master-0 kubenswrapper[8731]: I1205 12:43:04.887617 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" event={"ID":"e943438b-1de8-435c-8a19-accd6a6292a4","Type":"ContainerStarted","Data":"d9a47a4e65ab5cf4baf6b36c8ce1ba7fd5756eae201f48950bc988deec039fe0"} Dec 05 12:43:04.887771 master-0 kubenswrapper[8731]: I1205 12:43:04.887740 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" event={"ID":"e943438b-1de8-435c-8a19-accd6a6292a4","Type":"ContainerStarted","Data":"77da36c6bf5d09d68dbf2de017a655a5a15b25fda32cba3288a3d8b2cc4b44c0"} Dec 05 12:43:04.888157 master-0 kubenswrapper[8731]: I1205 12:43:04.888088 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:43:04.893215 master-0 kubenswrapper[8731]: I1205 12:43:04.893145 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:43:04.893638 master-0 kubenswrapper[8731]: I1205 12:43:04.893585 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:43:04.908019 master-0 kubenswrapper[8731]: I1205 12:43:04.907917 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" podStartSLOduration=3.907889155 podStartE2EDuration="3.907889155s" podCreationTimestamp="2025-12-05 12:43:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:43:04.904568377 +0000 UTC m=+683.208552554" watchObservedRunningTime="2025-12-05 12:43:04.907889155 +0000 UTC m=+683.211873322" Dec 05 12:43:04.928150 master-0 kubenswrapper[8731]: I1205 12:43:04.927490 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" 
podStartSLOduration=3.927464776 podStartE2EDuration="3.927464776s" podCreationTimestamp="2025-12-05 12:43:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:43:04.925698374 +0000 UTC m=+683.229682581" watchObservedRunningTime="2025-12-05 12:43:04.927464776 +0000 UTC m=+683.231448943" Dec 05 12:43:05.584219 master-0 kubenswrapper[8731]: I1205 12:43:05.584099 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:05.584219 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:05.584219 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:05.584219 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:05.584660 master-0 kubenswrapper[8731]: I1205 12:43:05.584267 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:06.584509 master-0 kubenswrapper[8731]: I1205 12:43:06.584403 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:06.584509 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:06.584509 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:06.584509 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:06.585590 master-0 kubenswrapper[8731]: I1205 12:43:06.584522 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:07.585093 master-0 kubenswrapper[8731]: I1205 12:43:07.585013 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:07.585093 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:07.585093 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:07.585093 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:07.585645 master-0 kubenswrapper[8731]: I1205 12:43:07.585121 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:08.584888 master-0 kubenswrapper[8731]: I1205 12:43:08.584782 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:08.584888 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:08.584888 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 
12:43:08.584888 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:08.585736 master-0 kubenswrapper[8731]: I1205 12:43:08.584934 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:09.585752 master-0 kubenswrapper[8731]: I1205 12:43:09.585643 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:09.585752 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:09.585752 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:09.585752 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:09.586511 master-0 kubenswrapper[8731]: I1205 12:43:09.585777 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:10.586159 master-0 kubenswrapper[8731]: I1205 12:43:10.586072 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:10.586159 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:10.586159 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:10.586159 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:10.586159 master-0 kubenswrapper[8731]: I1205 12:43:10.586150 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:11.586296 master-0 kubenswrapper[8731]: I1205 12:43:11.586190 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:11.586296 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:11.586296 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:11.586296 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:11.587056 master-0 kubenswrapper[8731]: I1205 12:43:11.586306 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:12.585468 master-0 kubenswrapper[8731]: I1205 12:43:12.585381 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:12.585468 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:12.585468 master-0 kubenswrapper[8731]: [+]process-running ok 
Dec 05 12:43:12.585468 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:12.585938 master-0 kubenswrapper[8731]: I1205 12:43:12.585495 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:13.584824 master-0 kubenswrapper[8731]: I1205 12:43:13.584749 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:13.584824 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:13.584824 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:13.584824 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:13.585659 master-0 kubenswrapper[8731]: I1205 12:43:13.584842 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:14.585271 master-0 kubenswrapper[8731]: I1205 12:43:14.585157 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:14.585271 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:14.585271 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:14.585271 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:14.585993 master-0 kubenswrapper[8731]: I1205 12:43:14.585279 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:15.585119 master-0 kubenswrapper[8731]: I1205 12:43:15.585060 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:15.585119 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:15.585119 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:15.585119 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:15.585740 master-0 kubenswrapper[8731]: I1205 12:43:15.585135 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:16.584839 master-0 kubenswrapper[8731]: I1205 12:43:16.584732 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:16.584839 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:16.584839 master-0 kubenswrapper[8731]: 
[+]process-running ok Dec 05 12:43:16.584839 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:16.585237 master-0 kubenswrapper[8731]: I1205 12:43:16.584881 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:17.585461 master-0 kubenswrapper[8731]: I1205 12:43:17.585329 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:17.585461 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:17.585461 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:17.585461 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:17.586384 master-0 kubenswrapper[8731]: I1205 12:43:17.585467 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:18.590834 master-0 kubenswrapper[8731]: I1205 12:43:18.589585 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:18.590834 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:18.590834 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:18.590834 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:18.590834 master-0 kubenswrapper[8731]: I1205 12:43:18.589682 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:19.585632 master-0 kubenswrapper[8731]: I1205 12:43:19.585547 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:19.585632 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:19.585632 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:19.585632 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:19.586087 master-0 kubenswrapper[8731]: I1205 12:43:19.585665 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:20.585499 master-0 kubenswrapper[8731]: I1205 12:43:20.585405 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:20.585499 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:20.585499 master-0 
kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:20.585499 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:20.586405 master-0 kubenswrapper[8731]: I1205 12:43:20.585548 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:21.584027 master-0 kubenswrapper[8731]: I1205 12:43:21.583971 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:21.584027 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:21.584027 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:21.584027 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:21.584389 master-0 kubenswrapper[8731]: I1205 12:43:21.584036 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:22.585141 master-0 kubenswrapper[8731]: I1205 12:43:22.585062 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:22.585141 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:22.585141 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:22.585141 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:22.586160 master-0 kubenswrapper[8731]: I1205 12:43:22.585170 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:23.586859 master-0 kubenswrapper[8731]: I1205 12:43:23.586773 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:23.586859 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:23.586859 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:23.586859 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:23.587944 master-0 kubenswrapper[8731]: I1205 12:43:23.586887 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:23.993319 master-0 kubenswrapper[8731]: I1205 12:43:23.993232 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Dec 05 12:43:23.994568 master-0 kubenswrapper[8731]: I1205 12:43:23.994524 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 05 12:43:23.997318 master-0 kubenswrapper[8731]: I1205 12:43:23.997270 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 05 12:43:23.997485 master-0 kubenswrapper[8731]: I1205 12:43:23.997427 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kkk9n" Dec 05 12:43:24.002038 master-0 kubenswrapper[8731]: I1205 12:43:24.001993 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Dec 05 12:43:24.115501 master-0 kubenswrapper[8731]: I1205 12:43:24.115417 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af196a48-6fcc-47d1-95ac-7c0acd63dd21-kube-api-access\") pod \"installer-2-master-0\" (UID: \"af196a48-6fcc-47d1-95ac-7c0acd63dd21\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 05 12:43:24.115501 master-0 kubenswrapper[8731]: I1205 12:43:24.115476 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af196a48-6fcc-47d1-95ac-7c0acd63dd21-var-lock\") pod \"installer-2-master-0\" (UID: \"af196a48-6fcc-47d1-95ac-7c0acd63dd21\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 05 12:43:24.115793 master-0 kubenswrapper[8731]: I1205 12:43:24.115697 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af196a48-6fcc-47d1-95ac-7c0acd63dd21-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"af196a48-6fcc-47d1-95ac-7c0acd63dd21\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 05 12:43:24.218078 master-0 kubenswrapper[8731]: I1205 12:43:24.217992 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af196a48-6fcc-47d1-95ac-7c0acd63dd21-kube-api-access\") pod \"installer-2-master-0\" (UID: \"af196a48-6fcc-47d1-95ac-7c0acd63dd21\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 05 12:43:24.218655 master-0 kubenswrapper[8731]: I1205 12:43:24.218619 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af196a48-6fcc-47d1-95ac-7c0acd63dd21-var-lock\") pod \"installer-2-master-0\" (UID: \"af196a48-6fcc-47d1-95ac-7c0acd63dd21\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 05 12:43:24.218912 master-0 kubenswrapper[8731]: I1205 12:43:24.218703 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af196a48-6fcc-47d1-95ac-7c0acd63dd21-var-lock\") pod \"installer-2-master-0\" (UID: \"af196a48-6fcc-47d1-95ac-7c0acd63dd21\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 05 12:43:24.219166 master-0 kubenswrapper[8731]: I1205 12:43:24.219094 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af196a48-6fcc-47d1-95ac-7c0acd63dd21-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"af196a48-6fcc-47d1-95ac-7c0acd63dd21\") " pod="openshift-kube-controller-manager/installer-2-master-0" 
Dec 05 12:43:24.219474 master-0 kubenswrapper[8731]: I1205 12:43:24.219250 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af196a48-6fcc-47d1-95ac-7c0acd63dd21-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"af196a48-6fcc-47d1-95ac-7c0acd63dd21\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 05 12:43:24.235818 master-0 kubenswrapper[8731]: I1205 12:43:24.235755 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af196a48-6fcc-47d1-95ac-7c0acd63dd21-kube-api-access\") pod \"installer-2-master-0\" (UID: \"af196a48-6fcc-47d1-95ac-7c0acd63dd21\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 05 12:43:24.316932 master-0 kubenswrapper[8731]: I1205 12:43:24.316784 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 05 12:43:24.584279 master-0 kubenswrapper[8731]: I1205 12:43:24.584080 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:24.584279 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:24.584279 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:24.584279 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:24.584279 master-0 kubenswrapper[8731]: I1205 12:43:24.584225 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:24.754708 master-0 kubenswrapper[8731]: I1205 12:43:24.754629 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Dec 05 12:43:25.126608 master-0 kubenswrapper[8731]: I1205 12:43:25.126501 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"af196a48-6fcc-47d1-95ac-7c0acd63dd21","Type":"ContainerStarted","Data":"64823040ff2341cdbb93a061017009c800e15a20e0d4a4f9a76225782f444caf"} Dec 05 12:43:25.585889 master-0 kubenswrapper[8731]: I1205 12:43:25.585816 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:25.585889 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:25.585889 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:25.585889 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:25.586522 master-0 kubenswrapper[8731]: I1205 12:43:25.586473 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:26.140006 master-0 kubenswrapper[8731]: I1205 12:43:26.139916 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" 
event={"ID":"af196a48-6fcc-47d1-95ac-7c0acd63dd21","Type":"ContainerStarted","Data":"7ece5c46267ba23e02ffd4ff25da31145731b6190d3fd68ce186c1aedbb31e5d"} Dec 05 12:43:26.161435 master-0 kubenswrapper[8731]: I1205 12:43:26.161317 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=3.161286825 podStartE2EDuration="3.161286825s" podCreationTimestamp="2025-12-05 12:43:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:43:26.155945259 +0000 UTC m=+704.459929466" watchObservedRunningTime="2025-12-05 12:43:26.161286825 +0000 UTC m=+704.465271002" Dec 05 12:43:26.585312 master-0 kubenswrapper[8731]: I1205 12:43:26.585199 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:26.585312 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:26.585312 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:26.585312 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:26.585312 master-0 kubenswrapper[8731]: I1205 12:43:26.585287 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:27.584903 master-0 kubenswrapper[8731]: I1205 12:43:27.584817 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:27.584903 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:27.584903 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:27.584903 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:27.585502 master-0 kubenswrapper[8731]: I1205 12:43:27.584985 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:28.586427 master-0 kubenswrapper[8731]: I1205 12:43:28.586302 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:28.586427 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:28.586427 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:28.586427 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:28.587288 master-0 kubenswrapper[8731]: I1205 12:43:28.586474 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:29.011456 master-0 kubenswrapper[8731]: I1205 12:43:29.011366 8731 kubelet.go:2431] "SyncLoop REMOVE" source="file" 
pods=["openshift-etcd/etcd-master-0"] Dec 05 12:43:29.011830 master-0 kubenswrapper[8731]: I1205 12:43:29.011779 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcdctl" containerID="cri-o://dd2d6c7cdd5eab77e600768a9929fb4a53e0d7ace9dc3035b564a4d26b57a2ca" gracePeriod=30 Dec 05 12:43:29.011879 master-0 kubenswrapper[8731]: I1205 12:43:29.011789 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-readyz" containerID="cri-o://073eeb295461d6cfe17793b727c36b1a0795b59c33714c128e08740e09c87106" gracePeriod=30 Dec 05 12:43:29.011879 master-0 kubenswrapper[8731]: I1205 12:43:29.011848 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-metrics" containerID="cri-o://00f11e6defd30a3258a136b83ab00d656bb56a377cbe07aa4f6425fb339a65fe" gracePeriod=30 Dec 05 12:43:29.011952 master-0 kubenswrapper[8731]: I1205 12:43:29.011919 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-rev" containerID="cri-o://765a08d8e028edbc45c6c2083dfc23ad6392f98fa33616533c24f46e8e8af646" gracePeriod=30 Dec 05 12:43:29.012047 master-0 kubenswrapper[8731]: I1205 12:43:29.011956 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd" containerID="cri-o://b6aa6b1922706ed7b2ddfb61ba9e6938912e45e804a0e3f6e5253251e33b6f4e" gracePeriod=30 Dec 05 12:43:29.023663 master-0 kubenswrapper[8731]: I1205 12:43:29.023587 8731 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Dec 05 12:43:29.023999 master-0 kubenswrapper[8731]: E1205 12:43:29.023967 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-readyz" Dec 05 12:43:29.024056 master-0 kubenswrapper[8731]: I1205 12:43:29.024002 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-readyz" Dec 05 12:43:29.024056 master-0 kubenswrapper[8731]: E1205 12:43:29.024020 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-resources-copy" Dec 05 12:43:29.024056 master-0 kubenswrapper[8731]: I1205 12:43:29.024032 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-resources-copy" Dec 05 12:43:29.024056 master-0 kubenswrapper[8731]: E1205 12:43:29.024047 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-rev" Dec 05 12:43:29.024056 master-0 kubenswrapper[8731]: I1205 12:43:29.024059 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-rev" Dec 05 12:43:29.024218 master-0 kubenswrapper[8731]: E1205 12:43:29.024089 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-ensure-env-vars" Dec 05 12:43:29.024218 master-0 kubenswrapper[8731]: I1205 12:43:29.024102 8731 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-ensure-env-vars" Dec 05 12:43:29.024218 master-0 kubenswrapper[8731]: E1205 12:43:29.024120 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcdctl" Dec 05 12:43:29.024218 master-0 kubenswrapper[8731]: I1205 12:43:29.024131 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcdctl" Dec 05 12:43:29.024327 master-0 kubenswrapper[8731]: E1205 12:43:29.024173 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e01603234fe8003f8aae8171b0065" containerName="setup" Dec 05 12:43:29.024327 master-0 kubenswrapper[8731]: I1205 12:43:29.024259 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e01603234fe8003f8aae8171b0065" containerName="setup" Dec 05 12:43:29.024327 master-0 kubenswrapper[8731]: E1205 12:43:29.024273 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd" Dec 05 12:43:29.024327 master-0 kubenswrapper[8731]: I1205 12:43:29.024284 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd" Dec 05 12:43:29.024327 master-0 kubenswrapper[8731]: E1205 12:43:29.024305 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-metrics" Dec 05 12:43:29.024327 master-0 kubenswrapper[8731]: I1205 12:43:29.024316 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-metrics" Dec 05 12:43:29.024561 master-0 kubenswrapper[8731]: I1205 12:43:29.024506 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-rev" Dec 05 12:43:29.024561 master-0 kubenswrapper[8731]: I1205 12:43:29.024540 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd" Dec 05 12:43:29.024561 master-0 kubenswrapper[8731]: I1205 12:43:29.024554 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-readyz" Dec 05 12:43:29.024682 master-0 kubenswrapper[8731]: I1205 12:43:29.024586 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcdctl" Dec 05 12:43:29.024682 master-0 kubenswrapper[8731]: I1205 12:43:29.024603 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24e01603234fe8003f8aae8171b0065" containerName="etcd-metrics" Dec 05 12:43:29.120375 master-0 kubenswrapper[8731]: I1205 12:43:29.120288 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-usr-local-bin\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.120375 master-0 kubenswrapper[8731]: I1205 12:43:29.120357 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-data-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.120375 master-0 kubenswrapper[8731]: I1205 12:43:29.120383 8731 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-log-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.120895 master-0 kubenswrapper[8731]: I1205 12:43:29.120441 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-cert-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.120895 master-0 kubenswrapper[8731]: I1205 12:43:29.120497 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-resource-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.120895 master-0 kubenswrapper[8731]: I1205 12:43:29.120539 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-static-pod-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.159635 master-0 kubenswrapper[8731]: I1205 12:43:29.159554 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-rev/0.log" Dec 05 12:43:29.160801 master-0 kubenswrapper[8731]: I1205 12:43:29.160695 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-metrics/0.log" Dec 05 12:43:29.163295 master-0 kubenswrapper[8731]: I1205 12:43:29.163158 8731 generic.go:334] "Generic (PLEG): container finished" podID="c24e01603234fe8003f8aae8171b0065" containerID="765a08d8e028edbc45c6c2083dfc23ad6392f98fa33616533c24f46e8e8af646" exitCode=2 Dec 05 12:43:29.163295 master-0 kubenswrapper[8731]: I1205 12:43:29.163239 8731 generic.go:334] "Generic (PLEG): container finished" podID="c24e01603234fe8003f8aae8171b0065" containerID="073eeb295461d6cfe17793b727c36b1a0795b59c33714c128e08740e09c87106" exitCode=0 Dec 05 12:43:29.163295 master-0 kubenswrapper[8731]: I1205 12:43:29.163263 8731 generic.go:334] "Generic (PLEG): container finished" podID="c24e01603234fe8003f8aae8171b0065" containerID="00f11e6defd30a3258a136b83ab00d656bb56a377cbe07aa4f6425fb339a65fe" exitCode=2 Dec 05 12:43:29.222976 master-0 kubenswrapper[8731]: I1205 12:43:29.222812 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-usr-local-bin\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.222976 master-0 kubenswrapper[8731]: I1205 12:43:29.222953 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-data-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.223496 master-0 kubenswrapper[8731]: I1205 12:43:29.223007 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-usr-local-bin\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.223496 master-0 kubenswrapper[8731]: I1205 12:43:29.223152 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-data-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.223496 master-0 kubenswrapper[8731]: I1205 12:43:29.223241 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-log-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.223496 master-0 kubenswrapper[8731]: I1205 12:43:29.223212 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-log-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.223674 master-0 kubenswrapper[8731]: I1205 12:43:29.223543 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-cert-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.223674 master-0 kubenswrapper[8731]: I1205 12:43:29.223651 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-cert-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.223837 master-0 kubenswrapper[8731]: I1205 12:43:29.223756 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-resource-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.223888 master-0 kubenswrapper[8731]: I1205 12:43:29.223839 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-resource-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.223999 master-0 kubenswrapper[8731]: I1205 12:43:29.223947 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-static-pod-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.223999 master-0 kubenswrapper[8731]: I1205 12:43:29.223985 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-static-pod-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:43:29.584774 master-0 kubenswrapper[8731]: I1205 12:43:29.584687 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:29.584774 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:29.584774 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:29.584774 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:29.585084 master-0 kubenswrapper[8731]: I1205 12:43:29.584822 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:30.584990 master-0 kubenswrapper[8731]: I1205 12:43:30.584898 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:30.584990 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:30.584990 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:30.584990 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:30.585830 master-0 kubenswrapper[8731]: I1205 12:43:30.585013 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:31.585448 master-0 kubenswrapper[8731]: I1205 12:43:31.585376 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:31.585448 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:31.585448 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:31.585448 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:31.586497 master-0 kubenswrapper[8731]: I1205 12:43:31.585468 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:32.585255 master-0 kubenswrapper[8731]: I1205 12:43:32.585078 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:32.585255 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:32.585255 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:32.585255 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:32.585255 master-0 kubenswrapper[8731]: I1205 12:43:32.585219 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:33.583809 master-0 kubenswrapper[8731]: I1205 12:43:33.583673 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:33.583809 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:33.583809 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:33.583809 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:33.583809 master-0 kubenswrapper[8731]: I1205 12:43:33.583781 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:34.585707 master-0 kubenswrapper[8731]: I1205 12:43:34.585633 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:34.585707 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:34.585707 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:34.585707 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:34.587230 master-0 kubenswrapper[8731]: I1205 12:43:34.587109 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:35.585248 master-0 kubenswrapper[8731]: I1205 12:43:35.585158 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:35.585248 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:35.585248 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:35.585248 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:35.585766 master-0 kubenswrapper[8731]: I1205 12:43:35.585721 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:36.584909 master-0 kubenswrapper[8731]: I1205 12:43:36.584829 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:36.584909 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:36.584909 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:36.584909 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:36.584909 master-0 kubenswrapper[8731]: I1205 12:43:36.584910 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:37.584949 master-0 kubenswrapper[8731]: I1205 12:43:37.584871 8731 patch_prober.go:28] interesting 
pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:37.584949 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:37.584949 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:37.584949 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:37.585743 master-0 kubenswrapper[8731]: I1205 12:43:37.584979 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:38.584294 master-0 kubenswrapper[8731]: I1205 12:43:38.584243 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:38.584294 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:38.584294 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:38.584294 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:38.584746 master-0 kubenswrapper[8731]: I1205 12:43:38.584688 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:39.584841 master-0 kubenswrapper[8731]: I1205 12:43:39.584744 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:39.584841 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:39.584841 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:39.584841 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:39.585825 master-0 kubenswrapper[8731]: I1205 12:43:39.584869 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:40.584726 master-0 kubenswrapper[8731]: I1205 12:43:40.584668 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:40.584726 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:40.584726 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:40.584726 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:40.585936 master-0 kubenswrapper[8731]: I1205 12:43:40.584728 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:40.851025 master-0 kubenswrapper[8731]: E1205 12:43:40.850843 8731 controller.go:195] "Failed 
to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:43:41.584524 master-0 kubenswrapper[8731]: I1205 12:43:41.584437 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:41.584524 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:41.584524 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:41.584524 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:41.584914 master-0 kubenswrapper[8731]: I1205 12:43:41.584529 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:42.584717 master-0 kubenswrapper[8731]: I1205 12:43:42.584594 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:42.584717 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:42.584717 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:42.584717 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:42.584717 master-0 kubenswrapper[8731]: I1205 12:43:42.584687 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:43.292610 master-0 kubenswrapper[8731]: I1205 12:43:43.292522 8731 generic.go:334] "Generic (PLEG): container finished" podID="c2415969-33ad-418b-9df0-4a6c7bb279db" containerID="dcccd3d0ecb79fd6fed3cade29b8b0d3ae9e791a686c5e6c4f661b34fd1efb10" exitCode=0 Dec 05 12:43:43.292610 master-0 kubenswrapper[8731]: I1205 12:43:43.292594 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"c2415969-33ad-418b-9df0-4a6c7bb279db","Type":"ContainerDied","Data":"dcccd3d0ecb79fd6fed3cade29b8b0d3ae9e791a686c5e6c4f661b34fd1efb10"} Dec 05 12:43:43.584174 master-0 kubenswrapper[8731]: I1205 12:43:43.584030 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:43:43.584174 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:43:43.584174 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:43:43.584174 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:43:43.584703 master-0 kubenswrapper[8731]: I1205 12:43:43.584659 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:43:43.584895 master-0 kubenswrapper[8731]: I1205 
12:43:43.584871 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:43:43.586519 master-0 kubenswrapper[8731]: I1205 12:43:43.586483 8731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"8fbcd680e597e847a58340dd3596ea3cc035b2de307cd72ebb1304a012ac892d"} pod="openshift-ingress/router-default-5465c8b4db-dzlmb" containerMessage="Container router failed startup probe, will be restarted" Dec 05 12:43:43.586725 master-0 kubenswrapper[8731]: I1205 12:43:43.586694 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" containerID="cri-o://8fbcd680e597e847a58340dd3596ea3cc035b2de307cd72ebb1304a012ac892d" gracePeriod=3600 Dec 05 12:43:44.303727 master-0 kubenswrapper[8731]: I1205 12:43:44.303623 8731 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="cc5f5346417c97786dab5fab35e6b2b1d263681d0252f5454c34842f718cd60f" exitCode=1 Dec 05 12:43:44.303998 master-0 kubenswrapper[8731]: I1205 12:43:44.303759 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerDied","Data":"cc5f5346417c97786dab5fab35e6b2b1d263681d0252f5454c34842f718cd60f"} Dec 05 12:43:44.303998 master-0 kubenswrapper[8731]: I1205 12:43:44.303878 8731 scope.go:117] "RemoveContainer" containerID="ac54524887aecec21958b7b4fb65da11e780a16d7a6537965df0e9b00dd407c3" Dec 05 12:43:44.304415 master-0 kubenswrapper[8731]: I1205 12:43:44.304297 8731 scope.go:117] "RemoveContainer" containerID="cc5f5346417c97786dab5fab35e6b2b1d263681d0252f5454c34842f718cd60f" Dec 05 12:43:44.304532 master-0 kubenswrapper[8731]: E1205 12:43:44.304510 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:43:44.696920 master-0 kubenswrapper[8731]: I1205 12:43:44.696807 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Dec 05 12:43:44.880115 master-0 kubenswrapper[8731]: I1205 12:43:44.879926 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c2415969-33ad-418b-9df0-4a6c7bb279db-kubelet-dir\") pod \"c2415969-33ad-418b-9df0-4a6c7bb279db\" (UID: \"c2415969-33ad-418b-9df0-4a6c7bb279db\") " Dec 05 12:43:44.880115 master-0 kubenswrapper[8731]: I1205 12:43:44.880074 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c2415969-33ad-418b-9df0-4a6c7bb279db-var-lock\") pod \"c2415969-33ad-418b-9df0-4a6c7bb279db\" (UID: \"c2415969-33ad-418b-9df0-4a6c7bb279db\") " Dec 05 12:43:44.880460 master-0 kubenswrapper[8731]: I1205 12:43:44.880113 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2415969-33ad-418b-9df0-4a6c7bb279db-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c2415969-33ad-418b-9df0-4a6c7bb279db" (UID: "c2415969-33ad-418b-9df0-4a6c7bb279db"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:43:44.880460 master-0 kubenswrapper[8731]: I1205 12:43:44.880229 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c2415969-33ad-418b-9df0-4a6c7bb279db-kube-api-access\") pod \"c2415969-33ad-418b-9df0-4a6c7bb279db\" (UID: \"c2415969-33ad-418b-9df0-4a6c7bb279db\") " Dec 05 12:43:44.880460 master-0 kubenswrapper[8731]: I1205 12:43:44.880278 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2415969-33ad-418b-9df0-4a6c7bb279db-var-lock" (OuterVolumeSpecName: "var-lock") pod "c2415969-33ad-418b-9df0-4a6c7bb279db" (UID: "c2415969-33ad-418b-9df0-4a6c7bb279db"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:43:44.880673 master-0 kubenswrapper[8731]: I1205 12:43:44.880640 8731 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c2415969-33ad-418b-9df0-4a6c7bb279db-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:44.880673 master-0 kubenswrapper[8731]: I1205 12:43:44.880666 8731 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c2415969-33ad-418b-9df0-4a6c7bb279db-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:44.885719 master-0 kubenswrapper[8731]: I1205 12:43:44.885423 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2415969-33ad-418b-9df0-4a6c7bb279db-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c2415969-33ad-418b-9df0-4a6c7bb279db" (UID: "c2415969-33ad-418b-9df0-4a6c7bb279db"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:43:44.982477 master-0 kubenswrapper[8731]: I1205 12:43:44.982378 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c2415969-33ad-418b-9df0-4a6c7bb279db-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:45.243442 master-0 kubenswrapper[8731]: I1205 12:43:45.243352 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:43:45.314436 master-0 kubenswrapper[8731]: I1205 12:43:45.314365 8731 scope.go:117] "RemoveContainer" containerID="cc5f5346417c97786dab5fab35e6b2b1d263681d0252f5454c34842f718cd60f" Dec 05 12:43:45.314776 master-0 kubenswrapper[8731]: E1205 12:43:45.314636 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:43:45.316600 master-0 kubenswrapper[8731]: I1205 12:43:45.316559 8731 generic.go:334] "Generic (PLEG): container finished" podID="5e09e2af7200e6f9be469dbfd9bb1127" containerID="10cc72919e024bf622844c0acf2e547b438332d060988df1527f059047162f8c" exitCode=1 Dec 05 12:43:45.316672 master-0 kubenswrapper[8731]: I1205 12:43:45.316629 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerDied","Data":"10cc72919e024bf622844c0acf2e547b438332d060988df1527f059047162f8c"} Dec 05 12:43:45.316718 master-0 kubenswrapper[8731]: I1205 12:43:45.316673 8731 scope.go:117] "RemoveContainer" containerID="ea798cf6cf2e0e8f9ed09f878b5232d0740a5bbae085c7d7f2ee3609a0190f95" Dec 05 12:43:45.317026 master-0 kubenswrapper[8731]: I1205 12:43:45.316994 8731 scope.go:117] "RemoveContainer" containerID="10cc72919e024bf622844c0acf2e547b438332d060988df1527f059047162f8c" Dec 05 12:43:45.317211 master-0 kubenswrapper[8731]: E1205 12:43:45.317163 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-scheduler pod=bootstrap-kube-scheduler-master-0_kube-system(5e09e2af7200e6f9be469dbfd9bb1127)\"" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="5e09e2af7200e6f9be469dbfd9bb1127" Dec 05 12:43:45.319537 master-0 kubenswrapper[8731]: I1205 12:43:45.319496 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"c2415969-33ad-418b-9df0-4a6c7bb279db","Type":"ContainerDied","Data":"cff910884ebcba45ebf5c933f29645e420a54c688eec01b58f8fb05ec723cbe8"} Dec 05 12:43:45.319537 master-0 kubenswrapper[8731]: I1205 12:43:45.319529 8731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cff910884ebcba45ebf5c933f29645e420a54c688eec01b58f8fb05ec723cbe8" Dec 05 12:43:45.319658 master-0 kubenswrapper[8731]: I1205 12:43:45.319600 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Dec 05 12:43:50.515685 master-0 kubenswrapper[8731]: I1205 12:43:50.515457 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:43:50.516967 master-0 kubenswrapper[8731]: I1205 12:43:50.516213 8731 scope.go:117] "RemoveContainer" containerID="cc5f5346417c97786dab5fab35e6b2b1d263681d0252f5454c34842f718cd60f" Dec 05 12:43:50.516967 master-0 kubenswrapper[8731]: E1205 12:43:50.516617 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:43:50.852577 master-0 kubenswrapper[8731]: E1205 12:43:50.852282 8731 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:43:51.460943 master-0 kubenswrapper[8731]: I1205 12:43:51.460860 8731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:43:51.461730 master-0 kubenswrapper[8731]: I1205 12:43:51.461701 8731 scope.go:117] "RemoveContainer" containerID="cc5f5346417c97786dab5fab35e6b2b1d263681d0252f5454c34842f718cd60f" Dec 05 12:43:51.462081 master-0 kubenswrapper[8731]: E1205 12:43:51.462045 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:43:58.935162 master-0 kubenswrapper[8731]: I1205 12:43:58.935080 8731 scope.go:117] "RemoveContainer" containerID="10cc72919e024bf622844c0acf2e547b438332d060988df1527f059047162f8c" Dec 05 12:43:58.936044 master-0 kubenswrapper[8731]: E1205 12:43:58.935374 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-scheduler pod=bootstrap-kube-scheduler-master-0_kube-system(5e09e2af7200e6f9be469dbfd9bb1127)\"" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="5e09e2af7200e6f9be469dbfd9bb1127" Dec 05 12:43:59.430512 master-0 kubenswrapper[8731]: I1205 12:43:59.430447 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-rev/0.log" Dec 05 12:43:59.431872 master-0 kubenswrapper[8731]: I1205 12:43:59.431842 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-metrics/0.log" Dec 05 12:43:59.432584 master-0 kubenswrapper[8731]: I1205 12:43:59.432542 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd/0.log" Dec 05 12:43:59.433161 master-0 kubenswrapper[8731]: I1205 12:43:59.433136 
8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcdctl/0.log" Dec 05 12:43:59.435715 master-0 kubenswrapper[8731]: I1205 12:43:59.435680 8731 generic.go:334] "Generic (PLEG): container finished" podID="c24e01603234fe8003f8aae8171b0065" containerID="b6aa6b1922706ed7b2ddfb61ba9e6938912e45e804a0e3f6e5253251e33b6f4e" exitCode=137 Dec 05 12:43:59.435860 master-0 kubenswrapper[8731]: I1205 12:43:59.435840 8731 generic.go:334] "Generic (PLEG): container finished" podID="c24e01603234fe8003f8aae8171b0065" containerID="dd2d6c7cdd5eab77e600768a9929fb4a53e0d7ace9dc3035b564a4d26b57a2ca" exitCode=137 Dec 05 12:43:59.594105 master-0 kubenswrapper[8731]: I1205 12:43:59.593829 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-rev/0.log" Dec 05 12:43:59.595221 master-0 kubenswrapper[8731]: I1205 12:43:59.595195 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-metrics/0.log" Dec 05 12:43:59.597253 master-0 kubenswrapper[8731]: I1205 12:43:59.597203 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd/0.log" Dec 05 12:43:59.597778 master-0 kubenswrapper[8731]: I1205 12:43:59.597709 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcdctl/0.log" Dec 05 12:43:59.598886 master-0 kubenswrapper[8731]: I1205 12:43:59.598863 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 05 12:43:59.721517 master-0 kubenswrapper[8731]: I1205 12:43:59.721443 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-static-pod-dir\") pod \"c24e01603234fe8003f8aae8171b0065\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " Dec 05 12:43:59.721763 master-0 kubenswrapper[8731]: I1205 12:43:59.721586 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-data-dir\") pod \"c24e01603234fe8003f8aae8171b0065\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " Dec 05 12:43:59.721763 master-0 kubenswrapper[8731]: I1205 12:43:59.721623 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-cert-dir\") pod \"c24e01603234fe8003f8aae8171b0065\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " Dec 05 12:43:59.721763 master-0 kubenswrapper[8731]: I1205 12:43:59.721630 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "c24e01603234fe8003f8aae8171b0065" (UID: "c24e01603234fe8003f8aae8171b0065"). InnerVolumeSpecName "static-pod-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:43:59.721763 master-0 kubenswrapper[8731]: I1205 12:43:59.721661 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-resource-dir\") pod \"c24e01603234fe8003f8aae8171b0065\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " Dec 05 12:43:59.721763 master-0 kubenswrapper[8731]: I1205 12:43:59.721701 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "c24e01603234fe8003f8aae8171b0065" (UID: "c24e01603234fe8003f8aae8171b0065"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:43:59.721763 master-0 kubenswrapper[8731]: I1205 12:43:59.721715 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-log-dir\") pod \"c24e01603234fe8003f8aae8171b0065\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " Dec 05 12:43:59.721763 master-0 kubenswrapper[8731]: I1205 12:43:59.721752 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-usr-local-bin\") pod \"c24e01603234fe8003f8aae8171b0065\" (UID: \"c24e01603234fe8003f8aae8171b0065\") " Dec 05 12:43:59.722170 master-0 kubenswrapper[8731]: I1205 12:43:59.721720 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-data-dir" (OuterVolumeSpecName: "data-dir") pod "c24e01603234fe8003f8aae8171b0065" (UID: "c24e01603234fe8003f8aae8171b0065"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:43:59.722170 master-0 kubenswrapper[8731]: I1205 12:43:59.721732 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "c24e01603234fe8003f8aae8171b0065" (UID: "c24e01603234fe8003f8aae8171b0065"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:43:59.722170 master-0 kubenswrapper[8731]: I1205 12:43:59.721751 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-log-dir" (OuterVolumeSpecName: "log-dir") pod "c24e01603234fe8003f8aae8171b0065" (UID: "c24e01603234fe8003f8aae8171b0065"). InnerVolumeSpecName "log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:43:59.722170 master-0 kubenswrapper[8731]: I1205 12:43:59.722091 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "c24e01603234fe8003f8aae8171b0065" (UID: "c24e01603234fe8003f8aae8171b0065"). InnerVolumeSpecName "usr-local-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:43:59.722433 master-0 kubenswrapper[8731]: I1205 12:43:59.722398 8731 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-data-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:59.722433 master-0 kubenswrapper[8731]: I1205 12:43:59.722425 8731 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:59.722501 master-0 kubenswrapper[8731]: I1205 12:43:59.722437 8731 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-log-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:59.722501 master-0 kubenswrapper[8731]: I1205 12:43:59.722449 8731 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:59.722501 master-0 kubenswrapper[8731]: I1205 12:43:59.722460 8731 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-usr-local-bin\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:59.722501 master-0 kubenswrapper[8731]: I1205 12:43:59.722470 8731 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/c24e01603234fe8003f8aae8171b0065-static-pod-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:43:59.945802 master-0 kubenswrapper[8731]: I1205 12:43:59.945732 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c24e01603234fe8003f8aae8171b0065" path="/var/lib/kubelet/pods/c24e01603234fe8003f8aae8171b0065/volumes" Dec 05 12:44:00.447041 master-0 kubenswrapper[8731]: I1205 12:44:00.446963 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-rev/0.log" Dec 05 12:44:00.449013 master-0 kubenswrapper[8731]: I1205 12:44:00.448953 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd-metrics/0.log" Dec 05 12:44:00.449967 master-0 kubenswrapper[8731]: I1205 12:44:00.449943 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcd/0.log" Dec 05 12:44:00.450734 master-0 kubenswrapper[8731]: I1205 12:44:00.450686 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_c24e01603234fe8003f8aae8171b0065/etcdctl/0.log" Dec 05 12:44:00.452242 master-0 kubenswrapper[8731]: I1205 12:44:00.452217 8731 scope.go:117] "RemoveContainer" containerID="765a08d8e028edbc45c6c2083dfc23ad6392f98fa33616533c24f46e8e8af646" Dec 05 12:44:00.452542 master-0 kubenswrapper[8731]: I1205 12:44:00.452488 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 05 12:44:00.477886 master-0 kubenswrapper[8731]: I1205 12:44:00.477815 8731 scope.go:117] "RemoveContainer" containerID="073eeb295461d6cfe17793b727c36b1a0795b59c33714c128e08740e09c87106" Dec 05 12:44:00.505966 master-0 kubenswrapper[8731]: I1205 12:44:00.505906 8731 scope.go:117] "RemoveContainer" containerID="00f11e6defd30a3258a136b83ab00d656bb56a377cbe07aa4f6425fb339a65fe" Dec 05 12:44:00.525921 master-0 kubenswrapper[8731]: I1205 12:44:00.525840 8731 scope.go:117] "RemoveContainer" containerID="b6aa6b1922706ed7b2ddfb61ba9e6938912e45e804a0e3f6e5253251e33b6f4e" Dec 05 12:44:00.546019 master-0 kubenswrapper[8731]: I1205 12:44:00.545967 8731 scope.go:117] "RemoveContainer" containerID="dd2d6c7cdd5eab77e600768a9929fb4a53e0d7ace9dc3035b564a4d26b57a2ca" Dec 05 12:44:00.561440 master-0 kubenswrapper[8731]: I1205 12:44:00.561388 8731 scope.go:117] "RemoveContainer" containerID="49ca67aa7902f9104b46e18f411e1fcfcd3bd696757b09b6ab811180664a0848" Dec 05 12:44:00.584512 master-0 kubenswrapper[8731]: I1205 12:44:00.584463 8731 scope.go:117] "RemoveContainer" containerID="bbac3062d171e964c6a10b8a9a51c923e56d399e294dc2e11516a9c8232774c1" Dec 05 12:44:00.611069 master-0 kubenswrapper[8731]: I1205 12:44:00.611023 8731 scope.go:117] "RemoveContainer" containerID="e6fb13503e825480506895b04ab6a86f432b8d4ca2560cfbca6f20c4af8b50db" Dec 05 12:44:00.853923 master-0 kubenswrapper[8731]: E1205 12:44:00.853693 8731 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:44:03.025054 master-0 kubenswrapper[8731]: E1205 12:44:03.024814 8731 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.187e52505884dfc8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:c24e01603234fe8003f8aae8171b0065,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Killing,Message:Stopping container etcd-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:43:29.01177748 +0000 UTC m=+707.315761657,LastTimestamp:2025-12-05 12:43:29.01177748 +0000 UTC m=+707.315761657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:44:05.935230 master-0 kubenswrapper[8731]: I1205 12:44:05.935136 8731 scope.go:117] "RemoveContainer" containerID="cc5f5346417c97786dab5fab35e6b2b1d263681d0252f5454c34842f718cd60f" Dec 05 12:44:05.936073 master-0 kubenswrapper[8731]: E1205 12:44:05.935495 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:44:07.933952 master-0 kubenswrapper[8731]: I1205 12:44:07.933862 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 05 12:44:07.967789 master-0 kubenswrapper[8731]: I1205 12:44:07.967693 8731 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="fde55ae0-2a24-4980-9ad8-db1079735b66" Dec 05 12:44:07.967789 master-0 kubenswrapper[8731]: I1205 12:44:07.967768 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="fde55ae0-2a24-4980-9ad8-db1079735b66" Dec 05 12:44:09.525418 master-0 kubenswrapper[8731]: I1205 12:44:09.525350 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/3.log" Dec 05 12:44:09.526331 master-0 kubenswrapper[8731]: I1205 12:44:09.526274 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/2.log" Dec 05 12:44:09.527032 master-0 kubenswrapper[8731]: I1205 12:44:09.526987 8731 generic.go:334] "Generic (PLEG): container finished" podID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" containerID="9a04a03647acf84231bb505b6acd2c588670eb0bd70e0221386d9b53a3261e61" exitCode=1 Dec 05 12:44:09.527103 master-0 kubenswrapper[8731]: I1205 12:44:09.527038 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" event={"ID":"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7","Type":"ContainerDied","Data":"9a04a03647acf84231bb505b6acd2c588670eb0bd70e0221386d9b53a3261e61"} Dec 05 12:44:09.527103 master-0 kubenswrapper[8731]: I1205 12:44:09.527093 8731 scope.go:117] "RemoveContainer" containerID="6f8055fdf3cea411e4a76860001f402a5742a0c41d34f8fa2265a84c73970742" Dec 05 12:44:09.528155 master-0 kubenswrapper[8731]: I1205 12:44:09.527846 8731 scope.go:117] "RemoveContainer" containerID="9a04a03647acf84231bb505b6acd2c588670eb0bd70e0221386d9b53a3261e61" Dec 05 12:44:09.528155 master-0 kubenswrapper[8731]: E1205 12:44:09.528093 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-7xrk6_openshift-ingress-operator(a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" podUID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" Dec 05 12:44:09.935333 master-0 kubenswrapper[8731]: I1205 12:44:09.935090 8731 scope.go:117] "RemoveContainer" containerID="10cc72919e024bf622844c0acf2e547b438332d060988df1527f059047162f8c" Dec 05 12:44:10.539514 master-0 kubenswrapper[8731]: I1205 12:44:10.539344 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"5e09e2af7200e6f9be469dbfd9bb1127","Type":"ContainerStarted","Data":"89f655c6aef093abcb807f2098d9b888059fd9fc72675b02e6864da8c65272c4"} Dec 05 12:44:10.543903 master-0 kubenswrapper[8731]: I1205 12:44:10.543821 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/3.log" Dec 05 12:44:10.546708 master-0 kubenswrapper[8731]: I1205 12:44:10.546639 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_af196a48-6fcc-47d1-95ac-7c0acd63dd21/installer/0.log" Dec 05 12:44:10.546839 master-0 
kubenswrapper[8731]: I1205 12:44:10.546726 8731 generic.go:334] "Generic (PLEG): container finished" podID="af196a48-6fcc-47d1-95ac-7c0acd63dd21" containerID="7ece5c46267ba23e02ffd4ff25da31145731b6190d3fd68ce186c1aedbb31e5d" exitCode=1 Dec 05 12:44:10.546839 master-0 kubenswrapper[8731]: I1205 12:44:10.546786 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"af196a48-6fcc-47d1-95ac-7c0acd63dd21","Type":"ContainerDied","Data":"7ece5c46267ba23e02ffd4ff25da31145731b6190d3fd68ce186c1aedbb31e5d"} Dec 05 12:44:10.854443 master-0 kubenswrapper[8731]: E1205 12:44:10.854298 8731 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:44:11.842735 master-0 kubenswrapper[8731]: I1205 12:44:11.842694 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_af196a48-6fcc-47d1-95ac-7c0acd63dd21/installer/0.log" Dec 05 12:44:11.843538 master-0 kubenswrapper[8731]: I1205 12:44:11.843518 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 05 12:44:11.867022 master-0 kubenswrapper[8731]: I1205 12:44:11.866972 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af196a48-6fcc-47d1-95ac-7c0acd63dd21-kubelet-dir\") pod \"af196a48-6fcc-47d1-95ac-7c0acd63dd21\" (UID: \"af196a48-6fcc-47d1-95ac-7c0acd63dd21\") " Dec 05 12:44:11.867611 master-0 kubenswrapper[8731]: I1205 12:44:11.867333 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af196a48-6fcc-47d1-95ac-7c0acd63dd21-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "af196a48-6fcc-47d1-95ac-7c0acd63dd21" (UID: "af196a48-6fcc-47d1-95ac-7c0acd63dd21"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:44:11.867700 master-0 kubenswrapper[8731]: I1205 12:44:11.867595 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af196a48-6fcc-47d1-95ac-7c0acd63dd21-kube-api-access\") pod \"af196a48-6fcc-47d1-95ac-7c0acd63dd21\" (UID: \"af196a48-6fcc-47d1-95ac-7c0acd63dd21\") " Dec 05 12:44:11.867786 master-0 kubenswrapper[8731]: I1205 12:44:11.867774 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af196a48-6fcc-47d1-95ac-7c0acd63dd21-var-lock\") pod \"af196a48-6fcc-47d1-95ac-7c0acd63dd21\" (UID: \"af196a48-6fcc-47d1-95ac-7c0acd63dd21\") " Dec 05 12:44:11.867865 master-0 kubenswrapper[8731]: I1205 12:44:11.867828 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af196a48-6fcc-47d1-95ac-7c0acd63dd21-var-lock" (OuterVolumeSpecName: "var-lock") pod "af196a48-6fcc-47d1-95ac-7c0acd63dd21" (UID: "af196a48-6fcc-47d1-95ac-7c0acd63dd21"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:44:11.868241 master-0 kubenswrapper[8731]: I1205 12:44:11.868225 8731 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/af196a48-6fcc-47d1-95ac-7c0acd63dd21-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:44:11.868324 master-0 kubenswrapper[8731]: I1205 12:44:11.868310 8731 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af196a48-6fcc-47d1-95ac-7c0acd63dd21-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:44:11.871306 master-0 kubenswrapper[8731]: I1205 12:44:11.871265 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af196a48-6fcc-47d1-95ac-7c0acd63dd21-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "af196a48-6fcc-47d1-95ac-7c0acd63dd21" (UID: "af196a48-6fcc-47d1-95ac-7c0acd63dd21"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:44:11.970017 master-0 kubenswrapper[8731]: I1205 12:44:11.969980 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af196a48-6fcc-47d1-95ac-7c0acd63dd21-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:44:12.560382 master-0 kubenswrapper[8731]: I1205 12:44:12.560308 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_af196a48-6fcc-47d1-95ac-7c0acd63dd21/installer/0.log" Dec 05 12:44:12.560382 master-0 kubenswrapper[8731]: I1205 12:44:12.560398 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"af196a48-6fcc-47d1-95ac-7c0acd63dd21","Type":"ContainerDied","Data":"64823040ff2341cdbb93a061017009c800e15a20e0d4a4f9a76225782f444caf"} Dec 05 12:44:12.560382 master-0 kubenswrapper[8731]: I1205 12:44:12.560425 8731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64823040ff2341cdbb93a061017009c800e15a20e0d4a4f9a76225782f444caf" Dec 05 12:44:12.560819 master-0 kubenswrapper[8731]: I1205 12:44:12.560518 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 05 12:44:16.935687 master-0 kubenswrapper[8731]: I1205 12:44:16.935601 8731 scope.go:117] "RemoveContainer" containerID="cc5f5346417c97786dab5fab35e6b2b1d263681d0252f5454c34842f718cd60f" Dec 05 12:44:16.936457 master-0 kubenswrapper[8731]: E1205 12:44:16.936075 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:44:20.855944 master-0 kubenswrapper[8731]: E1205 12:44:20.855829 8731 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:44:20.855944 master-0 kubenswrapper[8731]: I1205 12:44:20.855916 8731 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 05 12:44:23.935022 master-0 kubenswrapper[8731]: I1205 12:44:23.934937 8731 scope.go:117] "RemoveContainer" containerID="9a04a03647acf84231bb505b6acd2c588670eb0bd70e0221386d9b53a3261e61" Dec 05 12:44:23.935681 master-0 kubenswrapper[8731]: E1205 12:44:23.935474 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-7xrk6_openshift-ingress-operator(a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" podUID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" Dec 05 12:44:30.689914 master-0 kubenswrapper[8731]: I1205 12:44:30.689824 8731 generic.go:334] "Generic (PLEG): container finished" podID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerID="8fbcd680e597e847a58340dd3596ea3cc035b2de307cd72ebb1304a012ac892d" exitCode=0 Dec 05 12:44:30.689914 master-0 kubenswrapper[8731]: I1205 12:44:30.689901 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" event={"ID":"20a72c8b-0f12-446b-8a42-53d98864c8f8","Type":"ContainerDied","Data":"8fbcd680e597e847a58340dd3596ea3cc035b2de307cd72ebb1304a012ac892d"} Dec 05 12:44:30.690927 master-0 kubenswrapper[8731]: I1205 12:44:30.689955 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" event={"ID":"20a72c8b-0f12-446b-8a42-53d98864c8f8","Type":"ContainerStarted","Data":"eb64e2421ed896450777b0f0f93d8bac59a879b9f30c7599b0e2a7c59b1f3be8"} Dec 05 12:44:30.690927 master-0 kubenswrapper[8731]: I1205 12:44:30.689983 8731 scope.go:117] "RemoveContainer" containerID="8fdea4402ae8cab53b0ad7f0ecba9b1899f62586699c403d4a3f309c69f3a64e" Dec 05 12:44:30.856976 master-0 kubenswrapper[8731]: E1205 12:44:30.856907 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Dec 05 12:44:31.582427 master-0 kubenswrapper[8731]: 
I1205 12:44:31.582230 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:44:31.586539 master-0 kubenswrapper[8731]: I1205 12:44:31.586477 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:31.586539 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:31.586539 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:31.586539 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:31.586778 master-0 kubenswrapper[8731]: I1205 12:44:31.586564 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:31.699447 master-0 kubenswrapper[8731]: I1205 12:44:31.699374 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-xwx26_b8233dad-bd19-4842-a4d5-cfa84f1feb83/approver/1.log" Dec 05 12:44:31.700326 master-0 kubenswrapper[8731]: I1205 12:44:31.699930 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-xwx26_b8233dad-bd19-4842-a4d5-cfa84f1feb83/approver/0.log" Dec 05 12:44:31.700326 master-0 kubenswrapper[8731]: I1205 12:44:31.700240 8731 generic.go:334] "Generic (PLEG): container finished" podID="b8233dad-bd19-4842-a4d5-cfa84f1feb83" containerID="efd0af11329fc9886861d20bcf790f4afa476fb62b8a37aabf75eec470dca7ba" exitCode=1 Dec 05 12:44:31.700326 master-0 kubenswrapper[8731]: I1205 12:44:31.700305 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-xwx26" event={"ID":"b8233dad-bd19-4842-a4d5-cfa84f1feb83","Type":"ContainerDied","Data":"efd0af11329fc9886861d20bcf790f4afa476fb62b8a37aabf75eec470dca7ba"} Dec 05 12:44:31.700530 master-0 kubenswrapper[8731]: I1205 12:44:31.700354 8731 scope.go:117] "RemoveContainer" containerID="41718b57d6d2e36d2cb94e43774b239e600e6619dc10d3c14a0345e610d821c2" Dec 05 12:44:31.700990 master-0 kubenswrapper[8731]: I1205 12:44:31.700962 8731 scope.go:117] "RemoveContainer" containerID="efd0af11329fc9886861d20bcf790f4afa476fb62b8a37aabf75eec470dca7ba" Dec 05 12:44:31.935697 master-0 kubenswrapper[8731]: I1205 12:44:31.935591 8731 scope.go:117] "RemoveContainer" containerID="cc5f5346417c97786dab5fab35e6b2b1d263681d0252f5454c34842f718cd60f" Dec 05 12:44:31.936104 master-0 kubenswrapper[8731]: E1205 12:44:31.936040 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:44:32.586394 master-0 kubenswrapper[8731]: I1205 12:44:32.586314 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 
12:44:32.586394 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:32.586394 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:32.586394 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:32.586808 master-0 kubenswrapper[8731]: I1205 12:44:32.586403 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:32.716640 master-0 kubenswrapper[8731]: I1205 12:44:32.716587 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-xwx26_b8233dad-bd19-4842-a4d5-cfa84f1feb83/approver/1.log" Dec 05 12:44:32.717462 master-0 kubenswrapper[8731]: I1205 12:44:32.717234 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-xwx26" event={"ID":"b8233dad-bd19-4842-a4d5-cfa84f1feb83","Type":"ContainerStarted","Data":"247a3b2c777f8fe2f346367a22bb39c214fdb1a922b3b827dcfce8dd159f9390"} Dec 05 12:44:33.581813 master-0 kubenswrapper[8731]: I1205 12:44:33.581698 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:44:33.584623 master-0 kubenswrapper[8731]: I1205 12:44:33.584566 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:33.584623 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:33.584623 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:33.584623 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:33.584871 master-0 kubenswrapper[8731]: I1205 12:44:33.584637 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:34.586427 master-0 kubenswrapper[8731]: I1205 12:44:34.586272 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:34.586427 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:34.586427 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:34.586427 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:34.586427 master-0 kubenswrapper[8731]: I1205 12:44:34.586425 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:35.584935 master-0 kubenswrapper[8731]: I1205 12:44:35.584800 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:35.584935 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld 
Dec 05 12:44:35.584935 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:35.584935 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:35.584935 master-0 kubenswrapper[8731]: I1205 12:44:35.584904 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:36.585494 master-0 kubenswrapper[8731]: I1205 12:44:36.585446 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:36.585494 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:36.585494 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:36.585494 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:36.586166 master-0 kubenswrapper[8731]: I1205 12:44:36.585506 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:36.935782 master-0 kubenswrapper[8731]: I1205 12:44:36.935580 8731 scope.go:117] "RemoveContainer" containerID="9a04a03647acf84231bb505b6acd2c588670eb0bd70e0221386d9b53a3261e61" Dec 05 12:44:36.936090 master-0 kubenswrapper[8731]: E1205 12:44:36.936034 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-7xrk6_openshift-ingress-operator(a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" podUID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" Dec 05 12:44:37.029389 master-0 kubenswrapper[8731]: E1205 12:44:37.029167 8731 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event=< Dec 05 12:44:37.029389 master-0 kubenswrapper[8731]: &Event{ObjectMeta:{router-default-5465c8b4db-dzlmb.187e521161f2d710 openshift-ingress 10245 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-ingress,Name:router-default-5465c8b4db-dzlmb,UID:20a72c8b-0f12-446b-8a42-53d98864c8f8,APIVersion:v1,ResourceVersion:10008,FieldPath:spec.containers{router},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Dec 05 12:44:37.029389 master-0 kubenswrapper[8731]: body: [-]backend-http failed: reason withheld Dec 05 12:44:37.029389 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:37.029389 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:37.029389 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:37.029389 master-0 kubenswrapper[8731]: Dec 05 12:44:37.029389 master-0 kubenswrapper[8731]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:38:58 +0000 UTC,LastTimestamp:2025-12-05 12:43:29.58477686 +0000 UTC m=+707.888761067,Count:226,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} 
Dec 05 12:44:37.029389 master-0 kubenswrapper[8731]: > Dec 05 12:44:37.586364 master-0 kubenswrapper[8731]: I1205 12:44:37.586255 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:37.586364 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:37.586364 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:37.586364 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:37.587483 master-0 kubenswrapper[8731]: I1205 12:44:37.586411 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:38.585554 master-0 kubenswrapper[8731]: I1205 12:44:38.585503 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:38.585554 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:38.585554 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:38.585554 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:38.586014 master-0 kubenswrapper[8731]: I1205 12:44:38.585978 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:39.584996 master-0 kubenswrapper[8731]: I1205 12:44:39.584930 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:39.584996 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:39.584996 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:39.584996 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:39.585648 master-0 kubenswrapper[8731]: I1205 12:44:39.585000 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:40.584810 master-0 kubenswrapper[8731]: I1205 12:44:40.584757 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:40.584810 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:40.584810 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:40.584810 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:40.585413 master-0 kubenswrapper[8731]: I1205 12:44:40.584827 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:41.059465 master-0 kubenswrapper[8731]: E1205 12:44:41.059369 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="the server was unable to return a response in the time allotted, but may still be processing the request (get leases.coordination.k8s.io master-0)" interval="400ms" Dec 05 12:44:41.584950 master-0 kubenswrapper[8731]: I1205 12:44:41.584878 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:41.584950 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:41.584950 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:41.584950 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:41.586055 master-0 kubenswrapper[8731]: I1205 12:44:41.584995 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:41.970839 master-0 kubenswrapper[8731]: E1205 12:44:41.970769 8731 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 05 12:44:41.971544 master-0 kubenswrapper[8731]: I1205 12:44:41.971501 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 05 12:44:42.584781 master-0 kubenswrapper[8731]: I1205 12:44:42.584678 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:42.584781 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:42.584781 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:42.584781 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:42.584781 master-0 kubenswrapper[8731]: I1205 12:44:42.584761 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:42.791813 master-0 kubenswrapper[8731]: I1205 12:44:42.791587 8731 generic.go:334] "Generic (PLEG): container finished" podID="58d12e893528ad53a994f10901a644ea" containerID="7fe2ad5243db75a4d0831218b0b4d047af3794e202e2009112af905d4919bd2b" exitCode=0 Dec 05 12:44:42.791813 master-0 kubenswrapper[8731]: I1205 12:44:42.791705 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerDied","Data":"7fe2ad5243db75a4d0831218b0b4d047af3794e202e2009112af905d4919bd2b"} Dec 05 12:44:42.791813 master-0 kubenswrapper[8731]: I1205 12:44:42.791786 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"a484ee5e7b41d00e01ba54d4ad8789422ba018cb058ac26feb10517be87018de"} Dec 05 12:44:42.792500 master-0 kubenswrapper[8731]: 
I1205 12:44:42.792296 8731 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="fde55ae0-2a24-4980-9ad8-db1079735b66" Dec 05 12:44:42.792500 master-0 kubenswrapper[8731]: I1205 12:44:42.792321 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="fde55ae0-2a24-4980-9ad8-db1079735b66" Dec 05 12:44:43.295142 master-0 kubenswrapper[8731]: I1205 12:44:43.295058 8731 status_manager.go:851] "Failed to get status for pod" podUID="c2415969-33ad-418b-9df0-4a6c7bb279db" pod="openshift-etcd/installer-2-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-2-master-0)" Dec 05 12:44:43.585975 master-0 kubenswrapper[8731]: I1205 12:44:43.584830 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:43.585975 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:43.585975 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:43.585975 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:43.585975 master-0 kubenswrapper[8731]: I1205 12:44:43.584927 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:44.585555 master-0 kubenswrapper[8731]: I1205 12:44:44.585436 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:44.585555 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:44.585555 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:44.585555 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:44.586223 master-0 kubenswrapper[8731]: I1205 12:44:44.585563 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:45.586387 master-0 kubenswrapper[8731]: I1205 12:44:45.586270 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:45.586387 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:45.586387 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:45.586387 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:45.586387 master-0 kubenswrapper[8731]: I1205 12:44:45.586387 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:46.585831 master-0 kubenswrapper[8731]: I1205 12:44:46.585691 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:46.585831 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:46.585831 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:46.585831 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:46.585831 master-0 kubenswrapper[8731]: I1205 12:44:46.585811 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:46.935099 master-0 kubenswrapper[8731]: I1205 12:44:46.934903 8731 scope.go:117] "RemoveContainer" containerID="cc5f5346417c97786dab5fab35e6b2b1d263681d0252f5454c34842f718cd60f" Dec 05 12:44:46.935595 master-0 kubenswrapper[8731]: E1205 12:44:46.935515 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:44:47.585424 master-0 kubenswrapper[8731]: I1205 12:44:47.585308 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:47.585424 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:47.585424 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:47.585424 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:47.585424 master-0 kubenswrapper[8731]: I1205 12:44:47.585382 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:47.935431 master-0 kubenswrapper[8731]: I1205 12:44:47.935230 8731 scope.go:117] "RemoveContainer" containerID="9a04a03647acf84231bb505b6acd2c588670eb0bd70e0221386d9b53a3261e61" Dec 05 12:44:47.936130 master-0 kubenswrapper[8731]: E1205 12:44:47.935612 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-7xrk6_openshift-ingress-operator(a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" podUID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" Dec 05 12:44:48.586122 master-0 kubenswrapper[8731]: I1205 12:44:48.585428 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:48.586122 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:48.586122 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:48.586122 master-0 kubenswrapper[8731]: healthz check failed Dec 05 
12:44:48.586122 master-0 kubenswrapper[8731]: I1205 12:44:48.585575 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:49.584783 master-0 kubenswrapper[8731]: I1205 12:44:49.584737 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:49.584783 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:49.584783 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:49.584783 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:49.585484 master-0 kubenswrapper[8731]: I1205 12:44:49.585403 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:50.586294 master-0 kubenswrapper[8731]: I1205 12:44:50.586150 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:50.586294 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:50.586294 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:50.586294 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:50.587127 master-0 kubenswrapper[8731]: I1205 12:44:50.586297 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:51.460492 master-0 kubenswrapper[8731]: E1205 12:44:51.459998 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Dec 05 12:44:51.586123 master-0 kubenswrapper[8731]: I1205 12:44:51.586002 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:51.586123 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:51.586123 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:51.586123 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:51.586123 master-0 kubenswrapper[8731]: I1205 12:44:51.586106 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:52.585930 master-0 kubenswrapper[8731]: I1205 12:44:52.585776 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:52.585930 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:52.585930 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:52.585930 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:52.585930 master-0 kubenswrapper[8731]: I1205 12:44:52.585910 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:53.584726 master-0 kubenswrapper[8731]: I1205 12:44:53.584615 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:53.584726 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:53.584726 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:53.584726 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:53.585110 master-0 kubenswrapper[8731]: I1205 12:44:53.584744 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:54.584983 master-0 kubenswrapper[8731]: I1205 12:44:54.584891 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:54.584983 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:54.584983 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:54.584983 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:54.584983 master-0 kubenswrapper[8731]: I1205 12:44:54.584984 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:55.586395 master-0 kubenswrapper[8731]: I1205 12:44:55.586247 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:55.586395 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:55.586395 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:55.586395 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:55.587542 master-0 kubenswrapper[8731]: I1205 12:44:55.586420 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:56.585436 master-0 kubenswrapper[8731]: I1205 12:44:56.585301 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:56.585436 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:56.585436 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:56.585436 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:56.585436 master-0 kubenswrapper[8731]: I1205 12:44:56.585433 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:57.586439 master-0 kubenswrapper[8731]: I1205 12:44:57.586278 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:57.586439 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:57.586439 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:57.586439 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:57.586439 master-0 kubenswrapper[8731]: I1205 12:44:57.586422 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:57.935038 master-0 kubenswrapper[8731]: I1205 12:44:57.934839 8731 scope.go:117] "RemoveContainer" containerID="cc5f5346417c97786dab5fab35e6b2b1d263681d0252f5454c34842f718cd60f" Dec 05 12:44:57.935561 master-0 kubenswrapper[8731]: E1205 12:44:57.935161 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:44:58.585867 master-0 kubenswrapper[8731]: I1205 12:44:58.585780 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:58.585867 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:58.585867 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:58.585867 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:58.585867 master-0 kubenswrapper[8731]: I1205 12:44:58.585865 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:44:59.585141 master-0 kubenswrapper[8731]: I1205 12:44:59.585043 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:44:59.585141 master-0 
kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:44:59.585141 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:44:59.585141 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:44:59.585141 master-0 kubenswrapper[8731]: I1205 12:44:59.585131 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:00.585595 master-0 kubenswrapper[8731]: I1205 12:45:00.585492 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:00.585595 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:00.585595 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:00.585595 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:00.585595 master-0 kubenswrapper[8731]: I1205 12:45:00.585569 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:00.934454 master-0 kubenswrapper[8731]: I1205 12:45:00.934251 8731 scope.go:117] "RemoveContainer" containerID="9a04a03647acf84231bb505b6acd2c588670eb0bd70e0221386d9b53a3261e61" Dec 05 12:45:01.585048 master-0 kubenswrapper[8731]: I1205 12:45:01.584950 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:01.585048 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:01.585048 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:01.585048 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:01.585048 master-0 kubenswrapper[8731]: I1205 12:45:01.585032 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:01.946163 master-0 kubenswrapper[8731]: I1205 12:45:01.946065 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/3.log" Dec 05 12:45:01.948592 master-0 kubenswrapper[8731]: I1205 12:45:01.948522 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" event={"ID":"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7","Type":"ContainerStarted","Data":"a62572546062b2df435bc85f27bda94544b75d65580e59f21beaef134a43b821"} Dec 05 12:45:02.261794 master-0 kubenswrapper[8731]: E1205 12:45:02.261509 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Dec 05 12:45:02.585317 master-0 kubenswrapper[8731]: I1205 
12:45:02.585170 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:02.585317 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:02.585317 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:02.585317 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:02.585631 master-0 kubenswrapper[8731]: I1205 12:45:02.585315 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:03.584872 master-0 kubenswrapper[8731]: I1205 12:45:03.584786 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:03.584872 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:03.584872 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:03.584872 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:03.584872 master-0 kubenswrapper[8731]: I1205 12:45:03.584880 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:04.584967 master-0 kubenswrapper[8731]: I1205 12:45:04.584845 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:04.584967 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:04.584967 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:04.584967 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:04.585849 master-0 kubenswrapper[8731]: I1205 12:45:04.585041 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:05.585543 master-0 kubenswrapper[8731]: I1205 12:45:05.585415 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:05.585543 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:05.585543 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:05.585543 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:05.585543 master-0 kubenswrapper[8731]: I1205 12:45:05.585528 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:06.585659 master-0 kubenswrapper[8731]: 
I1205 12:45:06.585563 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:06.585659 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:06.585659 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:06.585659 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:06.586796 master-0 kubenswrapper[8731]: I1205 12:45:06.586317 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:07.587268 master-0 kubenswrapper[8731]: I1205 12:45:07.587138 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:07.587268 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:07.587268 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:07.587268 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:07.587925 master-0 kubenswrapper[8731]: I1205 12:45:07.587271 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:08.584864 master-0 kubenswrapper[8731]: I1205 12:45:08.584765 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:08.584864 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:08.584864 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:08.584864 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:08.584864 master-0 kubenswrapper[8731]: I1205 12:45:08.584870 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:09.585387 master-0 kubenswrapper[8731]: I1205 12:45:09.585243 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:09.585387 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:09.585387 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:09.585387 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:09.585387 master-0 kubenswrapper[8731]: I1205 12:45:09.585381 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:10.584669 master-0 
kubenswrapper[8731]: I1205 12:45:10.584511 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:10.584669 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:10.584669 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:10.584669 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:10.584669 master-0 kubenswrapper[8731]: I1205 12:45:10.584663 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:11.034868 master-0 kubenswrapper[8731]: E1205 12:45:11.034665 8731 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187e51f9f940c109 kube-system 7596 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:8b47694fcc32464ab24d09c23d6efb57,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:BackOff,Message:Back-off restarting failed container kube-controller-manager in pod bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:37:18 +0000 UTC,LastTimestamp:2025-12-05 12:43:44.304478969 +0000 UTC m=+722.608463136,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:45:11.585454 master-0 kubenswrapper[8731]: I1205 12:45:11.585362 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:11.585454 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:11.585454 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:11.585454 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:11.585746 master-0 kubenswrapper[8731]: I1205 12:45:11.585471 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:12.585674 master-0 kubenswrapper[8731]: I1205 12:45:12.585367 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:12.585674 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:12.585674 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:12.585674 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:12.585674 master-0 kubenswrapper[8731]: I1205 12:45:12.585435 8731 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:12.934775 master-0 kubenswrapper[8731]: I1205 12:45:12.934604 8731 scope.go:117] "RemoveContainer" containerID="cc5f5346417c97786dab5fab35e6b2b1d263681d0252f5454c34842f718cd60f" Dec 05 12:45:13.586508 master-0 kubenswrapper[8731]: I1205 12:45:13.585505 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:13.586508 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:13.586508 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:13.586508 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:13.586508 master-0 kubenswrapper[8731]: I1205 12:45:13.585618 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:13.863268 master-0 kubenswrapper[8731]: E1205 12:45:13.863018 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 05 12:45:14.057537 master-0 kubenswrapper[8731]: I1205 12:45:14.057455 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1"} Dec 05 12:45:14.586252 master-0 kubenswrapper[8731]: I1205 12:45:14.586109 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:14.586252 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:14.586252 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:14.586252 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:14.587341 master-0 kubenswrapper[8731]: I1205 12:45:14.586394 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:15.070598 master-0 kubenswrapper[8731]: I1205 12:45:15.070504 8731 generic.go:334] "Generic (PLEG): container finished" podID="1e6babfe-724a-4eab-bb3b-bc318bf57b70" containerID="9059626ad4510705fe438e1803257849f89596beec2662512048f0044416af28" exitCode=0 Dec 05 12:45:15.070598 master-0 kubenswrapper[8731]: I1205 12:45:15.070579 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" event={"ID":"1e6babfe-724a-4eab-bb3b-bc318bf57b70","Type":"ContainerDied","Data":"9059626ad4510705fe438e1803257849f89596beec2662512048f0044416af28"} Dec 05 12:45:15.071477 master-0 
kubenswrapper[8731]: I1205 12:45:15.071424 8731 scope.go:117] "RemoveContainer" containerID="9059626ad4510705fe438e1803257849f89596beec2662512048f0044416af28" Dec 05 12:45:15.242978 master-0 kubenswrapper[8731]: I1205 12:45:15.242914 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:45:15.585467 master-0 kubenswrapper[8731]: I1205 12:45:15.585360 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:15.585467 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:15.585467 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:15.585467 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:15.585467 master-0 kubenswrapper[8731]: I1205 12:45:15.585466 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:16.083150 master-0 kubenswrapper[8731]: I1205 12:45:16.083069 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" event={"ID":"1e6babfe-724a-4eab-bb3b-bc318bf57b70","Type":"ContainerStarted","Data":"c57ae702507ae32ef2cc00a4261c94cb1e11a39f67dcebd33d947544fc98f957"} Dec 05 12:45:16.084171 master-0 kubenswrapper[8731]: I1205 12:45:16.083600 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:45:16.086161 master-0 kubenswrapper[8731]: I1205 12:45:16.086079 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:45:16.585828 master-0 kubenswrapper[8731]: I1205 12:45:16.585676 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:16.585828 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:16.585828 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:16.585828 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:16.586337 master-0 kubenswrapper[8731]: I1205 12:45:16.585837 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:16.795858 master-0 kubenswrapper[8731]: E1205 12:45:16.795711 8731 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 05 12:45:17.094349 master-0 kubenswrapper[8731]: I1205 12:45:17.094288 8731 generic.go:334] "Generic (PLEG): container finished" podID="58d12e893528ad53a994f10901a644ea" containerID="073375b200bc70b30fb0cad0a5ecce97a68446c026019d9c52074056ad94e0a7" exitCode=0 Dec 05 12:45:17.094930 master-0 kubenswrapper[8731]: I1205 12:45:17.094406 8731 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerDied","Data":"073375b200bc70b30fb0cad0a5ecce97a68446c026019d9c52074056ad94e0a7"} Dec 05 12:45:17.095080 master-0 kubenswrapper[8731]: I1205 12:45:17.095042 8731 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="fde55ae0-2a24-4980-9ad8-db1079735b66" Dec 05 12:45:17.095124 master-0 kubenswrapper[8731]: I1205 12:45:17.095083 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="fde55ae0-2a24-4980-9ad8-db1079735b66" Dec 05 12:45:17.585865 master-0 kubenswrapper[8731]: I1205 12:45:17.585703 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:17.585865 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:17.585865 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:17.585865 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:17.585865 master-0 kubenswrapper[8731]: I1205 12:45:17.585822 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:18.113306 master-0 kubenswrapper[8731]: I1205 12:45:18.113223 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm_dbe144b5-3b78-4946-bbf9-b825b0e47b07/cluster-cloud-controller-manager/0.log" Dec 05 12:45:18.114236 master-0 kubenswrapper[8731]: I1205 12:45:18.113381 8731 generic.go:334] "Generic (PLEG): container finished" podID="dbe144b5-3b78-4946-bbf9-b825b0e47b07" containerID="b36175e01241a922ef57ef9968701e5af5fa8f55a7287b6d3fe1828d9e78254f" exitCode=1 Dec 05 12:45:18.114236 master-0 kubenswrapper[8731]: I1205 12:45:18.113496 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" event={"ID":"dbe144b5-3b78-4946-bbf9-b825b0e47b07","Type":"ContainerDied","Data":"b36175e01241a922ef57ef9968701e5af5fa8f55a7287b6d3fe1828d9e78254f"} Dec 05 12:45:18.114674 master-0 kubenswrapper[8731]: I1205 12:45:18.114615 8731 scope.go:117] "RemoveContainer" containerID="b36175e01241a922ef57ef9968701e5af5fa8f55a7287b6d3fe1828d9e78254f" Dec 05 12:45:18.587428 master-0 kubenswrapper[8731]: I1205 12:45:18.587295 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:18.587428 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:18.587428 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:18.587428 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:18.587901 master-0 kubenswrapper[8731]: I1205 12:45:18.587430 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 
05 12:45:19.127870 master-0 kubenswrapper[8731]: I1205 12:45:19.127778 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm_dbe144b5-3b78-4946-bbf9-b825b0e47b07/cluster-cloud-controller-manager/0.log" Dec 05 12:45:19.128895 master-0 kubenswrapper[8731]: I1205 12:45:19.127896 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" event={"ID":"dbe144b5-3b78-4946-bbf9-b825b0e47b07","Type":"ContainerStarted","Data":"85646088c9e7224de8ae7078ab75c2d8ce77a9f7abbf5c5fe5944cd4c31dc3b4"} Dec 05 12:45:19.586019 master-0 kubenswrapper[8731]: I1205 12:45:19.585911 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:19.586019 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:19.586019 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:19.586019 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:19.586777 master-0 kubenswrapper[8731]: I1205 12:45:19.586041 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:20.516474 master-0 kubenswrapper[8731]: I1205 12:45:20.516342 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:45:20.585388 master-0 kubenswrapper[8731]: I1205 12:45:20.585253 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:20.585388 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:20.585388 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:20.585388 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:20.585861 master-0 kubenswrapper[8731]: I1205 12:45:20.585394 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:21.584955 master-0 kubenswrapper[8731]: I1205 12:45:21.584848 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:21.584955 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:21.584955 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:21.584955 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:21.584955 master-0 kubenswrapper[8731]: I1205 12:45:21.584937 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Dec 05 12:45:22.585638 master-0 kubenswrapper[8731]: I1205 12:45:22.585534 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:22.585638 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:22.585638 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:22.585638 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:22.586476 master-0 kubenswrapper[8731]: I1205 12:45:22.585663 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:23.516366 master-0 kubenswrapper[8731]: I1205 12:45:23.516254 8731 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 12:45:23.585311 master-0 kubenswrapper[8731]: I1205 12:45:23.585215 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:23.585311 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:23.585311 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:23.585311 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:23.586501 master-0 kubenswrapper[8731]: I1205 12:45:23.585316 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:24.584932 master-0 kubenswrapper[8731]: I1205 12:45:24.584871 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:24.584932 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:24.584932 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:24.584932 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:24.585344 master-0 kubenswrapper[8731]: I1205 12:45:24.584938 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:25.585115 master-0 kubenswrapper[8731]: I1205 12:45:25.585024 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:25.585115 master-0 kubenswrapper[8731]: [-]has-synced 
failed: reason withheld Dec 05 12:45:25.585115 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:25.585115 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:25.585115 master-0 kubenswrapper[8731]: I1205 12:45:25.585099 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:26.584984 master-0 kubenswrapper[8731]: I1205 12:45:26.584866 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:26.584984 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:26.584984 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:26.584984 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:26.584984 master-0 kubenswrapper[8731]: I1205 12:45:26.584964 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:27.064648 master-0 kubenswrapper[8731]: E1205 12:45:27.064129 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Dec 05 12:45:27.190417 master-0 kubenswrapper[8731]: I1205 12:45:27.190347 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-n28z2_3b741029-0eb5-409b-b7f1-95e8385dc400/manager/1.log" Dec 05 12:45:27.191550 master-0 kubenswrapper[8731]: I1205 12:45:27.191483 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-n28z2_3b741029-0eb5-409b-b7f1-95e8385dc400/manager/0.log" Dec 05 12:45:27.192315 master-0 kubenswrapper[8731]: I1205 12:45:27.192253 8731 generic.go:334] "Generic (PLEG): container finished" podID="3b741029-0eb5-409b-b7f1-95e8385dc400" containerID="712a042e7fad33cd815c939ca364362f7220e1f1ce6096f34de7bd5630509fb8" exitCode=1 Dec 05 12:45:27.192368 master-0 kubenswrapper[8731]: I1205 12:45:27.192328 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" event={"ID":"3b741029-0eb5-409b-b7f1-95e8385dc400","Type":"ContainerDied","Data":"712a042e7fad33cd815c939ca364362f7220e1f1ce6096f34de7bd5630509fb8"} Dec 05 12:45:27.192436 master-0 kubenswrapper[8731]: I1205 12:45:27.192405 8731 scope.go:117] "RemoveContainer" containerID="73f6bfa12151c71020cd1cc8c48ebdf6c4c24dbf1a05b4873ce05f073bdcce94" Dec 05 12:45:27.193124 master-0 kubenswrapper[8731]: I1205 12:45:27.193070 8731 scope.go:117] "RemoveContainer" containerID="712a042e7fad33cd815c939ca364362f7220e1f1ce6096f34de7bd5630509fb8" Dec 05 12:45:27.193582 master-0 kubenswrapper[8731]: E1205 12:45:27.193528 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager 
pod=catalogd-controller-manager-7cc89f4c4c-n28z2_openshift-catalogd(3b741029-0eb5-409b-b7f1-95e8385dc400)\"" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" podUID="3b741029-0eb5-409b-b7f1-95e8385dc400" Dec 05 12:45:27.585559 master-0 kubenswrapper[8731]: I1205 12:45:27.585297 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:27.585559 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:27.585559 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:27.585559 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:27.585559 master-0 kubenswrapper[8731]: I1205 12:45:27.585433 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:28.203964 master-0 kubenswrapper[8731]: I1205 12:45:28.203870 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-n28z2_3b741029-0eb5-409b-b7f1-95e8385dc400/manager/1.log" Dec 05 12:45:28.585260 master-0 kubenswrapper[8731]: I1205 12:45:28.585200 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:28.585260 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:28.585260 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:28.585260 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:28.585638 master-0 kubenswrapper[8731]: I1205 12:45:28.585272 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:28.735600 master-0 kubenswrapper[8731]: I1205 12:45:28.735521 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:45:28.736544 master-0 kubenswrapper[8731]: I1205 12:45:28.736351 8731 scope.go:117] "RemoveContainer" containerID="712a042e7fad33cd815c939ca364362f7220e1f1ce6096f34de7bd5630509fb8" Dec 05 12:45:28.736732 master-0 kubenswrapper[8731]: E1205 12:45:28.736679 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=catalogd-controller-manager-7cc89f4c4c-n28z2_openshift-catalogd(3b741029-0eb5-409b-b7f1-95e8385dc400)\"" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" podUID="3b741029-0eb5-409b-b7f1-95e8385dc400" Dec 05 12:45:29.218279 master-0 kubenswrapper[8731]: I1205 12:45:29.218225 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm_dbe144b5-3b78-4946-bbf9-b825b0e47b07/config-sync-controllers/0.log" Dec 05 12:45:29.219809 master-0 kubenswrapper[8731]: I1205 12:45:29.219741 8731 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm_dbe144b5-3b78-4946-bbf9-b825b0e47b07/cluster-cloud-controller-manager/0.log" Dec 05 12:45:29.219923 master-0 kubenswrapper[8731]: I1205 12:45:29.219864 8731 generic.go:334] "Generic (PLEG): container finished" podID="dbe144b5-3b78-4946-bbf9-b825b0e47b07" containerID="88bb2ee05e17ca0ccc95842f8e427991824283668dc77c62b2a389be9423149d" exitCode=1 Dec 05 12:45:29.219975 master-0 kubenswrapper[8731]: I1205 12:45:29.219940 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" event={"ID":"dbe144b5-3b78-4946-bbf9-b825b0e47b07","Type":"ContainerDied","Data":"88bb2ee05e17ca0ccc95842f8e427991824283668dc77c62b2a389be9423149d"} Dec 05 12:45:29.221197 master-0 kubenswrapper[8731]: I1205 12:45:29.221126 8731 scope.go:117] "RemoveContainer" containerID="88bb2ee05e17ca0ccc95842f8e427991824283668dc77c62b2a389be9423149d" Dec 05 12:45:29.585067 master-0 kubenswrapper[8731]: I1205 12:45:29.584988 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:29.585067 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:29.585067 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:29.585067 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:29.585448 master-0 kubenswrapper[8731]: I1205 12:45:29.585088 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:30.230506 master-0 kubenswrapper[8731]: I1205 12:45:30.230459 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm_dbe144b5-3b78-4946-bbf9-b825b0e47b07/config-sync-controllers/0.log" Dec 05 12:45:30.231263 master-0 kubenswrapper[8731]: I1205 12:45:30.231109 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm_dbe144b5-3b78-4946-bbf9-b825b0e47b07/cluster-cloud-controller-manager/0.log" Dec 05 12:45:30.231263 master-0 kubenswrapper[8731]: I1205 12:45:30.231172 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" event={"ID":"dbe144b5-3b78-4946-bbf9-b825b0e47b07","Type":"ContainerStarted","Data":"a8d28b5a11dbbe0ce40e81abee57abc61672afe8fdac498f35db8f445d2e2f79"} Dec 05 12:45:30.586085 master-0 kubenswrapper[8731]: I1205 12:45:30.585883 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:30.586085 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:30.586085 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:30.586085 master-0 kubenswrapper[8731]: healthz check failed Dec 
05 12:45:30.586085 master-0 kubenswrapper[8731]: I1205 12:45:30.585984 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:31.585917 master-0 kubenswrapper[8731]: I1205 12:45:31.585809 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:31.585917 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:31.585917 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:31.585917 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:31.586994 master-0 kubenswrapper[8731]: I1205 12:45:31.585909 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:32.585700 master-0 kubenswrapper[8731]: I1205 12:45:32.585597 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:32.585700 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:32.585700 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:32.585700 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:32.586906 master-0 kubenswrapper[8731]: I1205 12:45:32.585715 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:33.516947 master-0 kubenswrapper[8731]: I1205 12:45:33.516830 8731 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 12:45:33.584882 master-0 kubenswrapper[8731]: I1205 12:45:33.584764 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:33.584882 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:33.584882 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:33.584882 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:33.584882 master-0 kubenswrapper[8731]: I1205 12:45:33.584880 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:34.586057 master-0 kubenswrapper[8731]: I1205 12:45:34.585963 8731 patch_prober.go:28] 
interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:34.586057 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:34.586057 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:34.586057 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:34.586698 master-0 kubenswrapper[8731]: I1205 12:45:34.586070 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:35.269209 master-0 kubenswrapper[8731]: I1205 12:45:35.269122 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-7cbd59c7f8-d9g7k_153fec1f-a10b-4c6c-a997-60fa80c13a86/manager/1.log" Dec 05 12:45:35.270795 master-0 kubenswrapper[8731]: I1205 12:45:35.270757 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-7cbd59c7f8-d9g7k_153fec1f-a10b-4c6c-a997-60fa80c13a86/manager/0.log" Dec 05 12:45:35.270867 master-0 kubenswrapper[8731]: I1205 12:45:35.270821 8731 generic.go:334] "Generic (PLEG): container finished" podID="153fec1f-a10b-4c6c-a997-60fa80c13a86" containerID="7b0e1392f4706a31c5e08db223b1244b230bf09a2ede6f19588e74a4a3860cf4" exitCode=1 Dec 05 12:45:35.270913 master-0 kubenswrapper[8731]: I1205 12:45:35.270861 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" event={"ID":"153fec1f-a10b-4c6c-a997-60fa80c13a86","Type":"ContainerDied","Data":"7b0e1392f4706a31c5e08db223b1244b230bf09a2ede6f19588e74a4a3860cf4"} Dec 05 12:45:35.270913 master-0 kubenswrapper[8731]: I1205 12:45:35.270907 8731 scope.go:117] "RemoveContainer" containerID="b02b74337c561023bb77d95397661e10a1ee5fc12d28b2fd7ee9556bbaba81e5" Dec 05 12:45:35.271836 master-0 kubenswrapper[8731]: I1205 12:45:35.271779 8731 scope.go:117] "RemoveContainer" containerID="7b0e1392f4706a31c5e08db223b1244b230bf09a2ede6f19588e74a4a3860cf4" Dec 05 12:45:35.272203 master-0 kubenswrapper[8731]: E1205 12:45:35.272127 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=operator-controller-controller-manager-7cbd59c7f8-d9g7k_openshift-operator-controller(153fec1f-a10b-4c6c-a997-60fa80c13a86)\"" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" podUID="153fec1f-a10b-4c6c-a997-60fa80c13a86" Dec 05 12:45:35.584805 master-0 kubenswrapper[8731]: I1205 12:45:35.584603 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:35.584805 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:35.584805 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:35.584805 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:35.584805 master-0 kubenswrapper[8731]: I1205 12:45:35.584694 8731 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:36.280722 master-0 kubenswrapper[8731]: I1205 12:45:36.280651 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-7cbd59c7f8-d9g7k_153fec1f-a10b-4c6c-a997-60fa80c13a86/manager/1.log" Dec 05 12:45:36.585756 master-0 kubenswrapper[8731]: I1205 12:45:36.585552 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:36.585756 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:36.585756 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:36.585756 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:36.585756 master-0 kubenswrapper[8731]: I1205 12:45:36.585645 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:37.585141 master-0 kubenswrapper[8731]: I1205 12:45:37.585033 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:37.585141 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:37.585141 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:37.585141 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:37.585141 master-0 kubenswrapper[8731]: I1205 12:45:37.585118 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:38.585479 master-0 kubenswrapper[8731]: I1205 12:45:38.585340 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:38.585479 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:38.585479 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:38.585479 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:38.585479 master-0 kubenswrapper[8731]: I1205 12:45:38.585428 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:38.735706 master-0 kubenswrapper[8731]: I1205 12:45:38.735604 8731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:45:38.737063 master-0 kubenswrapper[8731]: I1205 12:45:38.736997 8731 scope.go:117] "RemoveContainer" 
containerID="712a042e7fad33cd815c939ca364362f7220e1f1ce6096f34de7bd5630509fb8" Dec 05 12:45:39.306222 master-0 kubenswrapper[8731]: I1205 12:45:39.306115 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-n28z2_3b741029-0eb5-409b-b7f1-95e8385dc400/manager/1.log" Dec 05 12:45:39.306634 master-0 kubenswrapper[8731]: I1205 12:45:39.306582 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" event={"ID":"3b741029-0eb5-409b-b7f1-95e8385dc400","Type":"ContainerStarted","Data":"46664a4ee70d50343e807b0abdcc4556bf4a4ccc60c19f6748f2ddf921020853"} Dec 05 12:45:39.306904 master-0 kubenswrapper[8731]: I1205 12:45:39.306850 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:45:39.585021 master-0 kubenswrapper[8731]: I1205 12:45:39.584894 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:39.585021 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:39.585021 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:39.585021 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:39.585021 master-0 kubenswrapper[8731]: I1205 12:45:39.584978 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:40.584004 master-0 kubenswrapper[8731]: I1205 12:45:40.583932 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:40.584004 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:40.584004 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:40.584004 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:40.584004 master-0 kubenswrapper[8731]: I1205 12:45:40.583998 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:41.329617 master-0 kubenswrapper[8731]: I1205 12:45:41.329542 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/3.log" Dec 05 12:45:41.330028 master-0 kubenswrapper[8731]: I1205 12:45:41.329988 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/2.log" Dec 05 12:45:41.330098 master-0 kubenswrapper[8731]: I1205 12:45:41.330044 8731 generic.go:334] "Generic (PLEG): container finished" podID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" containerID="9777a8bc6b6304e63d985ec731f3cc644371b2e8a1b12c0286fe36fe4b312701" exitCode=1 Dec 05 12:45:41.330098 master-0 
kubenswrapper[8731]: I1205 12:45:41.330085 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" event={"ID":"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8","Type":"ContainerDied","Data":"9777a8bc6b6304e63d985ec731f3cc644371b2e8a1b12c0286fe36fe4b312701"} Dec 05 12:45:41.330269 master-0 kubenswrapper[8731]: I1205 12:45:41.330137 8731 scope.go:117] "RemoveContainer" containerID="142cac7db86d510a3cb1fe121b732aea43f370fa9eb0fe98a9655b028e584160" Dec 05 12:45:41.331146 master-0 kubenswrapper[8731]: I1205 12:45:41.331094 8731 scope.go:117] "RemoveContainer" containerID="9777a8bc6b6304e63d985ec731f3cc644371b2e8a1b12c0286fe36fe4b312701" Dec 05 12:45:41.331956 master-0 kubenswrapper[8731]: E1205 12:45:41.331431 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-7r5wv_openshift-cluster-storage-operator(b9623eb8-55d2-4c5c-aa8d-74b6a27274d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" podUID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" Dec 05 12:45:41.585859 master-0 kubenswrapper[8731]: I1205 12:45:41.585675 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:41.585859 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:41.585859 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:41.585859 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:41.585859 master-0 kubenswrapper[8731]: I1205 12:45:41.585829 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:42.339932 master-0 kubenswrapper[8731]: I1205 12:45:42.339815 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/3.log" Dec 05 12:45:42.584836 master-0 kubenswrapper[8731]: I1205 12:45:42.584772 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:42.584836 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:42.584836 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:42.584836 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:42.585227 master-0 kubenswrapper[8731]: I1205 12:45:42.584867 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:43.305945 master-0 kubenswrapper[8731]: I1205 12:45:43.305881 8731 status_manager.go:851] "Failed to get status for pod" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" err="the server was 
unable to return a response in the time allotted, but may still be processing the request (get pods router-default-5465c8b4db-dzlmb)" Dec 05 12:45:43.466315 master-0 kubenswrapper[8731]: E1205 12:45:43.466198 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 05 12:45:43.516772 master-0 kubenswrapper[8731]: I1205 12:45:43.516667 8731 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 12:45:43.516772 master-0 kubenswrapper[8731]: I1205 12:45:43.516784 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:45:43.517560 master-0 kubenswrapper[8731]: I1205 12:45:43.517490 8731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 05 12:45:43.517654 master-0 kubenswrapper[8731]: I1205 12:45:43.517584 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" containerID="cri-o://18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" gracePeriod=30 Dec 05 12:45:43.586306 master-0 kubenswrapper[8731]: I1205 12:45:43.585161 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:43.586306 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:43.586306 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:43.586306 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:43.586306 master-0 kubenswrapper[8731]: I1205 12:45:43.585263 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:43.635466 master-0 kubenswrapper[8731]: E1205 12:45:43.635416 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:45:44.340380 master-0 kubenswrapper[8731]: I1205 12:45:44.340308 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:45:44.341114 master-0 kubenswrapper[8731]: I1205 12:45:44.341078 8731 scope.go:117] "RemoveContainer" containerID="7b0e1392f4706a31c5e08db223b1244b230bf09a2ede6f19588e74a4a3860cf4" Dec 05 12:45:44.341511 master-0 kubenswrapper[8731]: E1205 12:45:44.341404 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=operator-controller-controller-manager-7cbd59c7f8-d9g7k_openshift-operator-controller(153fec1f-a10b-4c6c-a997-60fa80c13a86)\"" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" podUID="153fec1f-a10b-4c6c-a997-60fa80c13a86" Dec 05 12:45:44.358537 master-0 kubenswrapper[8731]: I1205 12:45:44.358450 8731 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" exitCode=2 Dec 05 12:45:44.358537 master-0 kubenswrapper[8731]: I1205 12:45:44.358537 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerDied","Data":"18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1"} Dec 05 12:45:44.358900 master-0 kubenswrapper[8731]: I1205 12:45:44.358599 8731 scope.go:117] "RemoveContainer" containerID="cc5f5346417c97786dab5fab35e6b2b1d263681d0252f5454c34842f718cd60f" Dec 05 12:45:44.359439 master-0 kubenswrapper[8731]: I1205 12:45:44.359390 8731 scope.go:117] "RemoveContainer" containerID="18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" Dec 05 12:45:44.359919 master-0 kubenswrapper[8731]: E1205 12:45:44.359852 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:45:44.585246 master-0 kubenswrapper[8731]: I1205 12:45:44.585151 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:44.585246 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:44.585246 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:44.585246 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:44.585622 master-0 kubenswrapper[8731]: I1205 12:45:44.585274 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:45.038512 master-0 kubenswrapper[8731]: E1205 12:45:45.038316 8731 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187e521791a26229 kube-system 10281 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:5e09e2af7200e6f9be469dbfd9bb1127,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:BackOff,Message:Back-off restarting failed container kube-scheduler in pod bootstrap-kube-scheduler-master-0_kube-system(5e09e2af7200e6f9be469dbfd9bb1127),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:39:25 +0000 UTC,LastTimestamp:2025-12-05 12:43:45.317143708 +0000 UTC m=+723.621127875,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:45:45.585494 master-0 kubenswrapper[8731]: I1205 12:45:45.585344 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:45.585494 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:45.585494 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:45.585494 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:45.586514 master-0 kubenswrapper[8731]: I1205 12:45:45.585507 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:46.584979 master-0 kubenswrapper[8731]: I1205 12:45:46.584910 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:46.584979 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:46.584979 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:46.584979 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:46.585427 master-0 kubenswrapper[8731]: I1205 12:45:46.585000 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:47.586414 master-0 kubenswrapper[8731]: I1205 12:45:47.586256 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:47.586414 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:47.586414 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:47.586414 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:47.586414 master-0 kubenswrapper[8731]: I1205 12:45:47.586386 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:48.583811 master-0 kubenswrapper[8731]: I1205 12:45:48.583700 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:48.583811 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:48.583811 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:48.583811 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:48.583811 master-0 kubenswrapper[8731]: I1205 12:45:48.583812 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:48.740070 master-0 kubenswrapper[8731]: I1205 12:45:48.739990 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:45:49.563883 master-0 kubenswrapper[8731]: E1205 12:45:49.563778 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:45:39Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:45:39Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:45:39Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:45:39Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:45:49.585472 master-0 kubenswrapper[8731]: I1205 12:45:49.585399 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:49.585472 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:49.585472 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:49.585472 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:49.585862 master-0 kubenswrapper[8731]: I1205 12:45:49.585521 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:50.586114 master-0 kubenswrapper[8731]: I1205 12:45:50.586031 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:50.586114 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld 
Dec 05 12:45:50.586114 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:50.586114 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:50.586919 master-0 kubenswrapper[8731]: I1205 12:45:50.586129 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:51.097946 master-0 kubenswrapper[8731]: E1205 12:45:51.097871 8731 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 05 12:45:51.420032 master-0 kubenswrapper[8731]: I1205 12:45:51.419974 8731 generic.go:334] "Generic (PLEG): container finished" podID="c39d2089-d3bf-4556-b6ef-c362a08c21a2" containerID="fa4f02496398ccdc5c55acbb60e75e3c69d9850820e087e65cbe9d00bf63d07e" exitCode=0 Dec 05 12:45:51.420307 master-0 kubenswrapper[8731]: I1205 12:45:51.420079 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" event={"ID":"c39d2089-d3bf-4556-b6ef-c362a08c21a2","Type":"ContainerDied","Data":"fa4f02496398ccdc5c55acbb60e75e3c69d9850820e087e65cbe9d00bf63d07e"} Dec 05 12:45:51.420910 master-0 kubenswrapper[8731]: I1205 12:45:51.420877 8731 scope.go:117] "RemoveContainer" containerID="fa4f02496398ccdc5c55acbb60e75e3c69d9850820e087e65cbe9d00bf63d07e" Dec 05 12:45:51.423905 master-0 kubenswrapper[8731]: I1205 12:45:51.423861 8731 generic.go:334] "Generic (PLEG): container finished" podID="58d12e893528ad53a994f10901a644ea" containerID="2a8e7f38b128627544fcfe08f2d0eef9ae364770a9037f3dac3761d553a8ed98" exitCode=0 Dec 05 12:45:51.423977 master-0 kubenswrapper[8731]: I1205 12:45:51.423934 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerDied","Data":"2a8e7f38b128627544fcfe08f2d0eef9ae364770a9037f3dac3761d553a8ed98"} Dec 05 12:45:51.424367 master-0 kubenswrapper[8731]: I1205 12:45:51.424342 8731 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="fde55ae0-2a24-4980-9ad8-db1079735b66" Dec 05 12:45:51.424367 master-0 kubenswrapper[8731]: I1205 12:45:51.424363 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="fde55ae0-2a24-4980-9ad8-db1079735b66" Dec 05 12:45:51.427121 master-0 kubenswrapper[8731]: I1205 12:45:51.427088 8731 generic.go:334] "Generic (PLEG): container finished" podID="a757f807-e1bf-4f1e-9787-6b4acc8d09cf" containerID="11b2f7447856caa8c6cb51432e0d7392e86f64482a9fae5b398a57d71719f20e" exitCode=0 Dec 05 12:45:51.427224 master-0 kubenswrapper[8731]: I1205 12:45:51.427122 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" event={"ID":"a757f807-e1bf-4f1e-9787-6b4acc8d09cf","Type":"ContainerDied","Data":"11b2f7447856caa8c6cb51432e0d7392e86f64482a9fae5b398a57d71719f20e"} Dec 05 12:45:51.427224 master-0 kubenswrapper[8731]: I1205 12:45:51.427155 8731 scope.go:117] "RemoveContainer" containerID="c63a8034e23c88dd09173f57e05eee7c9bc26e35890cfdd9f1fdc8ef0e16d843" Dec 05 12:45:51.427713 master-0 kubenswrapper[8731]: I1205 12:45:51.427679 8731 scope.go:117] "RemoveContainer" containerID="11b2f7447856caa8c6cb51432e0d7392e86f64482a9fae5b398a57d71719f20e" Dec 05 12:45:51.427964 
master-0 kubenswrapper[8731]: E1205 12:45:51.427930 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-cluster-manager pod=ovnkube-control-plane-5df5548d54-7tvfb_openshift-ovn-kubernetes(a757f807-e1bf-4f1e-9787-6b4acc8d09cf)\"" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" podUID="a757f807-e1bf-4f1e-9787-6b4acc8d09cf" Dec 05 12:45:51.460756 master-0 kubenswrapper[8731]: I1205 12:45:51.460693 8731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:45:51.461850 master-0 kubenswrapper[8731]: I1205 12:45:51.461796 8731 scope.go:117] "RemoveContainer" containerID="18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" Dec 05 12:45:51.462403 master-0 kubenswrapper[8731]: E1205 12:45:51.462329 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:45:51.511647 master-0 kubenswrapper[8731]: E1205 12:45:51.511579 8731 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc39d2089_d3bf_4556_b6ef_c362a08c21a2.slice/crio-fa4f02496398ccdc5c55acbb60e75e3c69d9850820e087e65cbe9d00bf63d07e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc39d2089_d3bf_4556_b6ef_c362a08c21a2.slice/crio-conmon-fa4f02496398ccdc5c55acbb60e75e3c69d9850820e087e65cbe9d00bf63d07e.scope\": RecentStats: unable to find data in memory cache]" Dec 05 12:45:51.584990 master-0 kubenswrapper[8731]: I1205 12:45:51.584919 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:51.584990 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:51.584990 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:51.584990 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:51.585524 master-0 kubenswrapper[8731]: I1205 12:45:51.585482 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:52.435029 master-0 kubenswrapper[8731]: I1205 12:45:52.434979 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" event={"ID":"c39d2089-d3bf-4556-b6ef-c362a08c21a2","Type":"ContainerStarted","Data":"712d1506cd51ade164ab750d49a330bfc3046901061c8f5155346b2e5325a1c2"} Dec 05 12:45:52.435588 master-0 kubenswrapper[8731]: I1205 12:45:52.435370 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:45:52.440694 master-0 
kubenswrapper[8731]: I1205 12:45:52.440642 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:45:52.584249 master-0 kubenswrapper[8731]: I1205 12:45:52.584177 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:52.584249 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:52.584249 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:52.584249 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:52.584575 master-0 kubenswrapper[8731]: I1205 12:45:52.584267 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:53.446898 master-0 kubenswrapper[8731]: I1205 12:45:53.446833 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-74d9cbffbc-r7kbd_db27bee9-3d33-4c4a-b38b-72f7cec77c7a/machine-approver-controller/0.log" Dec 05 12:45:53.448372 master-0 kubenswrapper[8731]: I1205 12:45:53.448285 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" event={"ID":"db27bee9-3d33-4c4a-b38b-72f7cec77c7a","Type":"ContainerDied","Data":"e4b5bdd189732f9903e53555a7a61c0d10d37cd90596a4c760274c5cce948d5d"} Dec 05 12:45:53.448372 master-0 kubenswrapper[8731]: I1205 12:45:53.448324 8731 generic.go:334] "Generic (PLEG): container finished" podID="db27bee9-3d33-4c4a-b38b-72f7cec77c7a" containerID="e4b5bdd189732f9903e53555a7a61c0d10d37cd90596a4c760274c5cce948d5d" exitCode=255 Dec 05 12:45:53.448892 master-0 kubenswrapper[8731]: I1205 12:45:53.448854 8731 scope.go:117] "RemoveContainer" containerID="e4b5bdd189732f9903e53555a7a61c0d10d37cd90596a4c760274c5cce948d5d" Dec 05 12:45:53.584371 master-0 kubenswrapper[8731]: I1205 12:45:53.584229 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:53.584371 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:53.584371 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:53.584371 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:53.584371 master-0 kubenswrapper[8731]: I1205 12:45:53.584304 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:53.934873 master-0 kubenswrapper[8731]: I1205 12:45:53.934694 8731 scope.go:117] "RemoveContainer" containerID="9777a8bc6b6304e63d985ec731f3cc644371b2e8a1b12c0286fe36fe4b312701" Dec 05 12:45:53.935222 master-0 kubenswrapper[8731]: E1205 12:45:53.934939 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller 
pod=csi-snapshot-controller-6b958b6f94-7r5wv_openshift-cluster-storage-operator(b9623eb8-55d2-4c5c-aa8d-74b6a27274d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" podUID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" Dec 05 12:45:54.341456 master-0 kubenswrapper[8731]: I1205 12:45:54.341362 8731 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:45:54.342115 master-0 kubenswrapper[8731]: I1205 12:45:54.342077 8731 scope.go:117] "RemoveContainer" containerID="7b0e1392f4706a31c5e08db223b1244b230bf09a2ede6f19588e74a4a3860cf4" Dec 05 12:45:54.460757 master-0 kubenswrapper[8731]: I1205 12:45:54.460612 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-74d9cbffbc-r7kbd_db27bee9-3d33-4c4a-b38b-72f7cec77c7a/machine-approver-controller/0.log" Dec 05 12:45:54.461444 master-0 kubenswrapper[8731]: I1205 12:45:54.461399 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" event={"ID":"db27bee9-3d33-4c4a-b38b-72f7cec77c7a","Type":"ContainerStarted","Data":"fb6a19c9ac322ac2e35728dc364dd53920d781096971f1443cf23fd5196c363e"} Dec 05 12:45:54.584488 master-0 kubenswrapper[8731]: I1205 12:45:54.584429 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:54.584488 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:54.584488 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:54.584488 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:54.584826 master-0 kubenswrapper[8731]: I1205 12:45:54.584495 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:55.472313 master-0 kubenswrapper[8731]: I1205 12:45:55.472170 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-7cbd59c7f8-d9g7k_153fec1f-a10b-4c6c-a997-60fa80c13a86/manager/1.log" Dec 05 12:45:55.473074 master-0 kubenswrapper[8731]: I1205 12:45:55.472940 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" event={"ID":"153fec1f-a10b-4c6c-a997-60fa80c13a86","Type":"ContainerStarted","Data":"ba9a9971d6a0e8e47787750aed5178bf0427946fa6537ae74aff0ff8a94d2c5c"} Dec 05 12:45:55.473643 master-0 kubenswrapper[8731]: I1205 12:45:55.473568 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:45:55.585570 master-0 kubenswrapper[8731]: I1205 12:45:55.585497 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:55.585570 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:55.585570 master-0 
kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:55.585570 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:55.585913 master-0 kubenswrapper[8731]: I1205 12:45:55.585592 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:56.586521 master-0 kubenswrapper[8731]: I1205 12:45:56.586430 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:56.586521 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:56.586521 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:56.586521 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:56.587132 master-0 kubenswrapper[8731]: I1205 12:45:56.586543 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:57.585725 master-0 kubenswrapper[8731]: I1205 12:45:57.585580 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:57.585725 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:57.585725 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:57.585725 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:57.586273 master-0 kubenswrapper[8731]: I1205 12:45:57.585715 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:58.586023 master-0 kubenswrapper[8731]: I1205 12:45:58.585907 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:58.586023 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:58.586023 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:58.586023 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:58.586880 master-0 kubenswrapper[8731]: I1205 12:45:58.586052 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:45:59.564895 master-0 kubenswrapper[8731]: E1205 12:45:59.564796 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:45:59.585249 master-0 kubenswrapper[8731]: I1205 12:45:59.585141 8731 
patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:45:59.585249 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:45:59.585249 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:45:59.585249 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:45:59.585663 master-0 kubenswrapper[8731]: I1205 12:45:59.585277 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:00.469165 master-0 kubenswrapper[8731]: E1205 12:46:00.468816 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 05 12:46:00.585318 master-0 kubenswrapper[8731]: I1205 12:46:00.585056 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:00.585318 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:00.585318 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:00.585318 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:00.585318 master-0 kubenswrapper[8731]: I1205 12:46:00.585163 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:01.585671 master-0 kubenswrapper[8731]: I1205 12:46:01.585576 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:01.585671 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:01.585671 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:01.585671 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:01.586961 master-0 kubenswrapper[8731]: I1205 12:46:01.585720 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:01.934764 master-0 kubenswrapper[8731]: I1205 12:46:01.934573 8731 scope.go:117] "RemoveContainer" containerID="11b2f7447856caa8c6cb51432e0d7392e86f64482a9fae5b398a57d71719f20e" Dec 05 12:46:02.534726 master-0 kubenswrapper[8731]: I1205 12:46:02.534659 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" event={"ID":"a757f807-e1bf-4f1e-9787-6b4acc8d09cf","Type":"ContainerStarted","Data":"c3f1ecb329d73d055806e0a97968047a0f0996cc11b92a4b13a31f4dd631d1b9"} Dec 05 12:46:02.585300 master-0 
kubenswrapper[8731]: I1205 12:46:02.585227 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:02.585300 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:02.585300 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:02.585300 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:02.585662 master-0 kubenswrapper[8731]: I1205 12:46:02.585342 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:02.935168 master-0 kubenswrapper[8731]: I1205 12:46:02.935045 8731 scope.go:117] "RemoveContainer" containerID="18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" Dec 05 12:46:02.935726 master-0 kubenswrapper[8731]: E1205 12:46:02.935328 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:46:03.584365 master-0 kubenswrapper[8731]: I1205 12:46:03.584311 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:03.584365 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:03.584365 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:03.584365 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:03.584793 master-0 kubenswrapper[8731]: I1205 12:46:03.584764 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:04.345785 master-0 kubenswrapper[8731]: I1205 12:46:04.345692 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:46:04.585032 master-0 kubenswrapper[8731]: I1205 12:46:04.584947 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:04.585032 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:04.585032 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:04.585032 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:04.585441 master-0 kubenswrapper[8731]: I1205 12:46:04.585053 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Dec 05 12:46:05.585264 master-0 kubenswrapper[8731]: I1205 12:46:05.585154 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:05.585264 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:05.585264 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:05.585264 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:05.586272 master-0 kubenswrapper[8731]: I1205 12:46:05.585279 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:05.935637 master-0 kubenswrapper[8731]: I1205 12:46:05.935432 8731 scope.go:117] "RemoveContainer" containerID="9777a8bc6b6304e63d985ec731f3cc644371b2e8a1b12c0286fe36fe4b312701" Dec 05 12:46:05.935965 master-0 kubenswrapper[8731]: E1205 12:46:05.935705 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-7r5wv_openshift-cluster-storage-operator(b9623eb8-55d2-4c5c-aa8d-74b6a27274d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" podUID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" Dec 05 12:46:06.584557 master-0 kubenswrapper[8731]: I1205 12:46:06.584465 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:06.584557 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:06.584557 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:06.584557 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:06.584924 master-0 kubenswrapper[8731]: I1205 12:46:06.584594 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:07.584004 master-0 kubenswrapper[8731]: I1205 12:46:07.583903 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:07.584004 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:07.584004 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:07.584004 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:07.584004 master-0 kubenswrapper[8731]: I1205 12:46:07.583996 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:08.575208 master-0 kubenswrapper[8731]: I1205 12:46:08.575114 8731 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-67477646d4-9vfxw_0dda6d9b-cb3a-413a-85af-ef08f15ea42e/package-server-manager/0.log" Dec 05 12:46:08.575857 master-0 kubenswrapper[8731]: I1205 12:46:08.575805 8731 generic.go:334] "Generic (PLEG): container finished" podID="0dda6d9b-cb3a-413a-85af-ef08f15ea42e" containerID="494ce5c3826b0b94b974fd41d16b7ba6517fb0d007e27462a2d7b33e01aa4443" exitCode=1 Dec 05 12:46:08.576070 master-0 kubenswrapper[8731]: I1205 12:46:08.575874 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" event={"ID":"0dda6d9b-cb3a-413a-85af-ef08f15ea42e","Type":"ContainerDied","Data":"494ce5c3826b0b94b974fd41d16b7ba6517fb0d007e27462a2d7b33e01aa4443"} Dec 05 12:46:08.577233 master-0 kubenswrapper[8731]: I1205 12:46:08.577200 8731 scope.go:117] "RemoveContainer" containerID="494ce5c3826b0b94b974fd41d16b7ba6517fb0d007e27462a2d7b33e01aa4443" Dec 05 12:46:08.584767 master-0 kubenswrapper[8731]: I1205 12:46:08.584702 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:08.584767 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:08.584767 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:08.584767 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:08.585614 master-0 kubenswrapper[8731]: I1205 12:46:08.585574 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:09.565834 master-0 kubenswrapper[8731]: E1205 12:46:09.565747 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:46:09.584622 master-0 kubenswrapper[8731]: I1205 12:46:09.584515 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:09.584622 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:09.584622 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:09.584622 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:09.585331 master-0 kubenswrapper[8731]: I1205 12:46:09.584636 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:09.585525 master-0 kubenswrapper[8731]: I1205 12:46:09.585485 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-67477646d4-9vfxw_0dda6d9b-cb3a-413a-85af-ef08f15ea42e/package-server-manager/0.log" Dec 05 12:46:09.586023 master-0 kubenswrapper[8731]: I1205 12:46:09.585974 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" event={"ID":"0dda6d9b-cb3a-413a-85af-ef08f15ea42e","Type":"ContainerStarted","Data":"b34007475640228397f904792caa70766119deff9ff9ac4f7b367d7746a11f97"} Dec 05 12:46:09.586409 master-0 kubenswrapper[8731]: I1205 12:46:09.586363 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:46:10.586557 master-0 kubenswrapper[8731]: I1205 12:46:10.584947 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:10.586557 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:10.586557 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:10.586557 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:10.586557 master-0 kubenswrapper[8731]: I1205 12:46:10.585040 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:11.584725 master-0 kubenswrapper[8731]: I1205 12:46:11.584628 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:11.584725 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:11.584725 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:11.584725 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:11.585283 master-0 kubenswrapper[8731]: I1205 12:46:11.584757 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:12.584731 master-0 kubenswrapper[8731]: I1205 12:46:12.584661 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:12.584731 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:12.584731 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:12.584731 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:12.585384 master-0 kubenswrapper[8731]: I1205 12:46:12.584747 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:13.584781 master-0 kubenswrapper[8731]: I1205 12:46:13.584722 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:13.584781 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 
12:46:13.584781 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:13.584781 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:13.586060 master-0 kubenswrapper[8731]: I1205 12:46:13.584796 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:13.935551 master-0 kubenswrapper[8731]: I1205 12:46:13.935459 8731 scope.go:117] "RemoveContainer" containerID="18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" Dec 05 12:46:13.936064 master-0 kubenswrapper[8731]: E1205 12:46:13.935964 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:46:14.586208 master-0 kubenswrapper[8731]: I1205 12:46:14.586083 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:14.586208 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:14.586208 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:14.586208 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:14.587407 master-0 kubenswrapper[8731]: I1205 12:46:14.586218 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:14.619942 master-0 kubenswrapper[8731]: I1205 12:46:14.619882 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-7df95c79b5-ldg5j_531b8927-92db-4e9d-9a0a-12ff948cdaad/control-plane-machine-set-operator/0.log" Dec 05 12:46:14.619942 master-0 kubenswrapper[8731]: I1205 12:46:14.619944 8731 generic.go:334] "Generic (PLEG): container finished" podID="531b8927-92db-4e9d-9a0a-12ff948cdaad" containerID="5227d615ebfc1e16e53996d356380f47ad9e8fd55349d0658112ccb54f8ef1bb" exitCode=1 Dec 05 12:46:14.620172 master-0 kubenswrapper[8731]: I1205 12:46:14.620003 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" event={"ID":"531b8927-92db-4e9d-9a0a-12ff948cdaad","Type":"ContainerDied","Data":"5227d615ebfc1e16e53996d356380f47ad9e8fd55349d0658112ccb54f8ef1bb"} Dec 05 12:46:14.620566 master-0 kubenswrapper[8731]: I1205 12:46:14.620520 8731 scope.go:117] "RemoveContainer" containerID="5227d615ebfc1e16e53996d356380f47ad9e8fd55349d0658112ccb54f8ef1bb" Dec 05 12:46:14.622488 master-0 kubenswrapper[8731]: I1205 12:46:14.622405 8731 generic.go:334] "Generic (PLEG): container finished" podID="38941513-e968-45f1-9cb2-b63d40338f36" containerID="418f3d79b0988ff7f7ba36537b8459867264703e9c8f702bbd93e4dee2835882" exitCode=0 Dec 05 12:46:14.622598 master-0 kubenswrapper[8731]: I1205 12:46:14.622497 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" event={"ID":"38941513-e968-45f1-9cb2-b63d40338f36","Type":"ContainerDied","Data":"418f3d79b0988ff7f7ba36537b8459867264703e9c8f702bbd93e4dee2835882"} Dec 05 12:46:14.623294 master-0 kubenswrapper[8731]: I1205 12:46:14.623266 8731 scope.go:117] "RemoveContainer" containerID="418f3d79b0988ff7f7ba36537b8459867264703e9c8f702bbd93e4dee2835882" Dec 05 12:46:15.586171 master-0 kubenswrapper[8731]: I1205 12:46:15.586050 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:15.586171 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:15.586171 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:15.586171 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:15.586171 master-0 kubenswrapper[8731]: I1205 12:46:15.586163 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:15.632461 master-0 kubenswrapper[8731]: I1205 12:46:15.632391 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-85cff47f46-p9xtc_a2acba71-b9dc-4b85-be35-c995b8be2f19/cluster-node-tuning-operator/0.log" Dec 05 12:46:15.632461 master-0 kubenswrapper[8731]: I1205 12:46:15.632460 8731 generic.go:334] "Generic (PLEG): container finished" podID="a2acba71-b9dc-4b85-be35-c995b8be2f19" containerID="0ef8d356dac19c1922c065326d7809108046e5a2cd059d5d50b5229acd7007ec" exitCode=1 Dec 05 12:46:15.632820 master-0 kubenswrapper[8731]: I1205 12:46:15.632532 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" event={"ID":"a2acba71-b9dc-4b85-be35-c995b8be2f19","Type":"ContainerDied","Data":"0ef8d356dac19c1922c065326d7809108046e5a2cd059d5d50b5229acd7007ec"} Dec 05 12:46:15.633404 master-0 kubenswrapper[8731]: I1205 12:46:15.633053 8731 scope.go:117] "RemoveContainer" containerID="0ef8d356dac19c1922c065326d7809108046e5a2cd059d5d50b5229acd7007ec" Dec 05 12:46:15.635948 master-0 kubenswrapper[8731]: I1205 12:46:15.635875 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-5xg2k_a280c582-685e-47ac-bf6b-248aa0c129a9/cluster-baremetal-operator/0.log" Dec 05 12:46:15.635948 master-0 kubenswrapper[8731]: I1205 12:46:15.635933 8731 generic.go:334] "Generic (PLEG): container finished" podID="a280c582-685e-47ac-bf6b-248aa0c129a9" containerID="502462f2915d6fff82c1a557ec2a9e24c7fbeef3a6daff0dad2cf5862df79899" exitCode=1 Dec 05 12:46:15.636135 master-0 kubenswrapper[8731]: I1205 12:46:15.635994 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" event={"ID":"a280c582-685e-47ac-bf6b-248aa0c129a9","Type":"ContainerDied","Data":"502462f2915d6fff82c1a557ec2a9e24c7fbeef3a6daff0dad2cf5862df79899"} Dec 05 12:46:15.636420 master-0 kubenswrapper[8731]: I1205 12:46:15.636375 8731 scope.go:117] "RemoveContainer" containerID="502462f2915d6fff82c1a557ec2a9e24c7fbeef3a6daff0dad2cf5862df79899" Dec 05 12:46:15.639975 master-0 
kubenswrapper[8731]: I1205 12:46:15.639934 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-7df95c79b5-ldg5j_531b8927-92db-4e9d-9a0a-12ff948cdaad/control-plane-machine-set-operator/0.log" Dec 05 12:46:15.640124 master-0 kubenswrapper[8731]: I1205 12:46:15.640050 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" event={"ID":"531b8927-92db-4e9d-9a0a-12ff948cdaad","Type":"ContainerStarted","Data":"a1e1f964f61db578543e8bda36d3c26eb06dbcb3659a952a96708304cb1ba2a9"} Dec 05 12:46:15.643331 master-0 kubenswrapper[8731]: I1205 12:46:15.643276 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" event={"ID":"38941513-e968-45f1-9cb2-b63d40338f36","Type":"ContainerStarted","Data":"422041b3d2323dfdeb50d410c114367777627894d6f0b8ffccb3a7e50a46157a"} Dec 05 12:46:16.585838 master-0 kubenswrapper[8731]: I1205 12:46:16.585724 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:16.585838 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:16.585838 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:16.585838 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:16.585838 master-0 kubenswrapper[8731]: I1205 12:46:16.585830 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:16.653172 master-0 kubenswrapper[8731]: I1205 12:46:16.653099 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-5xg2k_a280c582-685e-47ac-bf6b-248aa0c129a9/cluster-baremetal-operator/0.log" Dec 05 12:46:16.653920 master-0 kubenswrapper[8731]: I1205 12:46:16.653279 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" event={"ID":"a280c582-685e-47ac-bf6b-248aa0c129a9","Type":"ContainerStarted","Data":"fb030ad34b9342fc42a80c2fdf5d7deaefdc07aa0ffbb47e24246b631e76fcfa"} Dec 05 12:46:16.656815 master-0 kubenswrapper[8731]: I1205 12:46:16.656752 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-85cff47f46-p9xtc_a2acba71-b9dc-4b85-be35-c995b8be2f19/cluster-node-tuning-operator/0.log" Dec 05 12:46:16.656886 master-0 kubenswrapper[8731]: I1205 12:46:16.656833 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" event={"ID":"a2acba71-b9dc-4b85-be35-c995b8be2f19","Type":"ContainerStarted","Data":"0e7509a9d39d3092ec9ec8e1b908f1fa3448275694e027b1ba9f70fa93878312"} Dec 05 12:46:16.935066 master-0 kubenswrapper[8731]: I1205 12:46:16.934924 8731 scope.go:117] "RemoveContainer" containerID="9777a8bc6b6304e63d985ec731f3cc644371b2e8a1b12c0286fe36fe4b312701" Dec 05 12:46:16.935701 master-0 kubenswrapper[8731]: E1205 12:46:16.935666 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-7r5wv_openshift-cluster-storage-operator(b9623eb8-55d2-4c5c-aa8d-74b6a27274d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" podUID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" Dec 05 12:46:17.470560 master-0 kubenswrapper[8731]: E1205 12:46:17.470425 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 05 12:46:17.585433 master-0 kubenswrapper[8731]: I1205 12:46:17.585374 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:17.585433 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:17.585433 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:17.585433 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:17.586038 master-0 kubenswrapper[8731]: I1205 12:46:17.585959 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:18.584656 master-0 kubenswrapper[8731]: I1205 12:46:18.584557 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:18.584656 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:18.584656 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:18.584656 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:18.584656 master-0 kubenswrapper[8731]: I1205 12:46:18.584652 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:19.042163 master-0 kubenswrapper[8731]: E1205 12:46:19.041997 8731 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187e521791a26229 kube-system 10281 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:5e09e2af7200e6f9be469dbfd9bb1127,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:BackOff,Message:Back-off restarting failed container kube-scheduler in pod bootstrap-kube-scheduler-master-0_kube-system(5e09e2af7200e6f9be469dbfd9bb1127),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:39:25 +0000 UTC,LastTimestamp:2025-12-05 12:43:58.93533945 +0000 UTC m=+737.239323617,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:46:19.566980 master-0 kubenswrapper[8731]: E1205 12:46:19.566904 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:46:19.586040 master-0 kubenswrapper[8731]: I1205 12:46:19.585958 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:19.586040 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:19.586040 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:19.586040 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:19.586955 master-0 kubenswrapper[8731]: I1205 12:46:19.586041 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:20.585414 master-0 kubenswrapper[8731]: I1205 12:46:20.585254 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:20.585414 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:20.585414 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:20.585414 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:20.586019 master-0 kubenswrapper[8731]: I1205 12:46:20.585407 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:21.586049 master-0 kubenswrapper[8731]: I1205 12:46:21.585907 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:21.586049 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:21.586049 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:21.586049 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:21.586049 master-0 kubenswrapper[8731]: I1205 12:46:21.586019 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:22.586645 master-0 kubenswrapper[8731]: I1205 12:46:22.586547 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:22.586645 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:22.586645 
master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:22.586645 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:22.587136 master-0 kubenswrapper[8731]: I1205 12:46:22.586662 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:23.584442 master-0 kubenswrapper[8731]: I1205 12:46:23.584373 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:23.584442 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:23.584442 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:23.584442 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:23.584877 master-0 kubenswrapper[8731]: I1205 12:46:23.584454 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:24.585100 master-0 kubenswrapper[8731]: I1205 12:46:24.585023 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:24.585100 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:24.585100 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:24.585100 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:24.586039 master-0 kubenswrapper[8731]: I1205 12:46:24.585999 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:25.428329 master-0 kubenswrapper[8731]: E1205 12:46:25.428252 8731 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 05 12:46:25.585253 master-0 kubenswrapper[8731]: I1205 12:46:25.585154 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:25.585253 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:25.585253 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:25.585253 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:25.586003 master-0 kubenswrapper[8731]: I1205 12:46:25.585277 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:25.731251 master-0 kubenswrapper[8731]: I1205 12:46:25.731155 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"bca56bcd2d866b305199c7dd4a2eee615bb7722c74c3f021e5b1413c58454e2d"} Dec 05 12:46:26.585769 master-0 kubenswrapper[8731]: I1205 12:46:26.585652 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:26.585769 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:26.585769 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:26.585769 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:26.586548 master-0 kubenswrapper[8731]: I1205 12:46:26.585790 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:26.746913 master-0 kubenswrapper[8731]: I1205 12:46:26.746730 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"b3860dcf136009c692df523a643ebd872d14983cf881ae6cecf3b72bd4c343db"} Dec 05 12:46:26.746913 master-0 kubenswrapper[8731]: I1205 12:46:26.746821 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"835dec2af52fd7ec20588c9018988cf86c305d1989f96ea1549008fdb5e04109"} Dec 05 12:46:26.746913 master-0 kubenswrapper[8731]: I1205 12:46:26.746850 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"f10cf9fb0238be4d34d1001638012d731272864867405100db90e54fd1b0489b"} Dec 05 12:46:27.584413 master-0 kubenswrapper[8731]: I1205 12:46:27.584287 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:27.584413 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:27.584413 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:27.584413 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:27.584898 master-0 kubenswrapper[8731]: I1205 12:46:27.584407 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:27.762808 master-0 kubenswrapper[8731]: I1205 12:46:27.762700 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"34c662fa07de0c08cafe82dd42ae1e0359fa3bbfb03c3cbf9cbec7bb72517328"} Dec 05 12:46:27.763831 master-0 kubenswrapper[8731]: I1205 12:46:27.763328 8731 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="fde55ae0-2a24-4980-9ad8-db1079735b66" Dec 05 12:46:27.763831 master-0 kubenswrapper[8731]: I1205 12:46:27.763378 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" 
podUID="fde55ae0-2a24-4980-9ad8-db1079735b66" Dec 05 12:46:28.584989 master-0 kubenswrapper[8731]: I1205 12:46:28.584824 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:28.584989 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:28.584989 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:28.584989 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:28.584989 master-0 kubenswrapper[8731]: I1205 12:46:28.584945 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:28.935494 master-0 kubenswrapper[8731]: I1205 12:46:28.935349 8731 scope.go:117] "RemoveContainer" containerID="18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" Dec 05 12:46:28.935947 master-0 kubenswrapper[8731]: E1205 12:46:28.935871 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:46:29.567293 master-0 kubenswrapper[8731]: E1205 12:46:29.567197 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:46:29.567293 master-0 kubenswrapper[8731]: E1205 12:46:29.567238 8731 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 12:46:29.586201 master-0 kubenswrapper[8731]: I1205 12:46:29.586080 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:29.586201 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:29.586201 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:29.586201 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:29.586493 master-0 kubenswrapper[8731]: I1205 12:46:29.586219 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:30.585703 master-0 kubenswrapper[8731]: I1205 12:46:30.585552 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:46:30.585703 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:46:30.585703 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:46:30.585703 
master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:46:30.585703 master-0 kubenswrapper[8731]: I1205 12:46:30.585658 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:46:30.585703 master-0 kubenswrapper[8731]: I1205 12:46:30.585733 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:46:30.587659 master-0 kubenswrapper[8731]: I1205 12:46:30.586565 8731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"eb64e2421ed896450777b0f0f93d8bac59a879b9f30c7599b0e2a7c59b1f3be8"} pod="openshift-ingress/router-default-5465c8b4db-dzlmb" containerMessage="Container router failed startup probe, will be restarted" Dec 05 12:46:30.587659 master-0 kubenswrapper[8731]: I1205 12:46:30.586612 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" containerID="cri-o://eb64e2421ed896450777b0f0f93d8bac59a879b9f30c7599b0e2a7c59b1f3be8" gracePeriod=3600 Dec 05 12:46:31.935642 master-0 kubenswrapper[8731]: I1205 12:46:31.935568 8731 scope.go:117] "RemoveContainer" containerID="9777a8bc6b6304e63d985ec731f3cc644371b2e8a1b12c0286fe36fe4b312701" Dec 05 12:46:31.972076 master-0 kubenswrapper[8731]: I1205 12:46:31.971759 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Dec 05 12:46:31.972076 master-0 kubenswrapper[8731]: I1205 12:46:31.971823 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Dec 05 12:46:32.805710 master-0 kubenswrapper[8731]: I1205 12:46:32.805668 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/3.log" Dec 05 12:46:32.805944 master-0 kubenswrapper[8731]: I1205 12:46:32.805738 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" event={"ID":"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8","Type":"ContainerStarted","Data":"33a709d9e47c123942b76dd36410ef83571393334a41e347b73753c3f8332654"} Dec 05 12:46:34.472383 master-0 kubenswrapper[8731]: E1205 12:46:34.472286 8731 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 05 12:46:41.999471 master-0 kubenswrapper[8731]: I1205 12:46:41.999410 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Dec 05 12:46:42.936053 master-0 kubenswrapper[8731]: I1205 12:46:42.935900 8731 scope.go:117] "RemoveContainer" containerID="18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" Dec 05 12:46:42.936329 master-0 kubenswrapper[8731]: E1205 12:46:42.936267 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s 
restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:46:43.309412 master-0 kubenswrapper[8731]: I1205 12:46:43.309343 8731 status_manager.go:851] "Failed to get status for pod" podUID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods ingress-operator-8649c48786-7xrk6)" Dec 05 12:46:46.775380 master-0 kubenswrapper[8731]: I1205 12:46:46.775253 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:46:46.988442 master-0 kubenswrapper[8731]: I1205 12:46:46.988337 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Dec 05 12:46:49.789428 master-0 kubenswrapper[8731]: E1205 12:46:49.789337 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:46:39Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:46:39Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:46:39Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:46:39Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:46:53.045246 master-0 kubenswrapper[8731]: E1205 12:46:53.045098 8731 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{ingress-operator-8649c48786-7xrk6.187e5218f9cbbad4 openshift-ingress-operator 10932 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-ingress-operator,Name:ingress-operator-8649c48786-7xrk6,UID:a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7,APIVersion:v1,ResourceVersion:3549,FieldPath:spec.containers{ingress-operator},},Reason:BackOff,Message:Back-off restarting failed container ingress-operator in pod ingress-operator-8649c48786-7xrk6_openshift-ingress-operator(a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:39:31 +0000 UTC,LastTimestamp:2025-12-05 12:44:09.528067887 +0000 UTC m=+747.832052054,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:46:54.935319 master-0 
kubenswrapper[8731]: I1205 12:46:54.935245 8731 scope.go:117] "RemoveContainer" containerID="18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" Dec 05 12:46:54.936406 master-0 kubenswrapper[8731]: E1205 12:46:54.935511 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:46:59.790300 master-0 kubenswrapper[8731]: E1205 12:46:59.790164 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:47:01.766600 master-0 kubenswrapper[8731]: E1205 12:47:01.766536 8731 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 05 12:47:02.010605 master-0 kubenswrapper[8731]: I1205 12:47:02.010522 8731 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="fde55ae0-2a24-4980-9ad8-db1079735b66" Dec 05 12:47:02.010605 master-0 kubenswrapper[8731]: I1205 12:47:02.010572 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="fde55ae0-2a24-4980-9ad8-db1079735b66" Dec 05 12:47:03.019415 master-0 kubenswrapper[8731]: I1205 12:47:03.019362 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/4.log" Dec 05 12:47:03.020176 master-0 kubenswrapper[8731]: I1205 12:47:03.019946 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/3.log" Dec 05 12:47:03.020176 master-0 kubenswrapper[8731]: I1205 12:47:03.019995 8731 generic.go:334] "Generic (PLEG): container finished" podID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" containerID="33a709d9e47c123942b76dd36410ef83571393334a41e347b73753c3f8332654" exitCode=1 Dec 05 12:47:03.020176 master-0 kubenswrapper[8731]: I1205 12:47:03.020030 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" event={"ID":"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8","Type":"ContainerDied","Data":"33a709d9e47c123942b76dd36410ef83571393334a41e347b73753c3f8332654"} Dec 05 12:47:03.020176 master-0 kubenswrapper[8731]: I1205 12:47:03.020071 8731 scope.go:117] "RemoveContainer" containerID="9777a8bc6b6304e63d985ec731f3cc644371b2e8a1b12c0286fe36fe4b312701" Dec 05 12:47:03.020532 master-0 kubenswrapper[8731]: I1205 12:47:03.020484 8731 scope.go:117] "RemoveContainer" containerID="33a709d9e47c123942b76dd36410ef83571393334a41e347b73753c3f8332654" Dec 05 12:47:03.020708 master-0 kubenswrapper[8731]: E1205 12:47:03.020673 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller 
pod=csi-snapshot-controller-6b958b6f94-7r5wv_openshift-cluster-storage-operator(b9623eb8-55d2-4c5c-aa8d-74b6a27274d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" podUID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" Dec 05 12:47:04.029954 master-0 kubenswrapper[8731]: I1205 12:47:04.029906 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/4.log" Dec 05 12:47:07.935617 master-0 kubenswrapper[8731]: I1205 12:47:07.935571 8731 scope.go:117] "RemoveContainer" containerID="18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" Dec 05 12:47:07.936685 master-0 kubenswrapper[8731]: E1205 12:47:07.936600 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:47:09.791111 master-0 kubenswrapper[8731]: E1205 12:47:09.790991 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:47:13.935236 master-0 kubenswrapper[8731]: I1205 12:47:13.935172 8731 scope.go:117] "RemoveContainer" containerID="33a709d9e47c123942b76dd36410ef83571393334a41e347b73753c3f8332654" Dec 05 12:47:13.935791 master-0 kubenswrapper[8731]: E1205 12:47:13.935425 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-7r5wv_openshift-cluster-storage-operator(b9623eb8-55d2-4c5c-aa8d-74b6a27274d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" podUID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" Dec 05 12:47:17.120940 master-0 kubenswrapper[8731]: I1205 12:47:17.120894 8731 generic.go:334] "Generic (PLEG): container finished" podID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerID="eb64e2421ed896450777b0f0f93d8bac59a879b9f30c7599b0e2a7c59b1f3be8" exitCode=0 Dec 05 12:47:17.120940 master-0 kubenswrapper[8731]: I1205 12:47:17.120940 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" event={"ID":"20a72c8b-0f12-446b-8a42-53d98864c8f8","Type":"ContainerDied","Data":"eb64e2421ed896450777b0f0f93d8bac59a879b9f30c7599b0e2a7c59b1f3be8"} Dec 05 12:47:17.121581 master-0 kubenswrapper[8731]: I1205 12:47:17.120972 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" event={"ID":"20a72c8b-0f12-446b-8a42-53d98864c8f8","Type":"ContainerStarted","Data":"b7483a678d691fbf8a3207dd7d6ed1c739a3647a4a630049897502326cc17230"} Dec 05 12:47:17.121581 master-0 kubenswrapper[8731]: I1205 12:47:17.120996 8731 scope.go:117] "RemoveContainer" containerID="8fbcd680e597e847a58340dd3596ea3cc035b2de307cd72ebb1304a012ac892d" Dec 05 12:47:17.582870 master-0 kubenswrapper[8731]: I1205 12:47:17.582812 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:47:17.585996 master-0 kubenswrapper[8731]: I1205 12:47:17.585948 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:17.585996 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:17.585996 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:17.585996 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:17.586220 master-0 kubenswrapper[8731]: I1205 12:47:17.586029 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:18.584821 master-0 kubenswrapper[8731]: I1205 12:47:18.584745 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:18.584821 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:18.584821 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:18.584821 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:18.585611 master-0 kubenswrapper[8731]: I1205 12:47:18.584819 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:19.585636 master-0 kubenswrapper[8731]: I1205 12:47:19.585559 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:19.585636 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:19.585636 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:19.585636 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:19.586919 master-0 kubenswrapper[8731]: I1205 12:47:19.586317 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:19.791483 master-0 kubenswrapper[8731]: E1205 12:47:19.791367 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:47:19.935436 master-0 kubenswrapper[8731]: I1205 12:47:19.935165 8731 scope.go:117] "RemoveContainer" containerID="18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" Dec 05 12:47:19.935704 master-0 kubenswrapper[8731]: E1205 12:47:19.935489 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed 
container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:47:20.585047 master-0 kubenswrapper[8731]: I1205 12:47:20.584930 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:20.585047 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:20.585047 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:20.585047 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:20.585485 master-0 kubenswrapper[8731]: I1205 12:47:20.585068 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:21.586098 master-0 kubenswrapper[8731]: I1205 12:47:21.585996 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:21.586098 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:21.586098 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:21.586098 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:21.586774 master-0 kubenswrapper[8731]: I1205 12:47:21.586114 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:22.585119 master-0 kubenswrapper[8731]: I1205 12:47:22.585045 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:22.585119 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:22.585119 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:22.585119 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:22.585554 master-0 kubenswrapper[8731]: I1205 12:47:22.585148 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:23.583443 master-0 kubenswrapper[8731]: I1205 12:47:23.583324 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:47:23.587831 master-0 kubenswrapper[8731]: I1205 12:47:23.587763 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:23.587831 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:23.587831 master-0 
kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:23.587831 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:23.588592 master-0 kubenswrapper[8731]: I1205 12:47:23.588531 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:24.584832 master-0 kubenswrapper[8731]: I1205 12:47:24.584745 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:24.584832 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:24.584832 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:24.584832 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:24.584832 master-0 kubenswrapper[8731]: I1205 12:47:24.584824 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:25.585105 master-0 kubenswrapper[8731]: I1205 12:47:25.585000 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:25.585105 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:25.585105 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:25.585105 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:25.585105 master-0 kubenswrapper[8731]: I1205 12:47:25.585110 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:25.935253 master-0 kubenswrapper[8731]: I1205 12:47:25.935073 8731 scope.go:117] "RemoveContainer" containerID="33a709d9e47c123942b76dd36410ef83571393334a41e347b73753c3f8332654" Dec 05 12:47:25.935473 master-0 kubenswrapper[8731]: E1205 12:47:25.935384 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-7r5wv_openshift-cluster-storage-operator(b9623eb8-55d2-4c5c-aa8d-74b6a27274d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" podUID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" Dec 05 12:47:26.585414 master-0 kubenswrapper[8731]: I1205 12:47:26.585308 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:26.585414 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:26.585414 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:26.585414 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:26.586531 master-0 
kubenswrapper[8731]: I1205 12:47:26.585422 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:27.048498 master-0 kubenswrapper[8731]: E1205 12:47:27.048287 8731 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187e51bf3dba73a7 kube-system 10357 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:5e09e2af7200e6f9be469dbfd9bb1127,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:33:05 +0000 UTC,LastTimestamp:2025-12-05 12:44:09.93684855 +0000 UTC m=+748.240832727,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:47:27.585333 master-0 kubenswrapper[8731]: I1205 12:47:27.585223 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:27.585333 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:27.585333 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:27.585333 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:27.585958 master-0 kubenswrapper[8731]: I1205 12:47:27.585375 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:28.599052 master-0 kubenswrapper[8731]: I1205 12:47:28.598930 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:28.599052 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:28.599052 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:28.599052 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:28.599734 master-0 kubenswrapper[8731]: I1205 12:47:28.599063 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:29.584501 master-0 kubenswrapper[8731]: I1205 12:47:29.584426 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:29.584501 master-0 kubenswrapper[8731]: 
[-]has-synced failed: reason withheld Dec 05 12:47:29.584501 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:29.584501 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:29.584501 master-0 kubenswrapper[8731]: I1205 12:47:29.584481 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:29.792778 master-0 kubenswrapper[8731]: E1205 12:47:29.792394 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:47:29.793409 master-0 kubenswrapper[8731]: E1205 12:47:29.792773 8731 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 12:47:30.584987 master-0 kubenswrapper[8731]: I1205 12:47:30.584899 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:30.584987 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:30.584987 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:30.584987 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:30.585401 master-0 kubenswrapper[8731]: I1205 12:47:30.585007 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:31.585115 master-0 kubenswrapper[8731]: I1205 12:47:31.585018 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:31.585115 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:31.585115 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:31.585115 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:31.586377 master-0 kubenswrapper[8731]: I1205 12:47:31.585119 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:31.935716 master-0 kubenswrapper[8731]: I1205 12:47:31.935570 8731 scope.go:117] "RemoveContainer" containerID="18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" Dec 05 12:47:31.936499 master-0 kubenswrapper[8731]: E1205 12:47:31.936450 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:47:32.584418 master-0 kubenswrapper[8731]: 
I1205 12:47:32.584369 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:32.584418 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:32.584418 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:32.584418 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:32.584902 master-0 kubenswrapper[8731]: I1205 12:47:32.584871 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:33.586287 master-0 kubenswrapper[8731]: I1205 12:47:33.586214 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:33.586287 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:33.586287 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:33.586287 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:33.587155 master-0 kubenswrapper[8731]: I1205 12:47:33.586327 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:34.588300 master-0 kubenswrapper[8731]: I1205 12:47:34.588048 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:34.588300 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:34.588300 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:34.588300 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:34.588300 master-0 kubenswrapper[8731]: I1205 12:47:34.588161 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:35.592688 master-0 kubenswrapper[8731]: I1205 12:47:35.586719 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:35.592688 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:35.592688 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:35.592688 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:35.592688 master-0 kubenswrapper[8731]: I1205 12:47:35.586818 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:36.013656 master-0 
kubenswrapper[8731]: E1205 12:47:36.013583 8731 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 05 12:47:36.586109 master-0 kubenswrapper[8731]: I1205 12:47:36.585994 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:36.586109 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:36.586109 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:36.586109 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:36.586511 master-0 kubenswrapper[8731]: I1205 12:47:36.586160 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:37.586358 master-0 kubenswrapper[8731]: I1205 12:47:37.586282 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:37.586358 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:37.586358 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:37.586358 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:37.587263 master-0 kubenswrapper[8731]: I1205 12:47:37.586381 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:38.585125 master-0 kubenswrapper[8731]: I1205 12:47:38.585023 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:38.585125 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:38.585125 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:38.585125 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:38.585633 master-0 kubenswrapper[8731]: I1205 12:47:38.585131 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:38.935226 master-0 kubenswrapper[8731]: I1205 12:47:38.935035 8731 scope.go:117] "RemoveContainer" containerID="33a709d9e47c123942b76dd36410ef83571393334a41e347b73753c3f8332654" Dec 05 12:47:38.936353 master-0 kubenswrapper[8731]: E1205 12:47:38.936320 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-7r5wv_openshift-cluster-storage-operator(b9623eb8-55d2-4c5c-aa8d-74b6a27274d8)\"" 
pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" podUID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" Dec 05 12:47:39.585304 master-0 kubenswrapper[8731]: I1205 12:47:39.585229 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:39.585304 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:39.585304 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:39.585304 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:39.585652 master-0 kubenswrapper[8731]: I1205 12:47:39.585330 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:40.585248 master-0 kubenswrapper[8731]: I1205 12:47:40.585097 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:40.585248 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:40.585248 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:40.585248 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:40.585248 master-0 kubenswrapper[8731]: I1205 12:47:40.585199 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:41.585866 master-0 kubenswrapper[8731]: I1205 12:47:41.585742 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:41.585866 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:41.585866 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:41.585866 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:41.585866 master-0 kubenswrapper[8731]: I1205 12:47:41.585861 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:42.584658 master-0 kubenswrapper[8731]: I1205 12:47:42.584562 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:42.584658 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:42.584658 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:42.584658 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:42.585042 master-0 kubenswrapper[8731]: I1205 12:47:42.584687 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" 
podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:43.312050 master-0 kubenswrapper[8731]: I1205 12:47:43.311901 8731 status_manager.go:851] "Failed to get status for pod" podUID="b8233dad-bd19-4842-a4d5-cfa84f1feb83" pod="openshift-network-node-identity/network-node-identity-xwx26" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods network-node-identity-xwx26)" Dec 05 12:47:43.588069 master-0 kubenswrapper[8731]: I1205 12:47:43.587856 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:43.588069 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:43.588069 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:43.588069 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:43.589332 master-0 kubenswrapper[8731]: I1205 12:47:43.589236 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:44.584928 master-0 kubenswrapper[8731]: I1205 12:47:44.584850 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:44.584928 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:44.584928 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:44.584928 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:44.585910 master-0 kubenswrapper[8731]: I1205 12:47:44.584950 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:44.935947 master-0 kubenswrapper[8731]: I1205 12:47:44.935762 8731 scope.go:117] "RemoveContainer" containerID="18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" Dec 05 12:47:44.936248 master-0 kubenswrapper[8731]: E1205 12:47:44.936155 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:47:45.586007 master-0 kubenswrapper[8731]: I1205 12:47:45.585934 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:45.586007 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:45.586007 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:45.586007 master-0 kubenswrapper[8731]: healthz check 
failed Dec 05 12:47:45.587377 master-0 kubenswrapper[8731]: I1205 12:47:45.586037 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:46.584623 master-0 kubenswrapper[8731]: I1205 12:47:46.584548 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:46.584623 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:46.584623 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:46.584623 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:46.585025 master-0 kubenswrapper[8731]: I1205 12:47:46.584637 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:47.585664 master-0 kubenswrapper[8731]: I1205 12:47:47.585616 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:47.585664 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:47.585664 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:47.585664 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:47.586404 master-0 kubenswrapper[8731]: I1205 12:47:47.586312 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:48.584767 master-0 kubenswrapper[8731]: I1205 12:47:48.584669 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:48.584767 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:48.584767 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:48.584767 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:48.584767 master-0 kubenswrapper[8731]: I1205 12:47:48.584738 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:49.585039 master-0 kubenswrapper[8731]: I1205 12:47:49.584945 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:49.585039 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:49.585039 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:49.585039 master-0 kubenswrapper[8731]: healthz 
check failed Dec 05 12:47:49.585829 master-0 kubenswrapper[8731]: I1205 12:47:49.585046 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:50.585378 master-0 kubenswrapper[8731]: I1205 12:47:50.585310 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:50.585378 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:50.585378 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:50.585378 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:50.585993 master-0 kubenswrapper[8731]: I1205 12:47:50.585405 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:50.936054 master-0 kubenswrapper[8731]: I1205 12:47:50.935890 8731 scope.go:117] "RemoveContainer" containerID="33a709d9e47c123942b76dd36410ef83571393334a41e347b73753c3f8332654" Dec 05 12:47:50.936332 master-0 kubenswrapper[8731]: E1205 12:47:50.936218 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-7r5wv_openshift-cluster-storage-operator(b9623eb8-55d2-4c5c-aa8d-74b6a27274d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" podUID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" Dec 05 12:47:51.584843 master-0 kubenswrapper[8731]: I1205 12:47:51.584750 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:51.584843 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:51.584843 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:51.584843 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:51.585241 master-0 kubenswrapper[8731]: I1205 12:47:51.584843 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:52.584682 master-0 kubenswrapper[8731]: I1205 12:47:52.584575 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:52.584682 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:52.584682 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:52.584682 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:52.584682 master-0 kubenswrapper[8731]: I1205 12:47:52.584667 8731 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:53.584368 master-0 kubenswrapper[8731]: I1205 12:47:53.584261 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:53.584368 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:53.584368 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:53.584368 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:53.584714 master-0 kubenswrapper[8731]: I1205 12:47:53.584383 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:54.584918 master-0 kubenswrapper[8731]: I1205 12:47:54.584816 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:54.584918 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:54.584918 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:54.584918 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:54.585635 master-0 kubenswrapper[8731]: I1205 12:47:54.584950 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:55.585654 master-0 kubenswrapper[8731]: I1205 12:47:55.585513 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:55.585654 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:55.585654 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:55.585654 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:55.585654 master-0 kubenswrapper[8731]: I1205 12:47:55.585630 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:56.584544 master-0 kubenswrapper[8731]: I1205 12:47:56.584461 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:56.584544 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:56.584544 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:56.584544 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:56.584837 master-0 kubenswrapper[8731]: I1205 12:47:56.584555 8731 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:56.935101 master-0 kubenswrapper[8731]: I1205 12:47:56.934889 8731 scope.go:117] "RemoveContainer" containerID="18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" Dec 05 12:47:56.936106 master-0 kubenswrapper[8731]: E1205 12:47:56.935283 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:47:57.586159 master-0 kubenswrapper[8731]: I1205 12:47:57.586085 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:57.586159 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:57.586159 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:57.586159 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:57.586551 master-0 kubenswrapper[8731]: I1205 12:47:57.586211 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:58.584906 master-0 kubenswrapper[8731]: I1205 12:47:58.584793 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:58.584906 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:58.584906 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:58.584906 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:58.586021 master-0 kubenswrapper[8731]: I1205 12:47:58.584909 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:47:59.584895 master-0 kubenswrapper[8731]: I1205 12:47:59.584828 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:47:59.584895 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:47:59.584895 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:47:59.584895 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:47:59.585741 master-0 kubenswrapper[8731]: I1205 12:47:59.584910 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Dec 05 12:48:00.585246 master-0 kubenswrapper[8731]: I1205 12:48:00.585141 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:00.585246 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:00.585246 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:00.585246 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:00.586310 master-0 kubenswrapper[8731]: I1205 12:48:00.585261 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:01.584860 master-0 kubenswrapper[8731]: I1205 12:48:01.584733 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:01.584860 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:01.584860 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:01.584860 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:01.586779 master-0 kubenswrapper[8731]: I1205 12:48:01.584863 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:02.584418 master-0 kubenswrapper[8731]: I1205 12:48:02.584325 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:02.584418 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:02.584418 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:02.584418 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:02.584418 master-0 kubenswrapper[8731]: I1205 12:48:02.584416 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:03.584693 master-0 kubenswrapper[8731]: I1205 12:48:03.584611 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:03.584693 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:03.584693 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:03.584693 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:03.585732 master-0 kubenswrapper[8731]: I1205 12:48:03.584705 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed 
with statuscode: 500" Dec 05 12:48:03.936309 master-0 kubenswrapper[8731]: I1205 12:48:03.934984 8731 scope.go:117] "RemoveContainer" containerID="33a709d9e47c123942b76dd36410ef83571393334a41e347b73753c3f8332654" Dec 05 12:48:03.936309 master-0 kubenswrapper[8731]: E1205 12:48:03.935298 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-7r5wv_openshift-cluster-storage-operator(b9623eb8-55d2-4c5c-aa8d-74b6a27274d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" podUID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" Dec 05 12:48:04.585045 master-0 kubenswrapper[8731]: I1205 12:48:04.584968 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:04.585045 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:04.585045 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:04.585045 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:04.585799 master-0 kubenswrapper[8731]: I1205 12:48:04.585058 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:05.584547 master-0 kubenswrapper[8731]: I1205 12:48:05.584487 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:05.584547 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:05.584547 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:05.584547 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:05.585006 master-0 kubenswrapper[8731]: I1205 12:48:05.584973 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:06.586100 master-0 kubenswrapper[8731]: I1205 12:48:06.585981 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:06.586100 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:06.586100 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:06.586100 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:06.586100 master-0 kubenswrapper[8731]: I1205 12:48:06.586086 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:07.584226 master-0 kubenswrapper[8731]: I1205 12:48:07.584119 8731 patch_prober.go:28] interesting 
pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:07.584226 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:07.584226 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:07.584226 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:07.584636 master-0 kubenswrapper[8731]: I1205 12:48:07.584257 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:08.586254 master-0 kubenswrapper[8731]: I1205 12:48:08.584650 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:08.586254 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:08.586254 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:08.586254 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:08.586254 master-0 kubenswrapper[8731]: I1205 12:48:08.584756 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:09.584274 master-0 kubenswrapper[8731]: I1205 12:48:09.584124 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:09.584274 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:09.584274 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:09.584274 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:09.584274 master-0 kubenswrapper[8731]: I1205 12:48:09.584246 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:10.566507 master-0 kubenswrapper[8731]: E1205 12:48:10.566427 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:48:00Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:48:00Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:48:00Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:48:00Z\\\",\\\"type\\\":\\\"Ready\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:48:10.583879 master-0 
kubenswrapper[8731]: I1205 12:48:10.583792 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:10.583879 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:10.583879 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:10.583879 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:10.584232 master-0 kubenswrapper[8731]: I1205 12:48:10.583934 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:11.585924 master-0 kubenswrapper[8731]: I1205 12:48:11.585804 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:11.585924 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:11.585924 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:11.585924 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:11.587062 master-0 kubenswrapper[8731]: I1205 12:48:11.585946 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:11.936117 master-0 kubenswrapper[8731]: I1205 12:48:11.935913 8731 scope.go:117] "RemoveContainer" containerID="18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" Dec 05 12:48:11.936487 master-0 kubenswrapper[8731]: E1205 12:48:11.936330 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(8b47694fcc32464ab24d09c23d6efb57)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" Dec 05 12:48:12.584929 master-0 kubenswrapper[8731]: I1205 12:48:12.584837 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:12.584929 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:12.584929 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:12.584929 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:12.584929 master-0 kubenswrapper[8731]: I1205 12:48:12.584896 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:13.586557 master-0 kubenswrapper[8731]: I1205 12:48:13.586442 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP 
probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:13.586557 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:13.586557 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:13.586557 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:13.586557 master-0 kubenswrapper[8731]: I1205 12:48:13.586543 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:14.585557 master-0 kubenswrapper[8731]: I1205 12:48:14.585438 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:14.585557 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:14.585557 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:14.585557 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:14.585557 master-0 kubenswrapper[8731]: I1205 12:48:14.585546 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:15.584500 master-0 kubenswrapper[8731]: I1205 12:48:15.584386 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:15.584500 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:15.584500 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:15.584500 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:15.584500 master-0 kubenswrapper[8731]: I1205 12:48:15.584484 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:16.585730 master-0 kubenswrapper[8731]: I1205 12:48:16.585646 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:16.585730 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:16.585730 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:16.585730 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:16.586454 master-0 kubenswrapper[8731]: I1205 12:48:16.585746 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:16.934900 master-0 kubenswrapper[8731]: I1205 12:48:16.934706 8731 scope.go:117] "RemoveContainer" containerID="33a709d9e47c123942b76dd36410ef83571393334a41e347b73753c3f8332654" Dec 05 12:48:16.935258 master-0 
kubenswrapper[8731]: E1205 12:48:16.935096 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6b958b6f94-7r5wv_openshift-cluster-storage-operator(b9623eb8-55d2-4c5c-aa8d-74b6a27274d8)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" podUID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" Dec 05 12:48:17.585067 master-0 kubenswrapper[8731]: I1205 12:48:17.585001 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:17.585067 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:17.585067 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:17.585067 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:17.585529 master-0 kubenswrapper[8731]: I1205 12:48:17.585079 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:18.585781 master-0 kubenswrapper[8731]: I1205 12:48:18.585692 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:18.585781 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:18.585781 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:18.585781 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:18.586988 master-0 kubenswrapper[8731]: I1205 12:48:18.585811 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:19.585642 master-0 kubenswrapper[8731]: I1205 12:48:19.585534 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:19.585642 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:19.585642 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:19.585642 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:19.586617 master-0 kubenswrapper[8731]: I1205 12:48:19.585642 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:20.567261 master-0 kubenswrapper[8731]: E1205 12:48:20.566944 8731 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 12:48:20.584504 master-0 
kubenswrapper[8731]: I1205 12:48:20.584387 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:20.584504 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:20.584504 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:20.584504 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:20.584504 master-0 kubenswrapper[8731]: I1205 12:48:20.584497 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:21.585128 master-0 kubenswrapper[8731]: I1205 12:48:21.585033 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:21.585128 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:21.585128 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:21.585128 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:21.585128 master-0 kubenswrapper[8731]: I1205 12:48:21.585133 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:22.357642 master-0 kubenswrapper[8731]: I1205 12:48:22.355378 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Dec 05 12:48:22.357642 master-0 kubenswrapper[8731]: E1205 12:48:22.355853 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2415969-33ad-418b-9df0-4a6c7bb279db" containerName="installer" Dec 05 12:48:22.357642 master-0 kubenswrapper[8731]: I1205 12:48:22.355870 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2415969-33ad-418b-9df0-4a6c7bb279db" containerName="installer" Dec 05 12:48:22.357642 master-0 kubenswrapper[8731]: E1205 12:48:22.355882 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af196a48-6fcc-47d1-95ac-7c0acd63dd21" containerName="installer" Dec 05 12:48:22.357642 master-0 kubenswrapper[8731]: I1205 12:48:22.355888 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="af196a48-6fcc-47d1-95ac-7c0acd63dd21" containerName="installer" Dec 05 12:48:22.357642 master-0 kubenswrapper[8731]: I1205 12:48:22.356014 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2415969-33ad-418b-9df0-4a6c7bb279db" containerName="installer" Dec 05 12:48:22.357642 master-0 kubenswrapper[8731]: I1205 12:48:22.356039 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="af196a48-6fcc-47d1-95ac-7c0acd63dd21" containerName="installer" Dec 05 12:48:22.357642 master-0 kubenswrapper[8731]: I1205 12:48:22.356673 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Dec 05 12:48:22.359658 master-0 kubenswrapper[8731]: I1205 12:48:22.359623 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-bv9jw" Dec 05 12:48:22.361161 master-0 kubenswrapper[8731]: I1205 12:48:22.361106 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Dec 05 12:48:22.363157 master-0 kubenswrapper[8731]: I1205 12:48:22.363125 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Dec 05 12:48:22.456953 master-0 kubenswrapper[8731]: I1205 12:48:22.456810 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/21de9318-06b4-42ba-8791-6d22055a04f2-var-lock\") pod \"installer-5-master-0\" (UID: \"21de9318-06b4-42ba-8791-6d22055a04f2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 05 12:48:22.457242 master-0 kubenswrapper[8731]: I1205 12:48:22.457019 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21de9318-06b4-42ba-8791-6d22055a04f2-kube-api-access\") pod \"installer-5-master-0\" (UID: \"21de9318-06b4-42ba-8791-6d22055a04f2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 05 12:48:22.457308 master-0 kubenswrapper[8731]: I1205 12:48:22.457233 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21de9318-06b4-42ba-8791-6d22055a04f2-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"21de9318-06b4-42ba-8791-6d22055a04f2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 05 12:48:22.558752 master-0 kubenswrapper[8731]: I1205 12:48:22.558656 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/21de9318-06b4-42ba-8791-6d22055a04f2-var-lock\") pod \"installer-5-master-0\" (UID: \"21de9318-06b4-42ba-8791-6d22055a04f2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 05 12:48:22.559000 master-0 kubenswrapper[8731]: I1205 12:48:22.558794 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/21de9318-06b4-42ba-8791-6d22055a04f2-var-lock\") pod \"installer-5-master-0\" (UID: \"21de9318-06b4-42ba-8791-6d22055a04f2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 05 12:48:22.559000 master-0 kubenswrapper[8731]: I1205 12:48:22.558832 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21de9318-06b4-42ba-8791-6d22055a04f2-kube-api-access\") pod \"installer-5-master-0\" (UID: \"21de9318-06b4-42ba-8791-6d22055a04f2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 05 12:48:22.559000 master-0 kubenswrapper[8731]: I1205 12:48:22.558907 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21de9318-06b4-42ba-8791-6d22055a04f2-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"21de9318-06b4-42ba-8791-6d22055a04f2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 05 12:48:22.559118 master-0 kubenswrapper[8731]: I1205 12:48:22.559079 8731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21de9318-06b4-42ba-8791-6d22055a04f2-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"21de9318-06b4-42ba-8791-6d22055a04f2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 05 12:48:22.585381 master-0 kubenswrapper[8731]: I1205 12:48:22.585291 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:22.585381 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:22.585381 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:22.585381 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:22.585882 master-0 kubenswrapper[8731]: I1205 12:48:22.585386 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:23.337915 master-0 kubenswrapper[8731]: I1205 12:48:23.337830 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21de9318-06b4-42ba-8791-6d22055a04f2-kube-api-access\") pod \"installer-5-master-0\" (UID: \"21de9318-06b4-42ba-8791-6d22055a04f2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 05 12:48:23.453277 master-0 kubenswrapper[8731]: I1205 12:48:23.453159 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cflj7"] Dec 05 12:48:23.455052 master-0 kubenswrapper[8731]: I1205 12:48:23.455012 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:23.462224 master-0 kubenswrapper[8731]: I1205 12:48:23.462105 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vljsk"] Dec 05 12:48:23.464115 master-0 kubenswrapper[8731]: I1205 12:48:23.464074 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:23.503369 master-0 kubenswrapper[8731]: I1205 12:48:23.503271 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cflj7"] Dec 05 12:48:23.505107 master-0 kubenswrapper[8731]: I1205 12:48:23.505012 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vljsk"] Dec 05 12:48:23.575943 master-0 kubenswrapper[8731]: I1205 12:48:23.575500 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/270f4b35-a279-4edd-9f6d-cd42d22730d4-utilities\") pod \"redhat-marketplace-vljsk\" (UID: \"270f4b35-a279-4edd-9f6d-cd42d22730d4\") " pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:23.575943 master-0 kubenswrapper[8731]: I1205 12:48:23.575602 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knnqs\" (UniqueName: \"kubernetes.io/projected/270f4b35-a279-4edd-9f6d-cd42d22730d4-kube-api-access-knnqs\") pod \"redhat-marketplace-vljsk\" (UID: \"270f4b35-a279-4edd-9f6d-cd42d22730d4\") " pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:23.575943 master-0 kubenswrapper[8731]: I1205 12:48:23.575638 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77883ef6-1901-4be4-89e8-4d6ecfe766df-utilities\") pod \"redhat-operators-cflj7\" (UID: \"77883ef6-1901-4be4-89e8-4d6ecfe766df\") " pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:23.575943 master-0 kubenswrapper[8731]: I1205 12:48:23.575667 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77883ef6-1901-4be4-89e8-4d6ecfe766df-catalog-content\") pod \"redhat-operators-cflj7\" (UID: \"77883ef6-1901-4be4-89e8-4d6ecfe766df\") " pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:23.575943 master-0 kubenswrapper[8731]: I1205 12:48:23.575735 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/270f4b35-a279-4edd-9f6d-cd42d22730d4-catalog-content\") pod \"redhat-marketplace-vljsk\" (UID: \"270f4b35-a279-4edd-9f6d-cd42d22730d4\") " pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:23.575943 master-0 kubenswrapper[8731]: I1205 12:48:23.575779 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8wng\" (UniqueName: \"kubernetes.io/projected/77883ef6-1901-4be4-89e8-4d6ecfe766df-kube-api-access-z8wng\") pod \"redhat-operators-cflj7\" (UID: \"77883ef6-1901-4be4-89e8-4d6ecfe766df\") " pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:23.579642 master-0 kubenswrapper[8731]: I1205 12:48:23.579583 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Dec 05 12:48:23.588870 master-0 kubenswrapper[8731]: I1205 12:48:23.587978 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:23.588870 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:23.588870 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:23.588870 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:23.588870 master-0 kubenswrapper[8731]: I1205 12:48:23.588082 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:23.687205 master-0 kubenswrapper[8731]: I1205 12:48:23.684637 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77883ef6-1901-4be4-89e8-4d6ecfe766df-utilities\") pod \"redhat-operators-cflj7\" (UID: \"77883ef6-1901-4be4-89e8-4d6ecfe766df\") " pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:23.687205 master-0 kubenswrapper[8731]: I1205 12:48:23.684710 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knnqs\" (UniqueName: \"kubernetes.io/projected/270f4b35-a279-4edd-9f6d-cd42d22730d4-kube-api-access-knnqs\") pod \"redhat-marketplace-vljsk\" (UID: \"270f4b35-a279-4edd-9f6d-cd42d22730d4\") " pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:23.687205 master-0 kubenswrapper[8731]: I1205 12:48:23.684748 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77883ef6-1901-4be4-89e8-4d6ecfe766df-catalog-content\") pod \"redhat-operators-cflj7\" (UID: \"77883ef6-1901-4be4-89e8-4d6ecfe766df\") " pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:23.687205 master-0 kubenswrapper[8731]: I1205 12:48:23.684817 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/270f4b35-a279-4edd-9f6d-cd42d22730d4-catalog-content\") pod \"redhat-marketplace-vljsk\" (UID: \"270f4b35-a279-4edd-9f6d-cd42d22730d4\") " pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:23.687205 master-0 kubenswrapper[8731]: I1205 12:48:23.684865 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8wng\" (UniqueName: \"kubernetes.io/projected/77883ef6-1901-4be4-89e8-4d6ecfe766df-kube-api-access-z8wng\") pod \"redhat-operators-cflj7\" (UID: \"77883ef6-1901-4be4-89e8-4d6ecfe766df\") " pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:23.687205 master-0 kubenswrapper[8731]: I1205 12:48:23.684903 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/270f4b35-a279-4edd-9f6d-cd42d22730d4-utilities\") pod \"redhat-marketplace-vljsk\" (UID: \"270f4b35-a279-4edd-9f6d-cd42d22730d4\") " pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:23.687205 master-0 kubenswrapper[8731]: I1205 12:48:23.685508 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/270f4b35-a279-4edd-9f6d-cd42d22730d4-utilities\") pod \"redhat-marketplace-vljsk\" (UID: \"270f4b35-a279-4edd-9f6d-cd42d22730d4\") " pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:23.687205 master-0 kubenswrapper[8731]: I1205 12:48:23.685705 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77883ef6-1901-4be4-89e8-4d6ecfe766df-catalog-content\") pod \"redhat-operators-cflj7\" (UID: \"77883ef6-1901-4be4-89e8-4d6ecfe766df\") " pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:23.687205 master-0 kubenswrapper[8731]: I1205 12:48:23.685809 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/270f4b35-a279-4edd-9f6d-cd42d22730d4-catalog-content\") pod \"redhat-marketplace-vljsk\" (UID: \"270f4b35-a279-4edd-9f6d-cd42d22730d4\") " pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:23.687205 master-0 kubenswrapper[8731]: I1205 12:48:23.685960 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77883ef6-1901-4be4-89e8-4d6ecfe766df-utilities\") pod \"redhat-operators-cflj7\" (UID: \"77883ef6-1901-4be4-89e8-4d6ecfe766df\") " pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:23.708213 master-0 kubenswrapper[8731]: I1205 12:48:23.705294 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8wng\" (UniqueName: \"kubernetes.io/projected/77883ef6-1901-4be4-89e8-4d6ecfe766df-kube-api-access-z8wng\") pod \"redhat-operators-cflj7\" (UID: \"77883ef6-1901-4be4-89e8-4d6ecfe766df\") " pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:23.712700 master-0 kubenswrapper[8731]: I1205 12:48:23.710283 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knnqs\" (UniqueName: \"kubernetes.io/projected/270f4b35-a279-4edd-9f6d-cd42d22730d4-kube-api-access-knnqs\") pod \"redhat-marketplace-vljsk\" (UID: \"270f4b35-a279-4edd-9f6d-cd42d22730d4\") " pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:23.786684 master-0 kubenswrapper[8731]: I1205 12:48:23.786608 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:23.796075 master-0 kubenswrapper[8731]: I1205 12:48:23.795922 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:23.934568 master-0 kubenswrapper[8731]: I1205 12:48:23.934506 8731 scope.go:117] "RemoveContainer" containerID="18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" Dec 05 12:48:24.585210 master-0 kubenswrapper[8731]: I1205 12:48:24.585140 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:24.585210 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:24.585210 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:24.585210 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:24.585528 master-0 kubenswrapper[8731]: I1205 12:48:24.585241 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:25.587253 master-0 kubenswrapper[8731]: I1205 12:48:25.586833 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:25.587253 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:25.587253 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:25.587253 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:25.587253 master-0 kubenswrapper[8731]: I1205 12:48:25.586948 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:25.828678 master-0 kubenswrapper[8731]: I1205 12:48:25.828561 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Dec 05 12:48:25.851545 master-0 kubenswrapper[8731]: I1205 12:48:25.851448 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vljsk"] Dec 05 12:48:25.864669 master-0 kubenswrapper[8731]: I1205 12:48:25.864611 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cflj7"] Dec 05 12:48:26.191168 master-0 kubenswrapper[8731]: I1205 12:48:26.191084 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tfcdn"] Dec 05 12:48:26.198173 master-0 kubenswrapper[8731]: I1205 12:48:26.193019 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tfcdn"] Dec 05 12:48:26.198173 master-0 kubenswrapper[8731]: I1205 12:48:26.193059 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:26.346416 master-0 kubenswrapper[8731]: I1205 12:48:26.346363 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7rhc\" (UniqueName: \"kubernetes.io/projected/3285cb4d-9385-4d55-96a7-567d01bce0d6-kube-api-access-k7rhc\") pod \"certified-operators-tfcdn\" (UID: \"3285cb4d-9385-4d55-96a7-567d01bce0d6\") " pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:26.346593 master-0 kubenswrapper[8731]: I1205 12:48:26.346426 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3285cb4d-9385-4d55-96a7-567d01bce0d6-utilities\") pod \"certified-operators-tfcdn\" (UID: \"3285cb4d-9385-4d55-96a7-567d01bce0d6\") " pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:26.346593 master-0 kubenswrapper[8731]: I1205 12:48:26.346514 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3285cb4d-9385-4d55-96a7-567d01bce0d6-catalog-content\") pod \"certified-operators-tfcdn\" (UID: \"3285cb4d-9385-4d55-96a7-567d01bce0d6\") " pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:26.448529 master-0 kubenswrapper[8731]: I1205 12:48:26.448466 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7rhc\" (UniqueName: \"kubernetes.io/projected/3285cb4d-9385-4d55-96a7-567d01bce0d6-kube-api-access-k7rhc\") pod \"certified-operators-tfcdn\" (UID: \"3285cb4d-9385-4d55-96a7-567d01bce0d6\") " pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:26.448529 master-0 kubenswrapper[8731]: I1205 12:48:26.448534 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3285cb4d-9385-4d55-96a7-567d01bce0d6-utilities\") pod \"certified-operators-tfcdn\" (UID: \"3285cb4d-9385-4d55-96a7-567d01bce0d6\") " pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:26.448835 master-0 kubenswrapper[8731]: I1205 12:48:26.448560 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3285cb4d-9385-4d55-96a7-567d01bce0d6-catalog-content\") pod \"certified-operators-tfcdn\" (UID: \"3285cb4d-9385-4d55-96a7-567d01bce0d6\") " pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:26.449168 master-0 kubenswrapper[8731]: I1205 12:48:26.449122 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3285cb4d-9385-4d55-96a7-567d01bce0d6-utilities\") pod \"certified-operators-tfcdn\" (UID: \"3285cb4d-9385-4d55-96a7-567d01bce0d6\") " pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:26.449287 master-0 kubenswrapper[8731]: I1205 12:48:26.449254 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3285cb4d-9385-4d55-96a7-567d01bce0d6-catalog-content\") pod \"certified-operators-tfcdn\" (UID: \"3285cb4d-9385-4d55-96a7-567d01bce0d6\") " pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:26.495628 master-0 kubenswrapper[8731]: I1205 12:48:26.495558 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7rhc\" 
(UniqueName: \"kubernetes.io/projected/3285cb4d-9385-4d55-96a7-567d01bce0d6-kube-api-access-k7rhc\") pod \"certified-operators-tfcdn\" (UID: \"3285cb4d-9385-4d55-96a7-567d01bce0d6\") " pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:26.587736 master-0 kubenswrapper[8731]: I1205 12:48:26.587633 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:26.587736 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:26.587736 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:26.587736 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:26.588435 master-0 kubenswrapper[8731]: I1205 12:48:26.587767 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:26.649312 master-0 kubenswrapper[8731]: I1205 12:48:26.649257 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:26.650486 master-0 kubenswrapper[8731]: I1205 12:48:26.650425 8731 generic.go:334] "Generic (PLEG): container finished" podID="270f4b35-a279-4edd-9f6d-cd42d22730d4" containerID="9d35bc057d8b64cde9727c67ea09bdd1ba0ff48b8286cc11d823030f3db0bb64" exitCode=0 Dec 05 12:48:26.650594 master-0 kubenswrapper[8731]: I1205 12:48:26.650530 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vljsk" event={"ID":"270f4b35-a279-4edd-9f6d-cd42d22730d4","Type":"ContainerDied","Data":"9d35bc057d8b64cde9727c67ea09bdd1ba0ff48b8286cc11d823030f3db0bb64"} Dec 05 12:48:26.650594 master-0 kubenswrapper[8731]: I1205 12:48:26.650587 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vljsk" event={"ID":"270f4b35-a279-4edd-9f6d-cd42d22730d4","Type":"ContainerStarted","Data":"d289281125e97f9e5bec8733eaa93516e3b4640f9c10f9824b836457154aba5b"} Dec 05 12:48:26.651995 master-0 kubenswrapper[8731]: I1205 12:48:26.651947 8731 generic.go:334] "Generic (PLEG): container finished" podID="77883ef6-1901-4be4-89e8-4d6ecfe766df" containerID="a451942cf751c4001ecb45db03b8fc780e5782dcbaf61076816e785c286692e2" exitCode=0 Dec 05 12:48:26.652088 master-0 kubenswrapper[8731]: I1205 12:48:26.652017 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cflj7" event={"ID":"77883ef6-1901-4be4-89e8-4d6ecfe766df","Type":"ContainerDied","Data":"a451942cf751c4001ecb45db03b8fc780e5782dcbaf61076816e785c286692e2"} Dec 05 12:48:26.652088 master-0 kubenswrapper[8731]: I1205 12:48:26.652036 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cflj7" event={"ID":"77883ef6-1901-4be4-89e8-4d6ecfe766df","Type":"ContainerStarted","Data":"e1fb3155f0a2c155a57ecfb157c92106694c7be09a5b743d900d93cd9f503437"} Dec 05 12:48:26.655393 master-0 kubenswrapper[8731]: I1205 12:48:26.655353 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"8b47694fcc32464ab24d09c23d6efb57","Type":"ContainerStarted","Data":"3287f56a58ec6df79eb961042eccb67f5309daab6cc145e4e1caa74cca9833e8"} Dec 05 
12:48:26.656712 master-0 kubenswrapper[8731]: I1205 12:48:26.656669 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"21de9318-06b4-42ba-8791-6d22055a04f2","Type":"ContainerStarted","Data":"a6eeacf32c540b469027d242ad82a84ffbe4f8b8381d45f48601d0197961c30d"} Dec 05 12:48:26.656712 master-0 kubenswrapper[8731]: I1205 12:48:26.656699 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"21de9318-06b4-42ba-8791-6d22055a04f2","Type":"ContainerStarted","Data":"6cb38a8f7e475b51ec4e82d15e81123c84bcaa6f22b937b869d4c561cbe1b95c"} Dec 05 12:48:26.659007 master-0 kubenswrapper[8731]: I1205 12:48:26.658916 8731 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 12:48:26.700251 master-0 kubenswrapper[8731]: I1205 12:48:26.700091 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-5-master-0" podStartSLOduration=4.700070423 podStartE2EDuration="4.700070423s" podCreationTimestamp="2025-12-05 12:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:48:26.699892258 +0000 UTC m=+1005.003876435" watchObservedRunningTime="2025-12-05 12:48:26.700070423 +0000 UTC m=+1005.004054600" Dec 05 12:48:27.078842 master-0 kubenswrapper[8731]: I1205 12:48:27.078784 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tfcdn"] Dec 05 12:48:27.086858 master-0 kubenswrapper[8731]: W1205 12:48:27.086795 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3285cb4d_9385_4d55_96a7_567d01bce0d6.slice/crio-34f2b4ab09473d6dcd0b09d0edf008dd7e6280048603488b5d25a0a3ba9e2ed1 WatchSource:0}: Error finding container 34f2b4ab09473d6dcd0b09d0edf008dd7e6280048603488b5d25a0a3ba9e2ed1: Status 404 returned error can't find the container with id 34f2b4ab09473d6dcd0b09d0edf008dd7e6280048603488b5d25a0a3ba9e2ed1 Dec 05 12:48:27.584617 master-0 kubenswrapper[8731]: I1205 12:48:27.584535 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:27.584617 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:27.584617 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:27.584617 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:27.584865 master-0 kubenswrapper[8731]: I1205 12:48:27.584644 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:27.665264 master-0 kubenswrapper[8731]: I1205 12:48:27.665212 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cflj7" event={"ID":"77883ef6-1901-4be4-89e8-4d6ecfe766df","Type":"ContainerStarted","Data":"fc68b79c809801da937a11bd9082b48f1f7b556a62f0b79db326ab28a391ad1d"} Dec 05 12:48:27.667437 master-0 kubenswrapper[8731]: I1205 12:48:27.667383 8731 generic.go:334] "Generic (PLEG): container finished" 
podID="3285cb4d-9385-4d55-96a7-567d01bce0d6" containerID="2b938be2f812e431241f9048d46726ec98832d7511a8c889c22fa9adcac223ca" exitCode=0 Dec 05 12:48:27.667581 master-0 kubenswrapper[8731]: I1205 12:48:27.667535 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfcdn" event={"ID":"3285cb4d-9385-4d55-96a7-567d01bce0d6","Type":"ContainerDied","Data":"2b938be2f812e431241f9048d46726ec98832d7511a8c889c22fa9adcac223ca"} Dec 05 12:48:27.667630 master-0 kubenswrapper[8731]: I1205 12:48:27.667593 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfcdn" event={"ID":"3285cb4d-9385-4d55-96a7-567d01bce0d6","Type":"ContainerStarted","Data":"34f2b4ab09473d6dcd0b09d0edf008dd7e6280048603488b5d25a0a3ba9e2ed1"} Dec 05 12:48:28.585589 master-0 kubenswrapper[8731]: I1205 12:48:28.585495 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:28.585589 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:28.585589 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:28.585589 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:28.585837 master-0 kubenswrapper[8731]: I1205 12:48:28.585622 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:28.698387 master-0 kubenswrapper[8731]: I1205 12:48:28.697799 8731 generic.go:334] "Generic (PLEG): container finished" podID="77883ef6-1901-4be4-89e8-4d6ecfe766df" containerID="fc68b79c809801da937a11bd9082b48f1f7b556a62f0b79db326ab28a391ad1d" exitCode=0 Dec 05 12:48:28.698954 master-0 kubenswrapper[8731]: I1205 12:48:28.698423 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cflj7" event={"ID":"77883ef6-1901-4be4-89e8-4d6ecfe766df","Type":"ContainerDied","Data":"fc68b79c809801da937a11bd9082b48f1f7b556a62f0b79db326ab28a391ad1d"} Dec 05 12:48:28.709199 master-0 kubenswrapper[8731]: I1205 12:48:28.709040 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfcdn" event={"ID":"3285cb4d-9385-4d55-96a7-567d01bce0d6","Type":"ContainerStarted","Data":"1ba98c3301f4ceb58183d619b5ab8dac2952ed95ab754a2856eb60287488d3d9"} Dec 05 12:48:28.711964 master-0 kubenswrapper[8731]: I1205 12:48:28.711876 8731 generic.go:334] "Generic (PLEG): container finished" podID="270f4b35-a279-4edd-9f6d-cd42d22730d4" containerID="92a45bf07d6d81267b3d5038fefdaff6a874fb344cce5ce7456109a29b4f4884" exitCode=0 Dec 05 12:48:28.711964 master-0 kubenswrapper[8731]: I1205 12:48:28.711918 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vljsk" event={"ID":"270f4b35-a279-4edd-9f6d-cd42d22730d4","Type":"ContainerDied","Data":"92a45bf07d6d81267b3d5038fefdaff6a874fb344cce5ce7456109a29b4f4884"} Dec 05 12:48:28.935031 master-0 kubenswrapper[8731]: I1205 12:48:28.934966 8731 scope.go:117] "RemoveContainer" containerID="33a709d9e47c123942b76dd36410ef83571393334a41e347b73753c3f8332654" Dec 05 12:48:29.589087 master-0 kubenswrapper[8731]: I1205 12:48:29.588914 8731 patch_prober.go:28] interesting 
pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:29.589087 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:29.589087 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:29.589087 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:29.589087 master-0 kubenswrapper[8731]: I1205 12:48:29.589012 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:29.722361 master-0 kubenswrapper[8731]: I1205 12:48:29.722270 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vljsk" event={"ID":"270f4b35-a279-4edd-9f6d-cd42d22730d4","Type":"ContainerStarted","Data":"4affe50dcf74cd705c76cabe02135c3ad2430f35a0bb076dc7cf18b90fb3669d"} Dec 05 12:48:29.724812 master-0 kubenswrapper[8731]: I1205 12:48:29.724766 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/4.log" Dec 05 12:48:29.724922 master-0 kubenswrapper[8731]: I1205 12:48:29.724885 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" event={"ID":"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8","Type":"ContainerStarted","Data":"f31f1f557e375896231e731e60b18a48878ddaf2be696f8a53d9f13550375166"} Dec 05 12:48:29.727418 master-0 kubenswrapper[8731]: I1205 12:48:29.727374 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cflj7" event={"ID":"77883ef6-1901-4be4-89e8-4d6ecfe766df","Type":"ContainerStarted","Data":"0b1e267ba3c63f7b166d59d9ed5fdc6eb120cbc2869d9b85964c0acadb8b9fd1"} Dec 05 12:48:29.729353 master-0 kubenswrapper[8731]: I1205 12:48:29.729322 8731 generic.go:334] "Generic (PLEG): container finished" podID="3285cb4d-9385-4d55-96a7-567d01bce0d6" containerID="1ba98c3301f4ceb58183d619b5ab8dac2952ed95ab754a2856eb60287488d3d9" exitCode=0 Dec 05 12:48:29.729495 master-0 kubenswrapper[8731]: I1205 12:48:29.729354 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfcdn" event={"ID":"3285cb4d-9385-4d55-96a7-567d01bce0d6","Type":"ContainerDied","Data":"1ba98c3301f4ceb58183d619b5ab8dac2952ed95ab754a2856eb60287488d3d9"} Dec 05 12:48:29.749200 master-0 kubenswrapper[8731]: I1205 12:48:29.749076 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vljsk" podStartSLOduration=4.282814826 podStartE2EDuration="6.749048476s" podCreationTimestamp="2025-12-05 12:48:23 +0000 UTC" firstStartedPulling="2025-12-05 12:48:26.658985453 +0000 UTC m=+1004.962969620" lastFinishedPulling="2025-12-05 12:48:29.125219103 +0000 UTC m=+1007.429203270" observedRunningTime="2025-12-05 12:48:29.747303998 +0000 UTC m=+1008.051288165" watchObservedRunningTime="2025-12-05 12:48:29.749048476 +0000 UTC m=+1008.053032653" Dec 05 12:48:29.812542 master-0 kubenswrapper[8731]: I1205 12:48:29.812459 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cflj7" 
podStartSLOduration=4.300564913 podStartE2EDuration="6.812422548s" podCreationTimestamp="2025-12-05 12:48:23 +0000 UTC" firstStartedPulling="2025-12-05 12:48:26.65886716 +0000 UTC m=+1004.962851327" lastFinishedPulling="2025-12-05 12:48:29.170724795 +0000 UTC m=+1007.474708962" observedRunningTime="2025-12-05 12:48:29.807879374 +0000 UTC m=+1008.111863541" watchObservedRunningTime="2025-12-05 12:48:29.812422548 +0000 UTC m=+1008.116406715" Dec 05 12:48:30.516292 master-0 kubenswrapper[8731]: I1205 12:48:30.516161 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:48:30.520051 master-0 kubenswrapper[8731]: I1205 12:48:30.520016 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:48:30.586243 master-0 kubenswrapper[8731]: I1205 12:48:30.586129 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:30.586243 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:30.586243 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:30.586243 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:30.586243 master-0 kubenswrapper[8731]: I1205 12:48:30.586239 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:30.739150 master-0 kubenswrapper[8731]: I1205 12:48:30.739072 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfcdn" event={"ID":"3285cb4d-9385-4d55-96a7-567d01bce0d6","Type":"ContainerStarted","Data":"bbce9a0f764587b69dddd3fae15162d689aff839b3a5a9256a2124c4e45788cc"} Dec 05 12:48:30.739943 master-0 kubenswrapper[8731]: I1205 12:48:30.739902 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:48:31.585271 master-0 kubenswrapper[8731]: I1205 12:48:31.585153 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:31.585271 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:31.585271 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:31.585271 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:31.585598 master-0 kubenswrapper[8731]: I1205 12:48:31.585303 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:32.584820 master-0 kubenswrapper[8731]: I1205 12:48:32.584691 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:32.584820 master-0 
kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:32.584820 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:32.584820 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:32.584820 master-0 kubenswrapper[8731]: I1205 12:48:32.584786 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:33.584353 master-0 kubenswrapper[8731]: I1205 12:48:33.584300 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:33.584353 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:33.584353 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:33.584353 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:33.584725 master-0 kubenswrapper[8731]: I1205 12:48:33.584360 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:33.789258 master-0 kubenswrapper[8731]: I1205 12:48:33.788053 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:33.789258 master-0 kubenswrapper[8731]: I1205 12:48:33.788135 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:33.796556 master-0 kubenswrapper[8731]: I1205 12:48:33.796498 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:33.796556 master-0 kubenswrapper[8731]: I1205 12:48:33.796558 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:33.842266 master-0 kubenswrapper[8731]: I1205 12:48:33.842118 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:34.372725 master-0 kubenswrapper[8731]: I1205 12:48:34.372613 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tfcdn" podStartSLOduration=5.9162209919999995 podStartE2EDuration="8.372590752s" podCreationTimestamp="2025-12-05 12:48:26 +0000 UTC" firstStartedPulling="2025-12-05 12:48:27.669092506 +0000 UTC m=+1005.973076673" lastFinishedPulling="2025-12-05 12:48:30.125462266 +0000 UTC m=+1008.429446433" observedRunningTime="2025-12-05 12:48:31.061014669 +0000 UTC m=+1009.364998846" watchObservedRunningTime="2025-12-05 12:48:34.372590752 +0000 UTC m=+1012.676574919" Dec 05 12:48:34.584565 master-0 kubenswrapper[8731]: I1205 12:48:34.584508 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:34.584565 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:34.584565 master-0 kubenswrapper[8731]: 
[+]process-running ok Dec 05 12:48:34.584565 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:34.584919 master-0 kubenswrapper[8731]: I1205 12:48:34.584581 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:34.810767 master-0 kubenswrapper[8731]: I1205 12:48:34.810683 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:34.830028 master-0 kubenswrapper[8731]: I1205 12:48:34.829957 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cflj7" podUID="77883ef6-1901-4be4-89e8-4d6ecfe766df" containerName="registry-server" probeResult="failure" output=< Dec 05 12:48:34.830028 master-0 kubenswrapper[8731]: timeout: failed to connect service ":50051" within 1s Dec 05 12:48:34.830028 master-0 kubenswrapper[8731]: > Dec 05 12:48:35.250705 master-0 kubenswrapper[8731]: I1205 12:48:35.250639 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:48:35.561817 master-0 kubenswrapper[8731]: I1205 12:48:35.561652 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 05 12:48:35.562715 master-0 kubenswrapper[8731]: I1205 12:48:35.562677 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Dec 05 12:48:35.565430 master-0 kubenswrapper[8731]: I1205 12:48:35.565367 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 12:48:35.565504 master-0 kubenswrapper[8731]: I1205 12:48:35.565464 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-74gjx" Dec 05 12:48:35.585424 master-0 kubenswrapper[8731]: I1205 12:48:35.585353 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 05 12:48:35.586104 master-0 kubenswrapper[8731]: I1205 12:48:35.586055 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:35.586104 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:35.586104 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:35.586104 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:35.586321 master-0 kubenswrapper[8731]: I1205 12:48:35.586134 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:35.699426 master-0 kubenswrapper[8731]: I1205 12:48:35.699346 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-kube-api-access\") pod \"installer-2-master-0\" (UID: \"03d7ab51-31d5-4ee7-9262-38dc86e5cb77\") " pod="openshift-kube-apiserver/installer-2-master-0" 
Dec 05 12:48:35.699426 master-0 kubenswrapper[8731]: I1205 12:48:35.699427 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"03d7ab51-31d5-4ee7-9262-38dc86e5cb77\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 05 12:48:35.699773 master-0 kubenswrapper[8731]: I1205 12:48:35.699463 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-var-lock\") pod \"installer-2-master-0\" (UID: \"03d7ab51-31d5-4ee7-9262-38dc86e5cb77\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 05 12:48:35.801916 master-0 kubenswrapper[8731]: I1205 12:48:35.800586 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-kube-api-access\") pod \"installer-2-master-0\" (UID: \"03d7ab51-31d5-4ee7-9262-38dc86e5cb77\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 05 12:48:35.801916 master-0 kubenswrapper[8731]: I1205 12:48:35.800725 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"03d7ab51-31d5-4ee7-9262-38dc86e5cb77\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 05 12:48:35.801916 master-0 kubenswrapper[8731]: I1205 12:48:35.800761 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-var-lock\") pod \"installer-2-master-0\" (UID: \"03d7ab51-31d5-4ee7-9262-38dc86e5cb77\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 05 12:48:35.801916 master-0 kubenswrapper[8731]: I1205 12:48:35.800835 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"03d7ab51-31d5-4ee7-9262-38dc86e5cb77\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 05 12:48:35.801916 master-0 kubenswrapper[8731]: I1205 12:48:35.800840 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-var-lock\") pod \"installer-2-master-0\" (UID: \"03d7ab51-31d5-4ee7-9262-38dc86e5cb77\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 05 12:48:35.819311 master-0 kubenswrapper[8731]: I1205 12:48:35.819170 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-kube-api-access\") pod \"installer-2-master-0\" (UID: \"03d7ab51-31d5-4ee7-9262-38dc86e5cb77\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 05 12:48:35.880844 master-0 kubenswrapper[8731]: I1205 12:48:35.880778 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Dec 05 12:48:36.324265 master-0 kubenswrapper[8731]: I1205 12:48:36.321432 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 05 12:48:36.584322 master-0 kubenswrapper[8731]: I1205 12:48:36.584148 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:36.584322 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:36.584322 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:36.584322 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:36.584322 master-0 kubenswrapper[8731]: I1205 12:48:36.584239 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:36.649725 master-0 kubenswrapper[8731]: I1205 12:48:36.649594 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:36.650024 master-0 kubenswrapper[8731]: I1205 12:48:36.649729 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:36.693252 master-0 kubenswrapper[8731]: I1205 12:48:36.693171 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:36.793371 master-0 kubenswrapper[8731]: I1205 12:48:36.793170 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"03d7ab51-31d5-4ee7-9262-38dc86e5cb77","Type":"ContainerStarted","Data":"b936b34c5ae006b163925aadc6ebb2148bb55b89c45f480ecf3ab65f0edcec39"} Dec 05 12:48:36.793371 master-0 kubenswrapper[8731]: I1205 12:48:36.793255 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"03d7ab51-31d5-4ee7-9262-38dc86e5cb77","Type":"ContainerStarted","Data":"98bd4438caffed0c526c52d31971c736bf48b2ab011b662b4597f37de3a58ded"} Dec 05 12:48:36.829430 master-0 kubenswrapper[8731]: I1205 12:48:36.829356 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:37.166906 master-0 kubenswrapper[8731]: I1205 12:48:37.166828 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vljsk"] Dec 05 12:48:37.167225 master-0 kubenswrapper[8731]: I1205 12:48:37.167111 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vljsk" podUID="270f4b35-a279-4edd-9f6d-cd42d22730d4" containerName="registry-server" containerID="cri-o://4affe50dcf74cd705c76cabe02135c3ad2430f35a0bb076dc7cf18b90fb3669d" gracePeriod=2 Dec 05 12:48:37.186467 master-0 kubenswrapper[8731]: I1205 12:48:37.186335 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r84v9"] Dec 05 12:48:37.188641 master-0 kubenswrapper[8731]: I1205 12:48:37.188595 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:37.193884 master-0 kubenswrapper[8731]: I1205 12:48:37.193807 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Dec 05 12:48:37.194864 master-0 kubenswrapper[8731]: I1205 12:48:37.194819 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 05 12:48:37.197800 master-0 kubenswrapper[8731]: I1205 12:48:37.197756 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kkk9n" Dec 05 12:48:37.198000 master-0 kubenswrapper[8731]: I1205 12:48:37.197982 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 05 12:48:37.198880 master-0 kubenswrapper[8731]: I1205 12:48:37.198381 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-master-0" podStartSLOduration=2.198362747 podStartE2EDuration="2.198362747s" podCreationTimestamp="2025-12-05 12:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:48:37.171457867 +0000 UTC m=+1015.475442044" watchObservedRunningTime="2025-12-05 12:48:37.198362747 +0000 UTC m=+1015.502346924" Dec 05 12:48:37.271214 master-0 kubenswrapper[8731]: I1205 12:48:37.270751 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Dec 05 12:48:37.279132 master-0 kubenswrapper[8731]: I1205 12:48:37.279076 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r84v9"] Dec 05 12:48:37.325216 master-0 kubenswrapper[8731]: I1205 12:48:37.325017 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-utilities\") pod \"community-operators-r84v9\" (UID: \"6a0d8237-edfb-46b6-ad94-8aa3048ffa18\") " pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:37.325216 master-0 kubenswrapper[8731]: I1205 12:48:37.325107 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-826gl\" (UniqueName: \"kubernetes.io/projected/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-kube-api-access-826gl\") pod \"community-operators-r84v9\" (UID: \"6a0d8237-edfb-46b6-ad94-8aa3048ffa18\") " pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:37.325216 master-0 kubenswrapper[8731]: I1205 12:48:37.325157 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-catalog-content\") pod \"community-operators-r84v9\" (UID: \"6a0d8237-edfb-46b6-ad94-8aa3048ffa18\") " pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:37.325216 master-0 kubenswrapper[8731]: I1205 12:48:37.325206 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4957e218-f580-41a9-866a-fd4f92a3c007-kube-api-access\") pod \"installer-3-master-0\" (UID: \"4957e218-f580-41a9-866a-fd4f92a3c007\") " 
pod="openshift-kube-controller-manager/installer-3-master-0" Dec 05 12:48:37.325634 master-0 kubenswrapper[8731]: I1205 12:48:37.325264 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4957e218-f580-41a9-866a-fd4f92a3c007-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"4957e218-f580-41a9-866a-fd4f92a3c007\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 05 12:48:37.325634 master-0 kubenswrapper[8731]: I1205 12:48:37.325293 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4957e218-f580-41a9-866a-fd4f92a3c007-var-lock\") pod \"installer-3-master-0\" (UID: \"4957e218-f580-41a9-866a-fd4f92a3c007\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 05 12:48:37.428400 master-0 kubenswrapper[8731]: I1205 12:48:37.428221 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4957e218-f580-41a9-866a-fd4f92a3c007-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"4957e218-f580-41a9-866a-fd4f92a3c007\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 05 12:48:37.428400 master-0 kubenswrapper[8731]: I1205 12:48:37.428267 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4957e218-f580-41a9-866a-fd4f92a3c007-var-lock\") pod \"installer-3-master-0\" (UID: \"4957e218-f580-41a9-866a-fd4f92a3c007\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 05 12:48:37.428769 master-0 kubenswrapper[8731]: I1205 12:48:37.428416 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4957e218-f580-41a9-866a-fd4f92a3c007-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"4957e218-f580-41a9-866a-fd4f92a3c007\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 05 12:48:37.428769 master-0 kubenswrapper[8731]: I1205 12:48:37.428480 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-utilities\") pod \"community-operators-r84v9\" (UID: \"6a0d8237-edfb-46b6-ad94-8aa3048ffa18\") " pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:37.428769 master-0 kubenswrapper[8731]: I1205 12:48:37.428499 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4957e218-f580-41a9-866a-fd4f92a3c007-var-lock\") pod \"installer-3-master-0\" (UID: \"4957e218-f580-41a9-866a-fd4f92a3c007\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 05 12:48:37.428932 master-0 kubenswrapper[8731]: I1205 12:48:37.428889 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-826gl\" (UniqueName: \"kubernetes.io/projected/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-kube-api-access-826gl\") pod \"community-operators-r84v9\" (UID: \"6a0d8237-edfb-46b6-ad94-8aa3048ffa18\") " pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:37.429091 master-0 kubenswrapper[8731]: I1205 12:48:37.429052 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-catalog-content\") pod \"community-operators-r84v9\" (UID: \"6a0d8237-edfb-46b6-ad94-8aa3048ffa18\") " pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:37.429151 master-0 kubenswrapper[8731]: I1205 12:48:37.429099 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4957e218-f580-41a9-866a-fd4f92a3c007-kube-api-access\") pod \"installer-3-master-0\" (UID: \"4957e218-f580-41a9-866a-fd4f92a3c007\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 05 12:48:37.429151 master-0 kubenswrapper[8731]: I1205 12:48:37.429105 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-utilities\") pod \"community-operators-r84v9\" (UID: \"6a0d8237-edfb-46b6-ad94-8aa3048ffa18\") " pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:37.429484 master-0 kubenswrapper[8731]: I1205 12:48:37.429442 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-catalog-content\") pod \"community-operators-r84v9\" (UID: \"6a0d8237-edfb-46b6-ad94-8aa3048ffa18\") " pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:37.447723 master-0 kubenswrapper[8731]: I1205 12:48:37.447643 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4957e218-f580-41a9-866a-fd4f92a3c007-kube-api-access\") pod \"installer-3-master-0\" (UID: \"4957e218-f580-41a9-866a-fd4f92a3c007\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 05 12:48:37.458653 master-0 kubenswrapper[8731]: I1205 12:48:37.455807 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-826gl\" (UniqueName: \"kubernetes.io/projected/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-kube-api-access-826gl\") pod \"community-operators-r84v9\" (UID: \"6a0d8237-edfb-46b6-ad94-8aa3048ffa18\") " pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:37.525205 master-0 kubenswrapper[8731]: I1205 12:48:37.525121 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:37.601209 master-0 kubenswrapper[8731]: I1205 12:48:37.600771 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:37.601209 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:37.601209 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:37.601209 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:37.601209 master-0 kubenswrapper[8731]: I1205 12:48:37.600998 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:37.645249 master-0 kubenswrapper[8731]: I1205 12:48:37.634277 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:37.645249 master-0 kubenswrapper[8731]: I1205 12:48:37.636393 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 05 12:48:37.735622 master-0 kubenswrapper[8731]: I1205 12:48:37.733990 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knnqs\" (UniqueName: \"kubernetes.io/projected/270f4b35-a279-4edd-9f6d-cd42d22730d4-kube-api-access-knnqs\") pod \"270f4b35-a279-4edd-9f6d-cd42d22730d4\" (UID: \"270f4b35-a279-4edd-9f6d-cd42d22730d4\") " Dec 05 12:48:37.735622 master-0 kubenswrapper[8731]: I1205 12:48:37.734211 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/270f4b35-a279-4edd-9f6d-cd42d22730d4-utilities\") pod \"270f4b35-a279-4edd-9f6d-cd42d22730d4\" (UID: \"270f4b35-a279-4edd-9f6d-cd42d22730d4\") " Dec 05 12:48:37.735622 master-0 kubenswrapper[8731]: I1205 12:48:37.734293 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/270f4b35-a279-4edd-9f6d-cd42d22730d4-catalog-content\") pod \"270f4b35-a279-4edd-9f6d-cd42d22730d4\" (UID: \"270f4b35-a279-4edd-9f6d-cd42d22730d4\") " Dec 05 12:48:37.735879 master-0 kubenswrapper[8731]: I1205 12:48:37.735787 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/270f4b35-a279-4edd-9f6d-cd42d22730d4-utilities" (OuterVolumeSpecName: "utilities") pod "270f4b35-a279-4edd-9f6d-cd42d22730d4" (UID: "270f4b35-a279-4edd-9f6d-cd42d22730d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:48:37.743251 master-0 kubenswrapper[8731]: I1205 12:48:37.739437 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/270f4b35-a279-4edd-9f6d-cd42d22730d4-kube-api-access-knnqs" (OuterVolumeSpecName: "kube-api-access-knnqs") pod "270f4b35-a279-4edd-9f6d-cd42d22730d4" (UID: "270f4b35-a279-4edd-9f6d-cd42d22730d4"). InnerVolumeSpecName "kube-api-access-knnqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:48:37.760227 master-0 kubenswrapper[8731]: I1205 12:48:37.758643 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/270f4b35-a279-4edd-9f6d-cd42d22730d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "270f4b35-a279-4edd-9f6d-cd42d22730d4" (UID: "270f4b35-a279-4edd-9f6d-cd42d22730d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:48:37.804536 master-0 kubenswrapper[8731]: I1205 12:48:37.804337 8731 generic.go:334] "Generic (PLEG): container finished" podID="270f4b35-a279-4edd-9f6d-cd42d22730d4" containerID="4affe50dcf74cd705c76cabe02135c3ad2430f35a0bb076dc7cf18b90fb3669d" exitCode=0 Dec 05 12:48:37.804536 master-0 kubenswrapper[8731]: I1205 12:48:37.804431 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vljsk" Dec 05 12:48:37.804782 master-0 kubenswrapper[8731]: I1205 12:48:37.804421 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vljsk" event={"ID":"270f4b35-a279-4edd-9f6d-cd42d22730d4","Type":"ContainerDied","Data":"4affe50dcf74cd705c76cabe02135c3ad2430f35a0bb076dc7cf18b90fb3669d"} Dec 05 12:48:37.804782 master-0 kubenswrapper[8731]: I1205 12:48:37.804601 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vljsk" event={"ID":"270f4b35-a279-4edd-9f6d-cd42d22730d4","Type":"ContainerDied","Data":"d289281125e97f9e5bec8733eaa93516e3b4640f9c10f9824b836457154aba5b"} Dec 05 12:48:37.804782 master-0 kubenswrapper[8731]: I1205 12:48:37.804631 8731 scope.go:117] "RemoveContainer" containerID="4affe50dcf74cd705c76cabe02135c3ad2430f35a0bb076dc7cf18b90fb3669d" Dec 05 12:48:37.823275 master-0 kubenswrapper[8731]: I1205 12:48:37.823090 8731 scope.go:117] "RemoveContainer" containerID="92a45bf07d6d81267b3d5038fefdaff6a874fb344cce5ce7456109a29b4f4884" Dec 05 12:48:37.836364 master-0 kubenswrapper[8731]: I1205 12:48:37.836004 8731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/270f4b35-a279-4edd-9f6d-cd42d22730d4-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:37.836364 master-0 kubenswrapper[8731]: I1205 12:48:37.836047 8731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/270f4b35-a279-4edd-9f6d-cd42d22730d4-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:37.836364 master-0 kubenswrapper[8731]: I1205 12:48:37.836062 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knnqs\" (UniqueName: \"kubernetes.io/projected/270f4b35-a279-4edd-9f6d-cd42d22730d4-kube-api-access-knnqs\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:37.846804 master-0 kubenswrapper[8731]: I1205 12:48:37.846757 8731 scope.go:117] "RemoveContainer" containerID="9d35bc057d8b64cde9727c67ea09bdd1ba0ff48b8286cc11d823030f3db0bb64" Dec 05 12:48:37.904522 master-0 kubenswrapper[8731]: I1205 12:48:37.904470 8731 scope.go:117] "RemoveContainer" containerID="4affe50dcf74cd705c76cabe02135c3ad2430f35a0bb076dc7cf18b90fb3669d" Dec 05 12:48:37.905127 master-0 kubenswrapper[8731]: E1205 12:48:37.905087 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4affe50dcf74cd705c76cabe02135c3ad2430f35a0bb076dc7cf18b90fb3669d\": container with ID starting with 4affe50dcf74cd705c76cabe02135c3ad2430f35a0bb076dc7cf18b90fb3669d not found: ID does not exist" containerID="4affe50dcf74cd705c76cabe02135c3ad2430f35a0bb076dc7cf18b90fb3669d" Dec 05 12:48:37.905214 master-0 kubenswrapper[8731]: I1205 12:48:37.905122 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4affe50dcf74cd705c76cabe02135c3ad2430f35a0bb076dc7cf18b90fb3669d"} err="failed to get container status \"4affe50dcf74cd705c76cabe02135c3ad2430f35a0bb076dc7cf18b90fb3669d\": rpc error: code = NotFound desc = could not find container \"4affe50dcf74cd705c76cabe02135c3ad2430f35a0bb076dc7cf18b90fb3669d\": container with ID starting with 4affe50dcf74cd705c76cabe02135c3ad2430f35a0bb076dc7cf18b90fb3669d not found: ID does not exist" Dec 05 12:48:37.905214 master-0 kubenswrapper[8731]: I1205 12:48:37.905143 8731 scope.go:117] 
"RemoveContainer" containerID="92a45bf07d6d81267b3d5038fefdaff6a874fb344cce5ce7456109a29b4f4884" Dec 05 12:48:37.905585 master-0 kubenswrapper[8731]: E1205 12:48:37.905527 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92a45bf07d6d81267b3d5038fefdaff6a874fb344cce5ce7456109a29b4f4884\": container with ID starting with 92a45bf07d6d81267b3d5038fefdaff6a874fb344cce5ce7456109a29b4f4884 not found: ID does not exist" containerID="92a45bf07d6d81267b3d5038fefdaff6a874fb344cce5ce7456109a29b4f4884" Dec 05 12:48:37.905638 master-0 kubenswrapper[8731]: I1205 12:48:37.905595 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92a45bf07d6d81267b3d5038fefdaff6a874fb344cce5ce7456109a29b4f4884"} err="failed to get container status \"92a45bf07d6d81267b3d5038fefdaff6a874fb344cce5ce7456109a29b4f4884\": rpc error: code = NotFound desc = could not find container \"92a45bf07d6d81267b3d5038fefdaff6a874fb344cce5ce7456109a29b4f4884\": container with ID starting with 92a45bf07d6d81267b3d5038fefdaff6a874fb344cce5ce7456109a29b4f4884 not found: ID does not exist" Dec 05 12:48:37.905638 master-0 kubenswrapper[8731]: I1205 12:48:37.905629 8731 scope.go:117] "RemoveContainer" containerID="9d35bc057d8b64cde9727c67ea09bdd1ba0ff48b8286cc11d823030f3db0bb64" Dec 05 12:48:37.906053 master-0 kubenswrapper[8731]: E1205 12:48:37.905993 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d35bc057d8b64cde9727c67ea09bdd1ba0ff48b8286cc11d823030f3db0bb64\": container with ID starting with 9d35bc057d8b64cde9727c67ea09bdd1ba0ff48b8286cc11d823030f3db0bb64 not found: ID does not exist" containerID="9d35bc057d8b64cde9727c67ea09bdd1ba0ff48b8286cc11d823030f3db0bb64" Dec 05 12:48:37.906053 master-0 kubenswrapper[8731]: I1205 12:48:37.906023 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d35bc057d8b64cde9727c67ea09bdd1ba0ff48b8286cc11d823030f3db0bb64"} err="failed to get container status \"9d35bc057d8b64cde9727c67ea09bdd1ba0ff48b8286cc11d823030f3db0bb64\": rpc error: code = NotFound desc = could not find container \"9d35bc057d8b64cde9727c67ea09bdd1ba0ff48b8286cc11d823030f3db0bb64\": container with ID starting with 9d35bc057d8b64cde9727c67ea09bdd1ba0ff48b8286cc11d823030f3db0bb64 not found: ID does not exist" Dec 05 12:48:38.584818 master-0 kubenswrapper[8731]: I1205 12:48:38.584702 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:38.584818 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:38.584818 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:38.584818 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:38.584818 master-0 kubenswrapper[8731]: I1205 12:48:38.584762 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:39.585813 master-0 kubenswrapper[8731]: I1205 12:48:39.585706 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:39.585813 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:39.585813 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:39.585813 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:39.585813 master-0 kubenswrapper[8731]: I1205 12:48:39.585809 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:40.377930 master-0 kubenswrapper[8731]: I1205 12:48:40.377838 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r84v9"] Dec 05 12:48:40.387233 master-0 kubenswrapper[8731]: I1205 12:48:40.385436 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Dec 05 12:48:40.390491 master-0 kubenswrapper[8731]: W1205 12:48:40.390418 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4957e218_f580_41a9_866a_fd4f92a3c007.slice/crio-a533643c1db6301ec0bbc4a1c931a3217a3a4fe2dc4e69292c8e2163d1c11951 WatchSource:0}: Error finding container a533643c1db6301ec0bbc4a1c931a3217a3a4fe2dc4e69292c8e2163d1c11951: Status 404 returned error can't find the container with id a533643c1db6301ec0bbc4a1c931a3217a3a4fe2dc4e69292c8e2163d1c11951 Dec 05 12:48:40.390801 master-0 kubenswrapper[8731]: I1205 12:48:40.390783 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 05 12:48:40.391073 master-0 kubenswrapper[8731]: I1205 12:48:40.391002 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-2-master-0" podUID="03d7ab51-31d5-4ee7-9262-38dc86e5cb77" containerName="installer" containerID="cri-o://b936b34c5ae006b163925aadc6ebb2148bb55b89c45f480ecf3ab65f0edcec39" gracePeriod=30 Dec 05 12:48:40.396155 master-0 kubenswrapper[8731]: I1205 12:48:40.396110 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vljsk"] Dec 05 12:48:40.401867 master-0 kubenswrapper[8731]: I1205 12:48:40.401706 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tfcdn"] Dec 05 12:48:40.401964 master-0 kubenswrapper[8731]: I1205 12:48:40.401918 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tfcdn" podUID="3285cb4d-9385-4d55-96a7-567d01bce0d6" containerName="registry-server" containerID="cri-o://bbce9a0f764587b69dddd3fae15162d689aff839b3a5a9256a2124c4e45788cc" gracePeriod=2 Dec 05 12:48:40.411402 master-0 kubenswrapper[8731]: I1205 12:48:40.411340 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vljsk"] Dec 05 12:48:40.586651 master-0 kubenswrapper[8731]: I1205 12:48:40.586609 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:40.586651 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:40.586651 master-0 kubenswrapper[8731]: 
[+]process-running ok Dec 05 12:48:40.586651 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:40.587286 master-0 kubenswrapper[8731]: I1205 12:48:40.586677 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:40.847764 master-0 kubenswrapper[8731]: I1205 12:48:40.847623 8731 generic.go:334] "Generic (PLEG): container finished" podID="3285cb4d-9385-4d55-96a7-567d01bce0d6" containerID="bbce9a0f764587b69dddd3fae15162d689aff839b3a5a9256a2124c4e45788cc" exitCode=0 Dec 05 12:48:40.847764 master-0 kubenswrapper[8731]: I1205 12:48:40.847710 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfcdn" event={"ID":"3285cb4d-9385-4d55-96a7-567d01bce0d6","Type":"ContainerDied","Data":"bbce9a0f764587b69dddd3fae15162d689aff839b3a5a9256a2124c4e45788cc"} Dec 05 12:48:40.850671 master-0 kubenswrapper[8731]: I1205 12:48:40.850608 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"4957e218-f580-41a9-866a-fd4f92a3c007","Type":"ContainerStarted","Data":"a533643c1db6301ec0bbc4a1c931a3217a3a4fe2dc4e69292c8e2163d1c11951"} Dec 05 12:48:40.853133 master-0 kubenswrapper[8731]: I1205 12:48:40.853082 8731 generic.go:334] "Generic (PLEG): container finished" podID="6a0d8237-edfb-46b6-ad94-8aa3048ffa18" containerID="8670e7d85f2a2830f8c04b62a1b670e5a5e760027f79fd6dc702643b5f01b191" exitCode=0 Dec 05 12:48:40.853133 master-0 kubenswrapper[8731]: I1205 12:48:40.853130 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r84v9" event={"ID":"6a0d8237-edfb-46b6-ad94-8aa3048ffa18","Type":"ContainerDied","Data":"8670e7d85f2a2830f8c04b62a1b670e5a5e760027f79fd6dc702643b5f01b191"} Dec 05 12:48:40.853328 master-0 kubenswrapper[8731]: I1205 12:48:40.853157 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r84v9" event={"ID":"6a0d8237-edfb-46b6-ad94-8aa3048ffa18","Type":"ContainerStarted","Data":"77ab9cf5c777a03260a75ee3fe2d32c9c434762f232d9be384eb7c622545cec2"} Dec 05 12:48:40.932710 master-0 kubenswrapper[8731]: I1205 12:48:40.932553 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:41.083723 master-0 kubenswrapper[8731]: I1205 12:48:41.083660 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3285cb4d-9385-4d55-96a7-567d01bce0d6-utilities\") pod \"3285cb4d-9385-4d55-96a7-567d01bce0d6\" (UID: \"3285cb4d-9385-4d55-96a7-567d01bce0d6\") " Dec 05 12:48:41.084000 master-0 kubenswrapper[8731]: I1205 12:48:41.083748 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3285cb4d-9385-4d55-96a7-567d01bce0d6-catalog-content\") pod \"3285cb4d-9385-4d55-96a7-567d01bce0d6\" (UID: \"3285cb4d-9385-4d55-96a7-567d01bce0d6\") " Dec 05 12:48:41.084000 master-0 kubenswrapper[8731]: I1205 12:48:41.083871 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7rhc\" (UniqueName: \"kubernetes.io/projected/3285cb4d-9385-4d55-96a7-567d01bce0d6-kube-api-access-k7rhc\") pod \"3285cb4d-9385-4d55-96a7-567d01bce0d6\" (UID: \"3285cb4d-9385-4d55-96a7-567d01bce0d6\") " Dec 05 12:48:41.084815 master-0 kubenswrapper[8731]: I1205 12:48:41.084537 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3285cb4d-9385-4d55-96a7-567d01bce0d6-utilities" (OuterVolumeSpecName: "utilities") pod "3285cb4d-9385-4d55-96a7-567d01bce0d6" (UID: "3285cb4d-9385-4d55-96a7-567d01bce0d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:48:41.086830 master-0 kubenswrapper[8731]: I1205 12:48:41.086797 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3285cb4d-9385-4d55-96a7-567d01bce0d6-kube-api-access-k7rhc" (OuterVolumeSpecName: "kube-api-access-k7rhc") pod "3285cb4d-9385-4d55-96a7-567d01bce0d6" (UID: "3285cb4d-9385-4d55-96a7-567d01bce0d6"). InnerVolumeSpecName "kube-api-access-k7rhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:48:41.132051 master-0 kubenswrapper[8731]: I1205 12:48:41.131984 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3285cb4d-9385-4d55-96a7-567d01bce0d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3285cb4d-9385-4d55-96a7-567d01bce0d6" (UID: "3285cb4d-9385-4d55-96a7-567d01bce0d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:48:41.185521 master-0 kubenswrapper[8731]: I1205 12:48:41.185405 8731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3285cb4d-9385-4d55-96a7-567d01bce0d6-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:41.185521 master-0 kubenswrapper[8731]: I1205 12:48:41.185445 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7rhc\" (UniqueName: \"kubernetes.io/projected/3285cb4d-9385-4d55-96a7-567d01bce0d6-kube-api-access-k7rhc\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:41.185521 master-0 kubenswrapper[8731]: I1205 12:48:41.185469 8731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3285cb4d-9385-4d55-96a7-567d01bce0d6-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:41.584423 master-0 kubenswrapper[8731]: I1205 12:48:41.584359 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:41.584423 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:41.584423 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:41.584423 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:41.584740 master-0 kubenswrapper[8731]: I1205 12:48:41.584428 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:41.864572 master-0 kubenswrapper[8731]: I1205 12:48:41.864415 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tfcdn" event={"ID":"3285cb4d-9385-4d55-96a7-567d01bce0d6","Type":"ContainerDied","Data":"34f2b4ab09473d6dcd0b09d0edf008dd7e6280048603488b5d25a0a3ba9e2ed1"} Dec 05 12:48:41.864572 master-0 kubenswrapper[8731]: I1205 12:48:41.864519 8731 scope.go:117] "RemoveContainer" containerID="bbce9a0f764587b69dddd3fae15162d689aff839b3a5a9256a2124c4e45788cc" Dec 05 12:48:41.865259 master-0 kubenswrapper[8731]: I1205 12:48:41.864589 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tfcdn" Dec 05 12:48:41.866462 master-0 kubenswrapper[8731]: I1205 12:48:41.866429 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"4957e218-f580-41a9-866a-fd4f92a3c007","Type":"ContainerStarted","Data":"eed2e77d9f832d089463e6b1b5c8775d3273e95a2de91d82d1ec20f52035753f"} Dec 05 12:48:41.869922 master-0 kubenswrapper[8731]: I1205 12:48:41.869879 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r84v9" event={"ID":"6a0d8237-edfb-46b6-ad94-8aa3048ffa18","Type":"ContainerStarted","Data":"54351d47dc54d56e33dbd0335b379b1a68fc268e275a555f56958b0526432a4a"} Dec 05 12:48:41.882034 master-0 kubenswrapper[8731]: I1205 12:48:41.881978 8731 scope.go:117] "RemoveContainer" containerID="1ba98c3301f4ceb58183d619b5ab8dac2952ed95ab754a2856eb60287488d3d9" Dec 05 12:48:41.929563 master-0 kubenswrapper[8731]: I1205 12:48:41.929504 8731 scope.go:117] "RemoveContainer" containerID="2b938be2f812e431241f9048d46726ec98832d7511a8c889c22fa9adcac223ca" Dec 05 12:48:41.944036 master-0 kubenswrapper[8731]: I1205 12:48:41.943982 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="270f4b35-a279-4edd-9f6d-cd42d22730d4" path="/var/lib/kubelet/pods/270f4b35-a279-4edd-9f6d-cd42d22730d4/volumes" Dec 05 12:48:42.035782 master-0 kubenswrapper[8731]: I1205 12:48:42.034736 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-master-0" podStartSLOduration=5.034713765 podStartE2EDuration="5.034713765s" podCreationTimestamp="2025-12-05 12:48:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:48:42.031215028 +0000 UTC m=+1020.335199205" watchObservedRunningTime="2025-12-05 12:48:42.034713765 +0000 UTC m=+1020.338697932" Dec 05 12:48:42.171019 master-0 kubenswrapper[8731]: I1205 12:48:42.170868 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tfcdn"] Dec 05 12:48:42.193137 master-0 kubenswrapper[8731]: I1205 12:48:42.193031 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tfcdn"] Dec 05 12:48:42.584606 master-0 kubenswrapper[8731]: I1205 12:48:42.584551 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:42.584606 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:42.584606 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:42.584606 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:42.584971 master-0 kubenswrapper[8731]: I1205 12:48:42.584629 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:42.881533 master-0 kubenswrapper[8731]: I1205 12:48:42.881382 8731 generic.go:334] "Generic (PLEG): container finished" podID="6a0d8237-edfb-46b6-ad94-8aa3048ffa18" containerID="54351d47dc54d56e33dbd0335b379b1a68fc268e275a555f56958b0526432a4a" exitCode=0 Dec 05 12:48:42.881533 
master-0 kubenswrapper[8731]: I1205 12:48:42.881451 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r84v9" event={"ID":"6a0d8237-edfb-46b6-ad94-8aa3048ffa18","Type":"ContainerDied","Data":"54351d47dc54d56e33dbd0335b379b1a68fc268e275a555f56958b0526432a4a"} Dec 05 12:48:42.934467 master-0 kubenswrapper[8731]: I1205 12:48:42.934415 8731 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="fde55ae0-2a24-4980-9ad8-db1079735b66" Dec 05 12:48:42.934759 master-0 kubenswrapper[8731]: I1205 12:48:42.934745 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="fde55ae0-2a24-4980-9ad8-db1079735b66" Dec 05 12:48:43.584277 master-0 kubenswrapper[8731]: I1205 12:48:43.584165 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:43.584277 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:43.584277 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:43.584277 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:43.584674 master-0 kubenswrapper[8731]: I1205 12:48:43.584277 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:43.826627 master-0 kubenswrapper[8731]: I1205 12:48:43.826553 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0"] Dec 05 12:48:43.831544 master-0 kubenswrapper[8731]: I1205 12:48:43.831505 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:43.871358 master-0 kubenswrapper[8731]: I1205 12:48:43.871220 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:43.942204 master-0 kubenswrapper[8731]: I1205 12:48:43.942112 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3285cb4d-9385-4d55-96a7-567d01bce0d6" path="/var/lib/kubelet/pods/3285cb4d-9385-4d55-96a7-567d01bce0d6/volumes" Dec 05 12:48:44.027495 master-0 kubenswrapper[8731]: I1205 12:48:44.027419 8731 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-0" Dec 05 12:48:44.043280 master-0 kubenswrapper[8731]: I1205 12:48:44.039944 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0"] Dec 05 12:48:44.043280 master-0 kubenswrapper[8731]: I1205 12:48:44.042202 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Dec 05 12:48:44.043280 master-0 kubenswrapper[8731]: E1205 12:48:44.042564 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3285cb4d-9385-4d55-96a7-567d01bce0d6" containerName="extract-content" Dec 05 12:48:44.043280 master-0 kubenswrapper[8731]: I1205 12:48:44.042580 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="3285cb4d-9385-4d55-96a7-567d01bce0d6" containerName="extract-content" Dec 05 12:48:44.043280 master-0 kubenswrapper[8731]: E1205 12:48:44.042601 8731 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="270f4b35-a279-4edd-9f6d-cd42d22730d4" containerName="extract-content" Dec 05 12:48:44.043280 master-0 kubenswrapper[8731]: I1205 12:48:44.042608 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="270f4b35-a279-4edd-9f6d-cd42d22730d4" containerName="extract-content" Dec 05 12:48:44.043280 master-0 kubenswrapper[8731]: E1205 12:48:44.042623 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3285cb4d-9385-4d55-96a7-567d01bce0d6" containerName="extract-utilities" Dec 05 12:48:44.043280 master-0 kubenswrapper[8731]: I1205 12:48:44.042632 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="3285cb4d-9385-4d55-96a7-567d01bce0d6" containerName="extract-utilities" Dec 05 12:48:44.043280 master-0 kubenswrapper[8731]: E1205 12:48:44.042649 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270f4b35-a279-4edd-9f6d-cd42d22730d4" containerName="registry-server" Dec 05 12:48:44.043280 master-0 kubenswrapper[8731]: I1205 12:48:44.042658 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="270f4b35-a279-4edd-9f6d-cd42d22730d4" containerName="registry-server" Dec 05 12:48:44.043280 master-0 kubenswrapper[8731]: E1205 12:48:44.042673 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="270f4b35-a279-4edd-9f6d-cd42d22730d4" containerName="extract-utilities" Dec 05 12:48:44.043280 master-0 kubenswrapper[8731]: I1205 12:48:44.042685 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="270f4b35-a279-4edd-9f6d-cd42d22730d4" containerName="extract-utilities" Dec 05 12:48:44.043280 master-0 kubenswrapper[8731]: E1205 12:48:44.042715 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3285cb4d-9385-4d55-96a7-567d01bce0d6" containerName="registry-server" Dec 05 12:48:44.043280 master-0 kubenswrapper[8731]: I1205 12:48:44.042722 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="3285cb4d-9385-4d55-96a7-567d01bce0d6" containerName="registry-server" Dec 05 12:48:44.043280 master-0 kubenswrapper[8731]: I1205 12:48:44.042945 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="3285cb4d-9385-4d55-96a7-567d01bce0d6" containerName="registry-server" Dec 05 12:48:44.043280 master-0 kubenswrapper[8731]: I1205 12:48:44.042976 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="270f4b35-a279-4edd-9f6d-cd42d22730d4" containerName="registry-server" Dec 05 12:48:44.044995 master-0 kubenswrapper[8731]: I1205 12:48:44.043570 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:48:44.053302 master-0 kubenswrapper[8731]: I1205 12:48:44.053236 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Dec 05 12:48:44.087579 master-0 kubenswrapper[8731]: I1205 12:48:44.087501 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Dec 05 12:48:44.137457 master-0 kubenswrapper[8731]: I1205 12:48:44.137366 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-var-lock\") pod \"installer-3-master-0\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:48:44.137457 master-0 kubenswrapper[8731]: I1205 12:48:44.137427 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d215811-6210-4ec2-8356-f1533dc43f65-kube-api-access\") pod \"installer-3-master-0\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:48:44.137638 master-0 kubenswrapper[8731]: I1205 12:48:44.137461 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:48:44.238933 master-0 kubenswrapper[8731]: I1205 12:48:44.238880 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-var-lock\") pod \"installer-3-master-0\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:48:44.239051 master-0 kubenswrapper[8731]: I1205 12:48:44.238949 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d215811-6210-4ec2-8356-f1533dc43f65-kube-api-access\") pod \"installer-3-master-0\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:48:44.239096 master-0 kubenswrapper[8731]: I1205 12:48:44.239058 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-var-lock\") pod \"installer-3-master-0\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:48:44.239169 master-0 kubenswrapper[8731]: I1205 12:48:44.239133 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:48:44.239483 master-0 kubenswrapper[8731]: I1205 12:48:44.239243 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-kubelet-dir\") pod \"installer-3-master-0\" (UID: 
\"4d215811-6210-4ec2-8356-f1533dc43f65\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:48:44.257573 master-0 kubenswrapper[8731]: I1205 12:48:44.257512 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d215811-6210-4ec2-8356-f1533dc43f65-kube-api-access\") pod \"installer-3-master-0\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:48:44.377208 master-0 kubenswrapper[8731]: I1205 12:48:44.377092 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:48:44.583944 master-0 kubenswrapper[8731]: I1205 12:48:44.583876 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:44.583944 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:44.583944 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:44.583944 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:44.584312 master-0 kubenswrapper[8731]: I1205 12:48:44.583958 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:44.828105 master-0 kubenswrapper[8731]: I1205 12:48:44.828046 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Dec 05 12:48:44.902198 master-0 kubenswrapper[8731]: I1205 12:48:44.902086 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"4d215811-6210-4ec2-8356-f1533dc43f65","Type":"ContainerStarted","Data":"a8ddc41afaf0c618d55e894f2ce13b792424c9105a66a883a048089812798f25"} Dec 05 12:48:44.905857 master-0 kubenswrapper[8731]: I1205 12:48:44.905772 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r84v9" event={"ID":"6a0d8237-edfb-46b6-ad94-8aa3048ffa18","Type":"ContainerStarted","Data":"eeed3559d4ed550b9bdd641ab07d48623a3826c015628bb894c9226b0eb979c9"} Dec 05 12:48:44.906304 master-0 kubenswrapper[8731]: I1205 12:48:44.906241 8731 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="fde55ae0-2a24-4980-9ad8-db1079735b66" Dec 05 12:48:44.906304 master-0 kubenswrapper[8731]: I1205 12:48:44.906292 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="fde55ae0-2a24-4980-9ad8-db1079735b66" Dec 05 12:48:44.927021 master-0 kubenswrapper[8731]: I1205 12:48:44.926905 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r84v9" podStartSLOduration=5.722259083 podStartE2EDuration="8.926814224s" podCreationTimestamp="2025-12-05 12:48:36 +0000 UTC" firstStartedPulling="2025-12-05 12:48:40.854668558 +0000 UTC m=+1019.158652725" lastFinishedPulling="2025-12-05 12:48:44.059223689 +0000 UTC m=+1022.363207866" observedRunningTime="2025-12-05 12:48:44.922912566 +0000 UTC m=+1023.226896763" watchObservedRunningTime="2025-12-05 12:48:44.926814224 +0000 UTC m=+1023.230798421" Dec 05 12:48:44.956998 master-0 kubenswrapper[8731]: I1205 
12:48:44.956879 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=0.95684407 podStartE2EDuration="956.84407ms" podCreationTimestamp="2025-12-05 12:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:48:44.949649581 +0000 UTC m=+1023.253633748" watchObservedRunningTime="2025-12-05 12:48:44.95684407 +0000 UTC m=+1023.260828237" Dec 05 12:48:45.585142 master-0 kubenswrapper[8731]: I1205 12:48:45.585015 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:45.585142 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:45.585142 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:45.585142 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:45.585142 master-0 kubenswrapper[8731]: I1205 12:48:45.585091 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:45.913894 master-0 kubenswrapper[8731]: I1205 12:48:45.913766 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"4d215811-6210-4ec2-8356-f1533dc43f65","Type":"ContainerStarted","Data":"419f6f30a7830337f1a96ed401ad15741b6815b1dc5b3d9cd59d5f9c8beb4aa8"} Dec 05 12:48:45.945083 master-0 kubenswrapper[8731]: I1205 12:48:45.944991 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-3-master-0" podStartSLOduration=2.944969418 podStartE2EDuration="2.944969418s" podCreationTimestamp="2025-12-05 12:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:48:45.942360006 +0000 UTC m=+1024.246344163" watchObservedRunningTime="2025-12-05 12:48:45.944969418 +0000 UTC m=+1024.248953595" Dec 05 12:48:46.584999 master-0 kubenswrapper[8731]: I1205 12:48:46.584919 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:46.584999 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:46.584999 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:46.584999 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:46.586259 master-0 kubenswrapper[8731]: I1205 12:48:46.585040 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:47.525875 master-0 kubenswrapper[8731]: I1205 12:48:47.525804 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:47.526138 master-0 kubenswrapper[8731]: I1205 12:48:47.525913 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:47.567740 master-0 kubenswrapper[8731]: I1205 12:48:47.567699 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:47.584533 master-0 kubenswrapper[8731]: I1205 12:48:47.584460 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:47.584533 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:47.584533 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:47.584533 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:47.584819 master-0 kubenswrapper[8731]: I1205 12:48:47.584543 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:47.863766 master-0 kubenswrapper[8731]: I1205 12:48:47.863619 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cflj7"] Dec 05 12:48:47.864380 master-0 kubenswrapper[8731]: I1205 12:48:47.863971 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cflj7" podUID="77883ef6-1901-4be4-89e8-4d6ecfe766df" containerName="registry-server" containerID="cri-o://0b1e267ba3c63f7b166d59d9ed5fdc6eb120cbc2869d9b85964c0acadb8b9fd1" gracePeriod=2 Dec 05 12:48:48.310375 master-0 kubenswrapper[8731]: I1205 12:48:48.310288 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv"] Dec 05 12:48:48.314092 master-0 kubenswrapper[8731]: I1205 12:48:48.312898 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" Dec 05 12:48:48.319996 master-0 kubenswrapper[8731]: I1205 12:48:48.317547 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-rdxkm" Dec 05 12:48:48.319996 master-0 kubenswrapper[8731]: I1205 12:48:48.317745 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 12:48:48.319996 master-0 kubenswrapper[8731]: I1205 12:48:48.319677 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:48.327224 master-0 kubenswrapper[8731]: I1205 12:48:48.326780 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv"] Dec 05 12:48:48.406507 master-0 kubenswrapper[8731]: I1205 12:48:48.405691 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77883ef6-1901-4be4-89e8-4d6ecfe766df-utilities\") pod \"77883ef6-1901-4be4-89e8-4d6ecfe766df\" (UID: \"77883ef6-1901-4be4-89e8-4d6ecfe766df\") " Dec 05 12:48:48.406507 master-0 kubenswrapper[8731]: I1205 12:48:48.405769 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77883ef6-1901-4be4-89e8-4d6ecfe766df-catalog-content\") pod \"77883ef6-1901-4be4-89e8-4d6ecfe766df\" (UID: \"77883ef6-1901-4be4-89e8-4d6ecfe766df\") " Dec 05 12:48:48.406507 master-0 kubenswrapper[8731]: I1205 12:48:48.405799 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8wng\" (UniqueName: \"kubernetes.io/projected/77883ef6-1901-4be4-89e8-4d6ecfe766df-kube-api-access-z8wng\") pod \"77883ef6-1901-4be4-89e8-4d6ecfe766df\" (UID: \"77883ef6-1901-4be4-89e8-4d6ecfe766df\") " Dec 05 12:48:48.406507 master-0 kubenswrapper[8731]: I1205 12:48:48.406113 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dtfb\" (UniqueName: \"kubernetes.io/projected/954c5c79-a96c-4c47-a4bc-024aaf4dc789-kube-api-access-7dtfb\") pod \"collect-profiles-29415645-h72bv\" (UID: \"954c5c79-a96c-4c47-a4bc-024aaf4dc789\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" Dec 05 12:48:48.406507 master-0 kubenswrapper[8731]: I1205 12:48:48.406179 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/954c5c79-a96c-4c47-a4bc-024aaf4dc789-config-volume\") pod \"collect-profiles-29415645-h72bv\" (UID: \"954c5c79-a96c-4c47-a4bc-024aaf4dc789\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" Dec 05 12:48:48.406507 master-0 kubenswrapper[8731]: I1205 12:48:48.406236 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/954c5c79-a96c-4c47-a4bc-024aaf4dc789-secret-volume\") pod \"collect-profiles-29415645-h72bv\" (UID: \"954c5c79-a96c-4c47-a4bc-024aaf4dc789\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" Dec 05 12:48:48.407127 master-0 kubenswrapper[8731]: I1205 12:48:48.407094 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77883ef6-1901-4be4-89e8-4d6ecfe766df-utilities" (OuterVolumeSpecName: "utilities") pod "77883ef6-1901-4be4-89e8-4d6ecfe766df" (UID: "77883ef6-1901-4be4-89e8-4d6ecfe766df"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:48:48.411436 master-0 kubenswrapper[8731]: I1205 12:48:48.411377 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77883ef6-1901-4be4-89e8-4d6ecfe766df-kube-api-access-z8wng" (OuterVolumeSpecName: "kube-api-access-z8wng") pod "77883ef6-1901-4be4-89e8-4d6ecfe766df" (UID: "77883ef6-1901-4be4-89e8-4d6ecfe766df"). InnerVolumeSpecName "kube-api-access-z8wng". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:48:48.507460 master-0 kubenswrapper[8731]: I1205 12:48:48.507390 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dtfb\" (UniqueName: \"kubernetes.io/projected/954c5c79-a96c-4c47-a4bc-024aaf4dc789-kube-api-access-7dtfb\") pod \"collect-profiles-29415645-h72bv\" (UID: \"954c5c79-a96c-4c47-a4bc-024aaf4dc789\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" Dec 05 12:48:48.507977 master-0 kubenswrapper[8731]: I1205 12:48:48.507704 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/954c5c79-a96c-4c47-a4bc-024aaf4dc789-config-volume\") pod \"collect-profiles-29415645-h72bv\" (UID: \"954c5c79-a96c-4c47-a4bc-024aaf4dc789\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" Dec 05 12:48:48.508179 master-0 kubenswrapper[8731]: I1205 12:48:48.508136 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/954c5c79-a96c-4c47-a4bc-024aaf4dc789-secret-volume\") pod \"collect-profiles-29415645-h72bv\" (UID: \"954c5c79-a96c-4c47-a4bc-024aaf4dc789\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" Dec 05 12:48:48.508360 master-0 kubenswrapper[8731]: I1205 12:48:48.508314 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8wng\" (UniqueName: \"kubernetes.io/projected/77883ef6-1901-4be4-89e8-4d6ecfe766df-kube-api-access-z8wng\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:48.508360 master-0 kubenswrapper[8731]: I1205 12:48:48.508343 8731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77883ef6-1901-4be4-89e8-4d6ecfe766df-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:48.508922 master-0 kubenswrapper[8731]: I1205 12:48:48.508888 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/954c5c79-a96c-4c47-a4bc-024aaf4dc789-config-volume\") pod \"collect-profiles-29415645-h72bv\" (UID: \"954c5c79-a96c-4c47-a4bc-024aaf4dc789\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" Dec 05 12:48:48.512338 master-0 kubenswrapper[8731]: I1205 12:48:48.512241 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/954c5c79-a96c-4c47-a4bc-024aaf4dc789-secret-volume\") pod \"collect-profiles-29415645-h72bv\" (UID: \"954c5c79-a96c-4c47-a4bc-024aaf4dc789\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" Dec 05 12:48:48.567319 master-0 kubenswrapper[8731]: I1205 12:48:48.563277 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77883ef6-1901-4be4-89e8-4d6ecfe766df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod 
"77883ef6-1901-4be4-89e8-4d6ecfe766df" (UID: "77883ef6-1901-4be4-89e8-4d6ecfe766df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:48:48.567319 master-0 kubenswrapper[8731]: I1205 12:48:48.553652 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-8dbbb5754-j7x5j"] Dec 05 12:48:48.567319 master-0 kubenswrapper[8731]: E1205 12:48:48.567094 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77883ef6-1901-4be4-89e8-4d6ecfe766df" containerName="extract-utilities" Dec 05 12:48:48.567319 master-0 kubenswrapper[8731]: I1205 12:48:48.567119 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="77883ef6-1901-4be4-89e8-4d6ecfe766df" containerName="extract-utilities" Dec 05 12:48:48.567319 master-0 kubenswrapper[8731]: E1205 12:48:48.567170 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77883ef6-1901-4be4-89e8-4d6ecfe766df" containerName="extract-content" Dec 05 12:48:48.568549 master-0 kubenswrapper[8731]: I1205 12:48:48.568520 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="77883ef6-1901-4be4-89e8-4d6ecfe766df" containerName="extract-content" Dec 05 12:48:48.568628 master-0 kubenswrapper[8731]: E1205 12:48:48.568599 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77883ef6-1901-4be4-89e8-4d6ecfe766df" containerName="registry-server" Dec 05 12:48:48.568628 master-0 kubenswrapper[8731]: I1205 12:48:48.568613 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="77883ef6-1901-4be4-89e8-4d6ecfe766df" containerName="registry-server" Dec 05 12:48:48.570198 master-0 kubenswrapper[8731]: I1205 12:48:48.570154 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="77883ef6-1901-4be4-89e8-4d6ecfe766df" containerName="registry-server" Dec 05 12:48:48.583149 master-0 kubenswrapper[8731]: I1205 12:48:48.583084 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" Dec 05 12:48:48.585981 master-0 kubenswrapper[8731]: I1205 12:48:48.585281 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dtfb\" (UniqueName: \"kubernetes.io/projected/954c5c79-a96c-4c47-a4bc-024aaf4dc789-kube-api-access-7dtfb\") pod \"collect-profiles-29415645-h72bv\" (UID: \"954c5c79-a96c-4c47-a4bc-024aaf4dc789\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" Dec 05 12:48:48.586898 master-0 kubenswrapper[8731]: I1205 12:48:48.586841 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-49k2k"] Dec 05 12:48:48.587304 master-0 kubenswrapper[8731]: I1205 12:48:48.587254 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-nz5rx" Dec 05 12:48:48.587725 master-0 kubenswrapper[8731]: I1205 12:48:48.587681 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:48.587725 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:48.587725 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:48.587725 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:48.587845 master-0 kubenswrapper[8731]: I1205 12:48:48.587744 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:48.588776 master-0 kubenswrapper[8731]: I1205 12:48:48.588744 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:48:48.591289 master-0 kubenswrapper[8731]: I1205 12:48:48.591256 8731 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-j79fh" Dec 05 12:48:48.597652 master-0 kubenswrapper[8731]: I1205 12:48:48.591642 8731 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Dec 05 12:48:48.598405 master-0 kubenswrapper[8731]: I1205 12:48:48.598204 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8dbbb5754-j7x5j"] Dec 05 12:48:48.609092 master-0 kubenswrapper[8731]: I1205 12:48:48.609044 8731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77883ef6-1901-4be4-89e8-4d6ecfe766df-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:48.659456 master-0 kubenswrapper[8731]: I1205 12:48:48.659394 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" Dec 05 12:48:48.713501 master-0 kubenswrapper[8731]: I1205 12:48:48.712605 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjbwh\" (UniqueName: \"kubernetes.io/projected/d3e283fe-a474-4f83-ad66-62971945060a-kube-api-access-pjbwh\") pod \"multus-admission-controller-8dbbb5754-j7x5j\" (UID: \"d3e283fe-a474-4f83-ad66-62971945060a\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" Dec 05 12:48:48.713501 master-0 kubenswrapper[8731]: I1205 12:48:48.713088 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3e283fe-a474-4f83-ad66-62971945060a-webhook-certs\") pod \"multus-admission-controller-8dbbb5754-j7x5j\" (UID: \"d3e283fe-a474-4f83-ad66-62971945060a\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" Dec 05 12:48:48.713501 master-0 kubenswrapper[8731]: I1205 12:48:48.713267 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46b72a36-ef75-4fa3-a6ec-c277b2f43140-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-49k2k\" (UID: \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\") " pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:48:48.713501 master-0 kubenswrapper[8731]: I1205 12:48:48.713330 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/46b72a36-ef75-4fa3-a6ec-c277b2f43140-ready\") pod \"cni-sysctl-allowlist-ds-49k2k\" (UID: \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\") " pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:48:48.713501 master-0 kubenswrapper[8731]: I1205 12:48:48.713406 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pr7q\" (UniqueName: \"kubernetes.io/projected/46b72a36-ef75-4fa3-a6ec-c277b2f43140-kube-api-access-9pr7q\") pod \"cni-sysctl-allowlist-ds-49k2k\" (UID: \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\") " pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:48:48.713501 master-0 kubenswrapper[8731]: I1205 12:48:48.713474 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46b72a36-ef75-4fa3-a6ec-c277b2f43140-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-49k2k\" (UID: \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\") " pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:48:48.815491 master-0 kubenswrapper[8731]: I1205 12:48:48.815422 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3e283fe-a474-4f83-ad66-62971945060a-webhook-certs\") pod \"multus-admission-controller-8dbbb5754-j7x5j\" (UID: \"d3e283fe-a474-4f83-ad66-62971945060a\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" Dec 05 12:48:48.815666 master-0 kubenswrapper[8731]: I1205 12:48:48.815508 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46b72a36-ef75-4fa3-a6ec-c277b2f43140-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-49k2k\" (UID: \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\") " 
pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:48:48.816307 master-0 kubenswrapper[8731]: I1205 12:48:48.815775 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/46b72a36-ef75-4fa3-a6ec-c277b2f43140-ready\") pod \"cni-sysctl-allowlist-ds-49k2k\" (UID: \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\") " pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:48:48.816307 master-0 kubenswrapper[8731]: I1205 12:48:48.815995 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pr7q\" (UniqueName: \"kubernetes.io/projected/46b72a36-ef75-4fa3-a6ec-c277b2f43140-kube-api-access-9pr7q\") pod \"cni-sysctl-allowlist-ds-49k2k\" (UID: \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\") " pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:48:48.816307 master-0 kubenswrapper[8731]: I1205 12:48:48.816109 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46b72a36-ef75-4fa3-a6ec-c277b2f43140-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-49k2k\" (UID: \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\") " pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:48:48.816642 master-0 kubenswrapper[8731]: I1205 12:48:48.816342 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjbwh\" (UniqueName: \"kubernetes.io/projected/d3e283fe-a474-4f83-ad66-62971945060a-kube-api-access-pjbwh\") pod \"multus-admission-controller-8dbbb5754-j7x5j\" (UID: \"d3e283fe-a474-4f83-ad66-62971945060a\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" Dec 05 12:48:48.816687 master-0 kubenswrapper[8731]: I1205 12:48:48.816641 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46b72a36-ef75-4fa3-a6ec-c277b2f43140-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-49k2k\" (UID: \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\") " pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:48:48.816852 master-0 kubenswrapper[8731]: I1205 12:48:48.816815 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46b72a36-ef75-4fa3-a6ec-c277b2f43140-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-49k2k\" (UID: \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\") " pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:48:48.817255 master-0 kubenswrapper[8731]: I1205 12:48:48.817168 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/46b72a36-ef75-4fa3-a6ec-c277b2f43140-ready\") pod \"cni-sysctl-allowlist-ds-49k2k\" (UID: \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\") " pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:48:48.819762 master-0 kubenswrapper[8731]: I1205 12:48:48.819728 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3e283fe-a474-4f83-ad66-62971945060a-webhook-certs\") pod \"multus-admission-controller-8dbbb5754-j7x5j\" (UID: \"d3e283fe-a474-4f83-ad66-62971945060a\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" Dec 05 12:48:48.871514 master-0 kubenswrapper[8731]: I1205 12:48:48.871398 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pr7q\" (UniqueName: 
\"kubernetes.io/projected/46b72a36-ef75-4fa3-a6ec-c277b2f43140-kube-api-access-9pr7q\") pod \"cni-sysctl-allowlist-ds-49k2k\" (UID: \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\") " pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:48:48.875125 master-0 kubenswrapper[8731]: I1205 12:48:48.874647 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjbwh\" (UniqueName: \"kubernetes.io/projected/d3e283fe-a474-4f83-ad66-62971945060a-kube-api-access-pjbwh\") pod \"multus-admission-controller-8dbbb5754-j7x5j\" (UID: \"d3e283fe-a474-4f83-ad66-62971945060a\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" Dec 05 12:48:48.907508 master-0 kubenswrapper[8731]: I1205 12:48:48.907407 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" Dec 05 12:48:48.929867 master-0 kubenswrapper[8731]: I1205 12:48:48.929813 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:48:48.938772 master-0 kubenswrapper[8731]: I1205 12:48:48.938692 8731 generic.go:334] "Generic (PLEG): container finished" podID="77883ef6-1901-4be4-89e8-4d6ecfe766df" containerID="0b1e267ba3c63f7b166d59d9ed5fdc6eb120cbc2869d9b85964c0acadb8b9fd1" exitCode=0 Dec 05 12:48:48.940155 master-0 kubenswrapper[8731]: I1205 12:48:48.940114 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cflj7" Dec 05 12:48:48.942423 master-0 kubenswrapper[8731]: I1205 12:48:48.942372 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cflj7" event={"ID":"77883ef6-1901-4be4-89e8-4d6ecfe766df","Type":"ContainerDied","Data":"0b1e267ba3c63f7b166d59d9ed5fdc6eb120cbc2869d9b85964c0acadb8b9fd1"} Dec 05 12:48:48.942672 master-0 kubenswrapper[8731]: I1205 12:48:48.942654 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cflj7" event={"ID":"77883ef6-1901-4be4-89e8-4d6ecfe766df","Type":"ContainerDied","Data":"e1fb3155f0a2c155a57ecfb157c92106694c7be09a5b743d900d93cd9f503437"} Dec 05 12:48:48.942768 master-0 kubenswrapper[8731]: I1205 12:48:48.942752 8731 scope.go:117] "RemoveContainer" containerID="0b1e267ba3c63f7b166d59d9ed5fdc6eb120cbc2869d9b85964c0acadb8b9fd1" Dec 05 12:48:48.962445 master-0 kubenswrapper[8731]: W1205 12:48:48.959665 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46b72a36_ef75_4fa3_a6ec_c277b2f43140.slice/crio-2dcf709812bbe55e1243ca294179f5013ad6b318697998f0c8d459ef812875a2 WatchSource:0}: Error finding container 2dcf709812bbe55e1243ca294179f5013ad6b318697998f0c8d459ef812875a2: Status 404 returned error can't find the container with id 2dcf709812bbe55e1243ca294179f5013ad6b318697998f0c8d459ef812875a2 Dec 05 12:48:48.965492 master-0 kubenswrapper[8731]: I1205 12:48:48.965441 8731 scope.go:117] "RemoveContainer" containerID="fc68b79c809801da937a11bd9082b48f1f7b556a62f0b79db326ab28a391ad1d" Dec 05 12:48:49.026268 master-0 kubenswrapper[8731]: I1205 12:48:49.026226 8731 scope.go:117] "RemoveContainer" containerID="a451942cf751c4001ecb45db03b8fc780e5782dcbaf61076816e785c286692e2" Dec 05 12:48:49.063874 master-0 kubenswrapper[8731]: I1205 12:48:49.063801 8731 scope.go:117] "RemoveContainer" containerID="0b1e267ba3c63f7b166d59d9ed5fdc6eb120cbc2869d9b85964c0acadb8b9fd1" Dec 05 
12:48:49.064669 master-0 kubenswrapper[8731]: E1205 12:48:49.064619 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b1e267ba3c63f7b166d59d9ed5fdc6eb120cbc2869d9b85964c0acadb8b9fd1\": container with ID starting with 0b1e267ba3c63f7b166d59d9ed5fdc6eb120cbc2869d9b85964c0acadb8b9fd1 not found: ID does not exist" containerID="0b1e267ba3c63f7b166d59d9ed5fdc6eb120cbc2869d9b85964c0acadb8b9fd1" Dec 05 12:48:49.064765 master-0 kubenswrapper[8731]: I1205 12:48:49.064670 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1e267ba3c63f7b166d59d9ed5fdc6eb120cbc2869d9b85964c0acadb8b9fd1"} err="failed to get container status \"0b1e267ba3c63f7b166d59d9ed5fdc6eb120cbc2869d9b85964c0acadb8b9fd1\": rpc error: code = NotFound desc = could not find container \"0b1e267ba3c63f7b166d59d9ed5fdc6eb120cbc2869d9b85964c0acadb8b9fd1\": container with ID starting with 0b1e267ba3c63f7b166d59d9ed5fdc6eb120cbc2869d9b85964c0acadb8b9fd1 not found: ID does not exist" Dec 05 12:48:49.064765 master-0 kubenswrapper[8731]: I1205 12:48:49.064704 8731 scope.go:117] "RemoveContainer" containerID="fc68b79c809801da937a11bd9082b48f1f7b556a62f0b79db326ab28a391ad1d" Dec 05 12:48:49.066733 master-0 kubenswrapper[8731]: E1205 12:48:49.066691 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc68b79c809801da937a11bd9082b48f1f7b556a62f0b79db326ab28a391ad1d\": container with ID starting with fc68b79c809801da937a11bd9082b48f1f7b556a62f0b79db326ab28a391ad1d not found: ID does not exist" containerID="fc68b79c809801da937a11bd9082b48f1f7b556a62f0b79db326ab28a391ad1d" Dec 05 12:48:49.066733 master-0 kubenswrapper[8731]: I1205 12:48:49.066721 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc68b79c809801da937a11bd9082b48f1f7b556a62f0b79db326ab28a391ad1d"} err="failed to get container status \"fc68b79c809801da937a11bd9082b48f1f7b556a62f0b79db326ab28a391ad1d\": rpc error: code = NotFound desc = could not find container \"fc68b79c809801da937a11bd9082b48f1f7b556a62f0b79db326ab28a391ad1d\": container with ID starting with fc68b79c809801da937a11bd9082b48f1f7b556a62f0b79db326ab28a391ad1d not found: ID does not exist" Dec 05 12:48:49.066733 master-0 kubenswrapper[8731]: I1205 12:48:49.066737 8731 scope.go:117] "RemoveContainer" containerID="a451942cf751c4001ecb45db03b8fc780e5782dcbaf61076816e785c286692e2" Dec 05 12:48:49.067100 master-0 kubenswrapper[8731]: E1205 12:48:49.067055 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a451942cf751c4001ecb45db03b8fc780e5782dcbaf61076816e785c286692e2\": container with ID starting with a451942cf751c4001ecb45db03b8fc780e5782dcbaf61076816e785c286692e2 not found: ID does not exist" containerID="a451942cf751c4001ecb45db03b8fc780e5782dcbaf61076816e785c286692e2" Dec 05 12:48:49.067100 master-0 kubenswrapper[8731]: I1205 12:48:49.067080 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a451942cf751c4001ecb45db03b8fc780e5782dcbaf61076816e785c286692e2"} err="failed to get container status \"a451942cf751c4001ecb45db03b8fc780e5782dcbaf61076816e785c286692e2\": rpc error: code = NotFound desc = could not find container \"a451942cf751c4001ecb45db03b8fc780e5782dcbaf61076816e785c286692e2\": container with ID starting with 
a451942cf751c4001ecb45db03b8fc780e5782dcbaf61076816e785c286692e2 not found: ID does not exist" Dec 05 12:48:49.217501 master-0 kubenswrapper[8731]: I1205 12:48:49.216623 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv"] Dec 05 12:48:49.222145 master-0 kubenswrapper[8731]: I1205 12:48:49.222055 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cflj7"] Dec 05 12:48:49.346947 master-0 kubenswrapper[8731]: I1205 12:48:49.346853 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cflj7"] Dec 05 12:48:49.423003 master-0 kubenswrapper[8731]: I1205 12:48:49.422928 8731 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8dbbb5754-j7x5j"] Dec 05 12:48:49.584903 master-0 kubenswrapper[8731]: I1205 12:48:49.584850 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:49.584903 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:49.584903 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:49.584903 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:49.585196 master-0 kubenswrapper[8731]: I1205 12:48:49.584911 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:49.948515 master-0 kubenswrapper[8731]: I1205 12:48:49.947652 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77883ef6-1901-4be4-89e8-4d6ecfe766df" path="/var/lib/kubelet/pods/77883ef6-1901-4be4-89e8-4d6ecfe766df/volumes" Dec 05 12:48:49.957620 master-0 kubenswrapper[8731]: I1205 12:48:49.957541 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" event={"ID":"46b72a36-ef75-4fa3-a6ec-c277b2f43140","Type":"ContainerStarted","Data":"bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c"} Dec 05 12:48:49.957620 master-0 kubenswrapper[8731]: I1205 12:48:49.957608 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" event={"ID":"46b72a36-ef75-4fa3-a6ec-c277b2f43140","Type":"ContainerStarted","Data":"2dcf709812bbe55e1243ca294179f5013ad6b318697998f0c8d459ef812875a2"} Dec 05 12:48:49.957899 master-0 kubenswrapper[8731]: I1205 12:48:49.957821 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:48:49.962735 master-0 kubenswrapper[8731]: I1205 12:48:49.962678 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" event={"ID":"d3e283fe-a474-4f83-ad66-62971945060a","Type":"ContainerStarted","Data":"7ad783bf372e01adf00c6e45c58b553edb8704c6a0612bb491f7869b46f9b52d"} Dec 05 12:48:49.962812 master-0 kubenswrapper[8731]: I1205 12:48:49.962790 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" 
event={"ID":"d3e283fe-a474-4f83-ad66-62971945060a","Type":"ContainerStarted","Data":"7306701b7f1e349175a899928ef136fbd77aaa68bd4675a9b0f16eeeda9ca379"} Dec 05 12:48:49.964373 master-0 kubenswrapper[8731]: I1205 12:48:49.964339 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" event={"ID":"954c5c79-a96c-4c47-a4bc-024aaf4dc789","Type":"ContainerStarted","Data":"6a64d74f0d5ef7e0f5020ef79722fa9a1cfa622ec3d5ca7d9169d099609498b7"} Dec 05 12:48:49.964447 master-0 kubenswrapper[8731]: I1205 12:48:49.964381 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" event={"ID":"954c5c79-a96c-4c47-a4bc-024aaf4dc789","Type":"ContainerStarted","Data":"c1f8d00525a746947cf993ebf0bd13cbdeabfcd9444c040d4018d1355c19f19f"} Dec 05 12:48:49.979323 master-0 kubenswrapper[8731]: I1205 12:48:49.979218 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" podStartSLOduration=1.9791598160000001 podStartE2EDuration="1.979159816s" podCreationTimestamp="2025-12-05 12:48:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:48:49.973513581 +0000 UTC m=+1028.277497768" watchObservedRunningTime="2025-12-05 12:48:49.979159816 +0000 UTC m=+1028.283143993" Dec 05 12:48:49.994646 master-0 kubenswrapper[8731]: I1205 12:48:49.994440 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" podStartSLOduration=1.994415773 podStartE2EDuration="1.994415773s" podCreationTimestamp="2025-12-05 12:48:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:48:49.989541219 +0000 UTC m=+1028.293525396" watchObservedRunningTime="2025-12-05 12:48:49.994415773 +0000 UTC m=+1028.298399940" Dec 05 12:48:50.009164 master-0 kubenswrapper[8731]: I1205 12:48:50.009071 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" podStartSLOduration=2.009046293 podStartE2EDuration="2.009046293s" podCreationTimestamp="2025-12-05 12:48:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:48:50.005512186 +0000 UTC m=+1028.309496343" watchObservedRunningTime="2025-12-05 12:48:50.009046293 +0000 UTC m=+1028.313030460" Dec 05 12:48:50.042653 master-0 kubenswrapper[8731]: I1205 12:48:50.042545 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq"] Dec 05 12:48:50.043001 master-0 kubenswrapper[8731]: I1205 12:48:50.042881 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" podUID="cfc37275-4e59-4f73-8b08-c8ca8ec28bbb" containerName="multus-admission-controller" containerID="cri-o://a6d8ffe90701aad701ac1d29ce8f42eac206024de7e62e03f130cba9a76b048e" gracePeriod=30 Dec 05 12:48:50.043086 master-0 kubenswrapper[8731]: I1205 12:48:50.043036 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" podUID="cfc37275-4e59-4f73-8b08-c8ca8ec28bbb" 
containerName="kube-rbac-proxy" containerID="cri-o://ec7cd7b19e08539b7cab80696c72c19f718ae2a85d4adde460623354d34db0e3" gracePeriod=30 Dec 05 12:48:50.584612 master-0 kubenswrapper[8731]: I1205 12:48:50.584489 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:50.584612 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:50.584612 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:50.584612 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:50.584612 master-0 kubenswrapper[8731]: I1205 12:48:50.584594 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:50.972801 master-0 kubenswrapper[8731]: I1205 12:48:50.972723 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" event={"ID":"d3e283fe-a474-4f83-ad66-62971945060a","Type":"ContainerStarted","Data":"515973c663cd824493ca1b981576a0161a6d2ecb1bc5aa4db6d64554c07e31d5"} Dec 05 12:48:50.975684 master-0 kubenswrapper[8731]: I1205 12:48:50.975648 8731 generic.go:334] "Generic (PLEG): container finished" podID="cfc37275-4e59-4f73-8b08-c8ca8ec28bbb" containerID="ec7cd7b19e08539b7cab80696c72c19f718ae2a85d4adde460623354d34db0e3" exitCode=0 Dec 05 12:48:50.976062 master-0 kubenswrapper[8731]: I1205 12:48:50.976037 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" event={"ID":"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb","Type":"ContainerDied","Data":"ec7cd7b19e08539b7cab80696c72c19f718ae2a85d4adde460623354d34db0e3"} Dec 05 12:48:50.997962 master-0 kubenswrapper[8731]: I1205 12:48:50.997886 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:48:51.586478 master-0 kubenswrapper[8731]: I1205 12:48:51.586408 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:51.586478 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:51.586478 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:51.586478 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:51.586835 master-0 kubenswrapper[8731]: I1205 12:48:51.586508 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:52.583952 master-0 kubenswrapper[8731]: I1205 12:48:52.583897 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:52.583952 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:52.583952 master-0 kubenswrapper[8731]: 
[+]process-running ok Dec 05 12:48:52.583952 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:52.584615 master-0 kubenswrapper[8731]: I1205 12:48:52.583963 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:53.028744 master-0 kubenswrapper[8731]: I1205 12:48:53.028690 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-49k2k"] Dec 05 12:48:53.029248 master-0 kubenswrapper[8731]: I1205 12:48:53.029220 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" podUID="46b72a36-ef75-4fa3-a6ec-c277b2f43140" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c" gracePeriod=30 Dec 05 12:48:53.585750 master-0 kubenswrapper[8731]: I1205 12:48:53.585631 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:53.585750 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:53.585750 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:53.585750 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:53.585750 master-0 kubenswrapper[8731]: I1205 12:48:53.585729 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:54.002809 master-0 kubenswrapper[8731]: I1205 12:48:54.002743 8731 generic.go:334] "Generic (PLEG): container finished" podID="954c5c79-a96c-4c47-a4bc-024aaf4dc789" containerID="6a64d74f0d5ef7e0f5020ef79722fa9a1cfa622ec3d5ca7d9169d099609498b7" exitCode=0 Dec 05 12:48:54.002809 master-0 kubenswrapper[8731]: I1205 12:48:54.002799 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" event={"ID":"954c5c79-a96c-4c47-a4bc-024aaf4dc789","Type":"ContainerDied","Data":"6a64d74f0d5ef7e0f5020ef79722fa9a1cfa622ec3d5ca7d9169d099609498b7"} Dec 05 12:48:54.584154 master-0 kubenswrapper[8731]: I1205 12:48:54.584069 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:54.584154 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:54.584154 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:54.584154 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:54.584154 master-0 kubenswrapper[8731]: I1205 12:48:54.584134 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:55.357008 master-0 kubenswrapper[8731]: I1205 12:48:55.356918 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" Dec 05 12:48:55.524913 master-0 kubenswrapper[8731]: I1205 12:48:55.524811 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/954c5c79-a96c-4c47-a4bc-024aaf4dc789-secret-volume\") pod \"954c5c79-a96c-4c47-a4bc-024aaf4dc789\" (UID: \"954c5c79-a96c-4c47-a4bc-024aaf4dc789\") " Dec 05 12:48:55.525161 master-0 kubenswrapper[8731]: I1205 12:48:55.525026 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dtfb\" (UniqueName: \"kubernetes.io/projected/954c5c79-a96c-4c47-a4bc-024aaf4dc789-kube-api-access-7dtfb\") pod \"954c5c79-a96c-4c47-a4bc-024aaf4dc789\" (UID: \"954c5c79-a96c-4c47-a4bc-024aaf4dc789\") " Dec 05 12:48:55.525161 master-0 kubenswrapper[8731]: I1205 12:48:55.525083 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/954c5c79-a96c-4c47-a4bc-024aaf4dc789-config-volume\") pod \"954c5c79-a96c-4c47-a4bc-024aaf4dc789\" (UID: \"954c5c79-a96c-4c47-a4bc-024aaf4dc789\") " Dec 05 12:48:55.526013 master-0 kubenswrapper[8731]: I1205 12:48:55.525928 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/954c5c79-a96c-4c47-a4bc-024aaf4dc789-config-volume" (OuterVolumeSpecName: "config-volume") pod "954c5c79-a96c-4c47-a4bc-024aaf4dc789" (UID: "954c5c79-a96c-4c47-a4bc-024aaf4dc789"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:48:55.528976 master-0 kubenswrapper[8731]: I1205 12:48:55.528856 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/954c5c79-a96c-4c47-a4bc-024aaf4dc789-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "954c5c79-a96c-4c47-a4bc-024aaf4dc789" (UID: "954c5c79-a96c-4c47-a4bc-024aaf4dc789"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:48:55.529392 master-0 kubenswrapper[8731]: I1205 12:48:55.529286 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/954c5c79-a96c-4c47-a4bc-024aaf4dc789-kube-api-access-7dtfb" (OuterVolumeSpecName: "kube-api-access-7dtfb") pod "954c5c79-a96c-4c47-a4bc-024aaf4dc789" (UID: "954c5c79-a96c-4c47-a4bc-024aaf4dc789"). InnerVolumeSpecName "kube-api-access-7dtfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:48:55.584219 master-0 kubenswrapper[8731]: I1205 12:48:55.584087 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:55.584219 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:55.584219 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:55.584219 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:55.584219 master-0 kubenswrapper[8731]: I1205 12:48:55.584169 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:55.627666 master-0 kubenswrapper[8731]: I1205 12:48:55.627591 8731 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/954c5c79-a96c-4c47-a4bc-024aaf4dc789-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:55.627666 master-0 kubenswrapper[8731]: I1205 12:48:55.627642 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dtfb\" (UniqueName: \"kubernetes.io/projected/954c5c79-a96c-4c47-a4bc-024aaf4dc789-kube-api-access-7dtfb\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:55.627666 master-0 kubenswrapper[8731]: I1205 12:48:55.627654 8731 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/954c5c79-a96c-4c47-a4bc-024aaf4dc789-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:56.016743 master-0 kubenswrapper[8731]: I1205 12:48:56.016688 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" event={"ID":"954c5c79-a96c-4c47-a4bc-024aaf4dc789","Type":"ContainerDied","Data":"c1f8d00525a746947cf993ebf0bd13cbdeabfcd9444c040d4018d1355c19f19f"} Dec 05 12:48:56.016743 master-0 kubenswrapper[8731]: I1205 12:48:56.016733 8731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1f8d00525a746947cf993ebf0bd13cbdeabfcd9444c040d4018d1355c19f19f" Dec 05 12:48:56.017006 master-0 kubenswrapper[8731]: I1205 12:48:56.016845 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" Dec 05 12:48:56.585299 master-0 kubenswrapper[8731]: I1205 12:48:56.585150 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:56.585299 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:56.585299 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:56.585299 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:56.585299 master-0 kubenswrapper[8731]: I1205 12:48:56.585292 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:57.248569 master-0 kubenswrapper[8731]: I1205 12:48:57.248498 8731 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Dec 05 12:48:57.248816 master-0 kubenswrapper[8731]: I1205 12:48:57.248782 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" containerID="cri-o://89f655c6aef093abcb807f2098d9b888059fd9fc72675b02e6864da8c65272c4" gracePeriod=30 Dec 05 12:48:57.249759 master-0 kubenswrapper[8731]: I1205 12:48:57.249563 8731 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 05 12:48:57.249953 master-0 kubenswrapper[8731]: E1205 12:48:57.249914 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 05 12:48:57.249953 master-0 kubenswrapper[8731]: I1205 12:48:57.249943 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 05 12:48:57.250038 master-0 kubenswrapper[8731]: E1205 12:48:57.249963 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 05 12:48:57.250038 master-0 kubenswrapper[8731]: I1205 12:48:57.249972 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 05 12:48:57.250038 master-0 kubenswrapper[8731]: E1205 12:48:57.249991 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 05 12:48:57.250038 master-0 kubenswrapper[8731]: I1205 12:48:57.249999 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 05 12:48:57.250038 master-0 kubenswrapper[8731]: E1205 12:48:57.250012 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="954c5c79-a96c-4c47-a4bc-024aaf4dc789" containerName="collect-profiles" Dec 05 12:48:57.250038 master-0 kubenswrapper[8731]: I1205 12:48:57.250020 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="954c5c79-a96c-4c47-a4bc-024aaf4dc789" containerName="collect-profiles" Dec 05 12:48:57.250228 master-0 kubenswrapper[8731]: I1205 12:48:57.250157 8731 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 05 12:48:57.250228 master-0 kubenswrapper[8731]: I1205 12:48:57.250203 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="954c5c79-a96c-4c47-a4bc-024aaf4dc789" containerName="collect-profiles" Dec 05 12:48:57.250228 master-0 kubenswrapper[8731]: I1205 12:48:57.250218 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 05 12:48:57.250327 master-0 kubenswrapper[8731]: I1205 12:48:57.250233 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 05 12:48:57.250389 master-0 kubenswrapper[8731]: E1205 12:48:57.250376 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 05 12:48:57.250389 master-0 kubenswrapper[8731]: I1205 12:48:57.250387 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 05 12:48:57.250582 master-0 kubenswrapper[8731]: I1205 12:48:57.250548 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e09e2af7200e6f9be469dbfd9bb1127" containerName="kube-scheduler" Dec 05 12:48:57.251669 master-0 kubenswrapper[8731]: I1205 12:48:57.251633 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:48:57.292618 master-0 kubenswrapper[8731]: I1205 12:48:57.292537 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 05 12:48:57.358397 master-0 kubenswrapper[8731]: I1205 12:48:57.358293 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:48:57.358699 master-0 kubenswrapper[8731]: I1205 12:48:57.358504 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:48:57.408778 master-0 kubenswrapper[8731]: I1205 12:48:57.408725 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:48:57.455603 master-0 kubenswrapper[8731]: I1205 12:48:57.455396 8731 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="16a0a4d4-31b5-4336-8201-b9d0ab84619e" Dec 05 12:48:57.460795 master-0 kubenswrapper[8731]: I1205 12:48:57.460709 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:48:57.460884 master-0 kubenswrapper[8731]: I1205 12:48:57.460855 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:48:57.461130 master-0 kubenswrapper[8731]: I1205 12:48:57.461014 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:48:57.461208 master-0 kubenswrapper[8731]: I1205 12:48:57.461029 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:48:57.562074 master-0 kubenswrapper[8731]: I1205 12:48:57.561932 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs\") pod \"5e09e2af7200e6f9be469dbfd9bb1127\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " Dec 05 12:48:57.562074 master-0 kubenswrapper[8731]: I1205 12:48:57.562035 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets\") pod \"5e09e2af7200e6f9be469dbfd9bb1127\" (UID: \"5e09e2af7200e6f9be469dbfd9bb1127\") " Dec 05 12:48:57.562341 master-0 kubenswrapper[8731]: I1205 12:48:57.562034 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs" (OuterVolumeSpecName: "logs") pod "5e09e2af7200e6f9be469dbfd9bb1127" (UID: "5e09e2af7200e6f9be469dbfd9bb1127"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:48:57.562341 master-0 kubenswrapper[8731]: I1205 12:48:57.562139 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets" (OuterVolumeSpecName: "secrets") pod "5e09e2af7200e6f9be469dbfd9bb1127" (UID: "5e09e2af7200e6f9be469dbfd9bb1127"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:48:57.562414 master-0 kubenswrapper[8731]: I1205 12:48:57.562383 8731 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-secrets\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:57.562414 master-0 kubenswrapper[8731]: I1205 12:48:57.562402 8731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5e09e2af7200e6f9be469dbfd9bb1127-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:57.571842 master-0 kubenswrapper[8731]: I1205 12:48:57.571784 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:57.585663 master-0 kubenswrapper[8731]: I1205 12:48:57.585601 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:57.585663 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:57.585663 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:57.585663 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:57.586310 master-0 kubenswrapper[8731]: I1205 12:48:57.585691 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:57.588285 master-0 kubenswrapper[8731]: I1205 12:48:57.588236 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:48:57.630591 master-0 kubenswrapper[8731]: I1205 12:48:57.630386 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r84v9"] Dec 05 12:48:57.943939 master-0 kubenswrapper[8731]: I1205 12:48:57.943815 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e09e2af7200e6f9be469dbfd9bb1127" path="/var/lib/kubelet/pods/5e09e2af7200e6f9be469dbfd9bb1127/volumes" Dec 05 12:48:57.944296 master-0 kubenswrapper[8731]: I1205 12:48:57.944245 8731 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Dec 05 12:48:57.961479 master-0 kubenswrapper[8731]: I1205 12:48:57.961301 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Dec 05 12:48:57.961479 master-0 kubenswrapper[8731]: I1205 12:48:57.961361 8731 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="16a0a4d4-31b5-4336-8201-b9d0ab84619e" Dec 05 12:48:57.963721 master-0 kubenswrapper[8731]: I1205 12:48:57.963652 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Dec 05 12:48:57.963721 master-0 kubenswrapper[8731]: I1205 12:48:57.963712 8731 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="16a0a4d4-31b5-4336-8201-b9d0ab84619e" Dec 05 12:48:58.032909 master-0 kubenswrapper[8731]: I1205 12:48:58.032834 8731 generic.go:334] "Generic (PLEG): container finished" podID="21de9318-06b4-42ba-8791-6d22055a04f2" containerID="a6eeacf32c540b469027d242ad82a84ffbe4f8b8381d45f48601d0197961c30d" exitCode=0 Dec 05 12:48:58.032909 master-0 kubenswrapper[8731]: I1205 12:48:58.032906 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"21de9318-06b4-42ba-8791-6d22055a04f2","Type":"ContainerDied","Data":"a6eeacf32c540b469027d242ad82a84ffbe4f8b8381d45f48601d0197961c30d"} Dec 05 12:48:58.035049 master-0 kubenswrapper[8731]: I1205 12:48:58.034765 8731 generic.go:334] "Generic (PLEG): container finished" podID="2cb8c983acca0c27a191b3f720d4b1e0" containerID="ce5bd605cc76993bca2c497ff38423a9bcba04863edec782efc7ee32483a630a" exitCode=0 Dec 05 12:48:58.035220 master-0 kubenswrapper[8731]: I1205 12:48:58.035176 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerDied","Data":"ce5bd605cc76993bca2c497ff38423a9bcba04863edec782efc7ee32483a630a"} Dec 05 12:48:58.035336 master-0 kubenswrapper[8731]: I1205 12:48:58.035225 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerStarted","Data":"b0475df4d5336da05f2cdbc3f74e49ad376be174c9b01bb8c74b713bd60e7ac6"} Dec 05 12:48:58.038208 master-0 kubenswrapper[8731]: I1205 12:48:58.038166 8731 generic.go:334] "Generic (PLEG): container finished" podID="5e09e2af7200e6f9be469dbfd9bb1127" containerID="89f655c6aef093abcb807f2098d9b888059fd9fc72675b02e6864da8c65272c4" exitCode=0 Dec 05 12:48:58.038381 master-0 kubenswrapper[8731]: I1205 12:48:58.038230 8731 scope.go:117] "RemoveContainer" 
containerID="89f655c6aef093abcb807f2098d9b888059fd9fc72675b02e6864da8c65272c4" Dec 05 12:48:58.038381 master-0 kubenswrapper[8731]: I1205 12:48:58.038266 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 05 12:48:58.038505 master-0 kubenswrapper[8731]: I1205 12:48:58.038383 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r84v9" podUID="6a0d8237-edfb-46b6-ad94-8aa3048ffa18" containerName="registry-server" containerID="cri-o://eeed3559d4ed550b9bdd641ab07d48623a3826c015628bb894c9226b0eb979c9" gracePeriod=2 Dec 05 12:48:58.063709 master-0 kubenswrapper[8731]: I1205 12:48:58.063601 8731 scope.go:117] "RemoveContainer" containerID="10cc72919e024bf622844c0acf2e547b438332d060988df1527f059047162f8c" Dec 05 12:48:58.151995 master-0 kubenswrapper[8731]: I1205 12:48:58.151925 8731 scope.go:117] "RemoveContainer" containerID="89f655c6aef093abcb807f2098d9b888059fd9fc72675b02e6864da8c65272c4" Dec 05 12:48:58.153427 master-0 kubenswrapper[8731]: E1205 12:48:58.153357 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89f655c6aef093abcb807f2098d9b888059fd9fc72675b02e6864da8c65272c4\": container with ID starting with 89f655c6aef093abcb807f2098d9b888059fd9fc72675b02e6864da8c65272c4 not found: ID does not exist" containerID="89f655c6aef093abcb807f2098d9b888059fd9fc72675b02e6864da8c65272c4" Dec 05 12:48:58.153506 master-0 kubenswrapper[8731]: I1205 12:48:58.153440 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f655c6aef093abcb807f2098d9b888059fd9fc72675b02e6864da8c65272c4"} err="failed to get container status \"89f655c6aef093abcb807f2098d9b888059fd9fc72675b02e6864da8c65272c4\": rpc error: code = NotFound desc = could not find container \"89f655c6aef093abcb807f2098d9b888059fd9fc72675b02e6864da8c65272c4\": container with ID starting with 89f655c6aef093abcb807f2098d9b888059fd9fc72675b02e6864da8c65272c4 not found: ID does not exist" Dec 05 12:48:58.153506 master-0 kubenswrapper[8731]: I1205 12:48:58.153482 8731 scope.go:117] "RemoveContainer" containerID="10cc72919e024bf622844c0acf2e547b438332d060988df1527f059047162f8c" Dec 05 12:48:58.154681 master-0 kubenswrapper[8731]: E1205 12:48:58.154644 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10cc72919e024bf622844c0acf2e547b438332d060988df1527f059047162f8c\": container with ID starting with 10cc72919e024bf622844c0acf2e547b438332d060988df1527f059047162f8c not found: ID does not exist" containerID="10cc72919e024bf622844c0acf2e547b438332d060988df1527f059047162f8c" Dec 05 12:48:58.154681 master-0 kubenswrapper[8731]: I1205 12:48:58.154673 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10cc72919e024bf622844c0acf2e547b438332d060988df1527f059047162f8c"} err="failed to get container status \"10cc72919e024bf622844c0acf2e547b438332d060988df1527f059047162f8c\": rpc error: code = NotFound desc = could not find container \"10cc72919e024bf622844c0acf2e547b438332d060988df1527f059047162f8c\": container with ID starting with 10cc72919e024bf622844c0acf2e547b438332d060988df1527f059047162f8c not found: ID does not exist" Dec 05 12:48:58.454468 master-0 kubenswrapper[8731]: I1205 12:48:58.454406 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:58.584405 master-0 kubenswrapper[8731]: I1205 12:48:58.584362 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:58.584405 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:58.584405 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:58.584405 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:58.584689 master-0 kubenswrapper[8731]: I1205 12:48:58.584420 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:58.589419 master-0 kubenswrapper[8731]: I1205 12:48:58.588773 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-utilities\") pod \"6a0d8237-edfb-46b6-ad94-8aa3048ffa18\" (UID: \"6a0d8237-edfb-46b6-ad94-8aa3048ffa18\") " Dec 05 12:48:58.589419 master-0 kubenswrapper[8731]: I1205 12:48:58.589024 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-826gl\" (UniqueName: \"kubernetes.io/projected/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-kube-api-access-826gl\") pod \"6a0d8237-edfb-46b6-ad94-8aa3048ffa18\" (UID: \"6a0d8237-edfb-46b6-ad94-8aa3048ffa18\") " Dec 05 12:48:58.589419 master-0 kubenswrapper[8731]: I1205 12:48:58.589079 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-catalog-content\") pod \"6a0d8237-edfb-46b6-ad94-8aa3048ffa18\" (UID: \"6a0d8237-edfb-46b6-ad94-8aa3048ffa18\") " Dec 05 12:48:58.591014 master-0 kubenswrapper[8731]: I1205 12:48:58.589717 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-utilities" (OuterVolumeSpecName: "utilities") pod "6a0d8237-edfb-46b6-ad94-8aa3048ffa18" (UID: "6a0d8237-edfb-46b6-ad94-8aa3048ffa18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:48:58.594381 master-0 kubenswrapper[8731]: I1205 12:48:58.594295 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-kube-api-access-826gl" (OuterVolumeSpecName: "kube-api-access-826gl") pod "6a0d8237-edfb-46b6-ad94-8aa3048ffa18" (UID: "6a0d8237-edfb-46b6-ad94-8aa3048ffa18"). InnerVolumeSpecName "kube-api-access-826gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:48:58.672584 master-0 kubenswrapper[8731]: I1205 12:48:58.672417 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a0d8237-edfb-46b6-ad94-8aa3048ffa18" (UID: "6a0d8237-edfb-46b6-ad94-8aa3048ffa18"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:48:58.691427 master-0 kubenswrapper[8731]: I1205 12:48:58.691071 8731 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:58.691427 master-0 kubenswrapper[8731]: I1205 12:48:58.691342 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-826gl\" (UniqueName: \"kubernetes.io/projected/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-kube-api-access-826gl\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:58.691427 master-0 kubenswrapper[8731]: I1205 12:48:58.691354 8731 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a0d8237-edfb-46b6-ad94-8aa3048ffa18-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:58.933291 master-0 kubenswrapper[8731]: E1205 12:48:58.933148 8731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 12:48:58.935119 master-0 kubenswrapper[8731]: E1205 12:48:58.935066 8731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 12:48:58.936875 master-0 kubenswrapper[8731]: E1205 12:48:58.936801 8731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 12:48:58.936940 master-0 kubenswrapper[8731]: E1205 12:48:58.936888 8731 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" podUID="46b72a36-ef75-4fa3-a6ec-c277b2f43140" containerName="kube-multus-additional-cni-plugins" Dec 05 12:48:59.048627 master-0 kubenswrapper[8731]: I1205 12:48:59.048435 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerStarted","Data":"2c505d1745e5c41c810aeede53577e7297a75c5a2221af8e371f406e5004dcbf"} Dec 05 12:48:59.048627 master-0 kubenswrapper[8731]: I1205 12:48:59.048518 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerStarted","Data":"ba110a7b76ad288df7047b8cf5908c2bd3487d9f6a715466f139c0f2eb3f27da"} Dec 05 12:48:59.048627 master-0 kubenswrapper[8731]: I1205 12:48:59.048533 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerStarted","Data":"b24c1b8d78045ff86297a6b78ba71b900f89c5e046061babf21a495bd9bf95d3"} Dec 05 12:48:59.049155 master-0 kubenswrapper[8731]: I1205 12:48:59.048719 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:48:59.055114 master-0 kubenswrapper[8731]: I1205 12:48:59.055016 8731 generic.go:334] "Generic (PLEG): container finished" podID="6a0d8237-edfb-46b6-ad94-8aa3048ffa18" containerID="eeed3559d4ed550b9bdd641ab07d48623a3826c015628bb894c9226b0eb979c9" exitCode=0 Dec 05 12:48:59.055114 master-0 kubenswrapper[8731]: I1205 12:48:59.055091 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r84v9" Dec 05 12:48:59.055632 master-0 kubenswrapper[8731]: I1205 12:48:59.055495 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r84v9" event={"ID":"6a0d8237-edfb-46b6-ad94-8aa3048ffa18","Type":"ContainerDied","Data":"eeed3559d4ed550b9bdd641ab07d48623a3826c015628bb894c9226b0eb979c9"} Dec 05 12:48:59.055632 master-0 kubenswrapper[8731]: I1205 12:48:59.055546 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r84v9" event={"ID":"6a0d8237-edfb-46b6-ad94-8aa3048ffa18","Type":"ContainerDied","Data":"77ab9cf5c777a03260a75ee3fe2d32c9c434762f232d9be384eb7c622545cec2"} Dec 05 12:48:59.055632 master-0 kubenswrapper[8731]: I1205 12:48:59.055569 8731 scope.go:117] "RemoveContainer" containerID="eeed3559d4ed550b9bdd641ab07d48623a3826c015628bb894c9226b0eb979c9" Dec 05 12:48:59.078703 master-0 kubenswrapper[8731]: I1205 12:48:59.078601 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.078573025 podStartE2EDuration="2.078573025s" podCreationTimestamp="2025-12-05 12:48:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:48:59.073701502 +0000 UTC m=+1037.377685689" watchObservedRunningTime="2025-12-05 12:48:59.078573025 +0000 UTC m=+1037.382557202" Dec 05 12:48:59.079885 master-0 kubenswrapper[8731]: I1205 12:48:59.079832 8731 scope.go:117] "RemoveContainer" containerID="54351d47dc54d56e33dbd0335b379b1a68fc268e275a555f56958b0526432a4a" Dec 05 12:48:59.097760 master-0 kubenswrapper[8731]: I1205 12:48:59.097664 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r84v9"] Dec 05 12:48:59.111024 master-0 kubenswrapper[8731]: I1205 12:48:59.110930 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r84v9"] Dec 05 12:48:59.123693 master-0 kubenswrapper[8731]: I1205 12:48:59.123640 8731 scope.go:117] "RemoveContainer" containerID="8670e7d85f2a2830f8c04b62a1b670e5a5e760027f79fd6dc702643b5f01b191" Dec 05 12:48:59.141882 master-0 kubenswrapper[8731]: I1205 12:48:59.141840 8731 scope.go:117] "RemoveContainer" containerID="eeed3559d4ed550b9bdd641ab07d48623a3826c015628bb894c9226b0eb979c9" Dec 05 12:48:59.142383 master-0 kubenswrapper[8731]: E1205 12:48:59.142341 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeed3559d4ed550b9bdd641ab07d48623a3826c015628bb894c9226b0eb979c9\": container with ID starting with 
eeed3559d4ed550b9bdd641ab07d48623a3826c015628bb894c9226b0eb979c9 not found: ID does not exist" containerID="eeed3559d4ed550b9bdd641ab07d48623a3826c015628bb894c9226b0eb979c9" Dec 05 12:48:59.142435 master-0 kubenswrapper[8731]: I1205 12:48:59.142388 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeed3559d4ed550b9bdd641ab07d48623a3826c015628bb894c9226b0eb979c9"} err="failed to get container status \"eeed3559d4ed550b9bdd641ab07d48623a3826c015628bb894c9226b0eb979c9\": rpc error: code = NotFound desc = could not find container \"eeed3559d4ed550b9bdd641ab07d48623a3826c015628bb894c9226b0eb979c9\": container with ID starting with eeed3559d4ed550b9bdd641ab07d48623a3826c015628bb894c9226b0eb979c9 not found: ID does not exist" Dec 05 12:48:59.142435 master-0 kubenswrapper[8731]: I1205 12:48:59.142417 8731 scope.go:117] "RemoveContainer" containerID="54351d47dc54d56e33dbd0335b379b1a68fc268e275a555f56958b0526432a4a" Dec 05 12:48:59.142805 master-0 kubenswrapper[8731]: E1205 12:48:59.142776 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54351d47dc54d56e33dbd0335b379b1a68fc268e275a555f56958b0526432a4a\": container with ID starting with 54351d47dc54d56e33dbd0335b379b1a68fc268e275a555f56958b0526432a4a not found: ID does not exist" containerID="54351d47dc54d56e33dbd0335b379b1a68fc268e275a555f56958b0526432a4a" Dec 05 12:48:59.142851 master-0 kubenswrapper[8731]: I1205 12:48:59.142812 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54351d47dc54d56e33dbd0335b379b1a68fc268e275a555f56958b0526432a4a"} err="failed to get container status \"54351d47dc54d56e33dbd0335b379b1a68fc268e275a555f56958b0526432a4a\": rpc error: code = NotFound desc = could not find container \"54351d47dc54d56e33dbd0335b379b1a68fc268e275a555f56958b0526432a4a\": container with ID starting with 54351d47dc54d56e33dbd0335b379b1a68fc268e275a555f56958b0526432a4a not found: ID does not exist" Dec 05 12:48:59.142851 master-0 kubenswrapper[8731]: I1205 12:48:59.142839 8731 scope.go:117] "RemoveContainer" containerID="8670e7d85f2a2830f8c04b62a1b670e5a5e760027f79fd6dc702643b5f01b191" Dec 05 12:48:59.143137 master-0 kubenswrapper[8731]: E1205 12:48:59.143112 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8670e7d85f2a2830f8c04b62a1b670e5a5e760027f79fd6dc702643b5f01b191\": container with ID starting with 8670e7d85f2a2830f8c04b62a1b670e5a5e760027f79fd6dc702643b5f01b191 not found: ID does not exist" containerID="8670e7d85f2a2830f8c04b62a1b670e5a5e760027f79fd6dc702643b5f01b191" Dec 05 12:48:59.143199 master-0 kubenswrapper[8731]: I1205 12:48:59.143140 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8670e7d85f2a2830f8c04b62a1b670e5a5e760027f79fd6dc702643b5f01b191"} err="failed to get container status \"8670e7d85f2a2830f8c04b62a1b670e5a5e760027f79fd6dc702643b5f01b191\": rpc error: code = NotFound desc = could not find container \"8670e7d85f2a2830f8c04b62a1b670e5a5e760027f79fd6dc702643b5f01b191\": container with ID starting with 8670e7d85f2a2830f8c04b62a1b670e5a5e760027f79fd6dc702643b5f01b191 not found: ID does not exist" Dec 05 12:48:59.302033 master-0 kubenswrapper[8731]: I1205 12:48:59.301980 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Dec 05 12:48:59.403051 master-0 kubenswrapper[8731]: I1205 12:48:59.402806 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21de9318-06b4-42ba-8791-6d22055a04f2-kubelet-dir\") pod \"21de9318-06b4-42ba-8791-6d22055a04f2\" (UID: \"21de9318-06b4-42ba-8791-6d22055a04f2\") " Dec 05 12:48:59.403051 master-0 kubenswrapper[8731]: I1205 12:48:59.402909 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/21de9318-06b4-42ba-8791-6d22055a04f2-var-lock\") pod \"21de9318-06b4-42ba-8791-6d22055a04f2\" (UID: \"21de9318-06b4-42ba-8791-6d22055a04f2\") " Dec 05 12:48:59.403051 master-0 kubenswrapper[8731]: I1205 12:48:59.403043 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21de9318-06b4-42ba-8791-6d22055a04f2-kube-api-access\") pod \"21de9318-06b4-42ba-8791-6d22055a04f2\" (UID: \"21de9318-06b4-42ba-8791-6d22055a04f2\") " Dec 05 12:48:59.403861 master-0 kubenswrapper[8731]: I1205 12:48:59.403811 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21de9318-06b4-42ba-8791-6d22055a04f2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "21de9318-06b4-42ba-8791-6d22055a04f2" (UID: "21de9318-06b4-42ba-8791-6d22055a04f2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:48:59.403919 master-0 kubenswrapper[8731]: I1205 12:48:59.403884 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21de9318-06b4-42ba-8791-6d22055a04f2-var-lock" (OuterVolumeSpecName: "var-lock") pod "21de9318-06b4-42ba-8791-6d22055a04f2" (UID: "21de9318-06b4-42ba-8791-6d22055a04f2"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:48:59.406600 master-0 kubenswrapper[8731]: I1205 12:48:59.406550 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21de9318-06b4-42ba-8791-6d22055a04f2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "21de9318-06b4-42ba-8791-6d22055a04f2" (UID: "21de9318-06b4-42ba-8791-6d22055a04f2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:48:59.505052 master-0 kubenswrapper[8731]: I1205 12:48:59.504947 8731 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21de9318-06b4-42ba-8791-6d22055a04f2-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:59.505316 master-0 kubenswrapper[8731]: I1205 12:48:59.505222 8731 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/21de9318-06b4-42ba-8791-6d22055a04f2-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:59.505316 master-0 kubenswrapper[8731]: I1205 12:48:59.505242 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21de9318-06b4-42ba-8791-6d22055a04f2-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:48:59.585706 master-0 kubenswrapper[8731]: I1205 12:48:59.585560 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:48:59.585706 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:48:59.585706 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:48:59.585706 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:48:59.585706 master-0 kubenswrapper[8731]: I1205 12:48:59.585650 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:48:59.946638 master-0 kubenswrapper[8731]: I1205 12:48:59.946546 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a0d8237-edfb-46b6-ad94-8aa3048ffa18" path="/var/lib/kubelet/pods/6a0d8237-edfb-46b6-ad94-8aa3048ffa18/volumes" Dec 05 12:49:00.065764 master-0 kubenswrapper[8731]: I1205 12:49:00.065699 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"21de9318-06b4-42ba-8791-6d22055a04f2","Type":"ContainerDied","Data":"6cb38a8f7e475b51ec4e82d15e81123c84bcaa6f22b937b869d4c561cbe1b95c"} Dec 05 12:49:00.065764 master-0 kubenswrapper[8731]: I1205 12:49:00.065766 8731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cb38a8f7e475b51ec4e82d15e81123c84bcaa6f22b937b869d4c561cbe1b95c" Dec 05 12:49:00.065764 master-0 kubenswrapper[8731]: I1205 12:49:00.065733 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Dec 05 12:49:00.585022 master-0 kubenswrapper[8731]: I1205 12:49:00.584945 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:00.585022 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:00.585022 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:00.585022 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:00.585351 master-0 kubenswrapper[8731]: I1205 12:49:00.585033 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:49:01.585895 master-0 kubenswrapper[8731]: I1205 12:49:01.585832 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:01.585895 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:01.585895 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:01.585895 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:01.586672 master-0 kubenswrapper[8731]: I1205 12:49:01.585905 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:49:02.584521 master-0 kubenswrapper[8731]: I1205 12:49:02.584467 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:02.584521 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:02.584521 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:02.584521 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:02.584521 master-0 kubenswrapper[8731]: I1205 12:49:02.584535 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:49:03.583947 master-0 kubenswrapper[8731]: I1205 12:49:03.583861 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:03.583947 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:03.583947 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:03.583947 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:03.584986 master-0 kubenswrapper[8731]: I1205 12:49:03.583985 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:49:04.584678 master-0 kubenswrapper[8731]: I1205 12:49:04.584617 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:04.584678 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:04.584678 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:04.584678 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:04.585314 master-0 kubenswrapper[8731]: I1205 12:49:04.584710 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:49:05.585482 master-0 kubenswrapper[8731]: I1205 12:49:05.585417 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:05.585482 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:05.585482 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:05.585482 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:05.586201 master-0 kubenswrapper[8731]: I1205 12:49:05.585510 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:49:06.585665 master-0 kubenswrapper[8731]: I1205 12:49:06.585559 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:06.585665 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:06.585665 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:06.585665 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:06.586522 master-0 kubenswrapper[8731]: I1205 12:49:06.585685 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:49:07.615267 master-0 kubenswrapper[8731]: I1205 12:49:07.615199 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:07.615267 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:07.615267 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:07.615267 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:07.617093 master-0 kubenswrapper[8731]: I1205 12:49:07.615273 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" 
podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:49:08.039521 master-0 kubenswrapper[8731]: I1205 12:49:08.039464 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_03d7ab51-31d5-4ee7-9262-38dc86e5cb77/installer/0.log" Dec 05 12:49:08.039773 master-0 kubenswrapper[8731]: I1205 12:49:08.039565 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Dec 05 12:49:08.124777 master-0 kubenswrapper[8731]: I1205 12:49:08.124723 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_03d7ab51-31d5-4ee7-9262-38dc86e5cb77/installer/0.log" Dec 05 12:49:08.124993 master-0 kubenswrapper[8731]: I1205 12:49:08.124794 8731 generic.go:334] "Generic (PLEG): container finished" podID="03d7ab51-31d5-4ee7-9262-38dc86e5cb77" containerID="b936b34c5ae006b163925aadc6ebb2148bb55b89c45f480ecf3ab65f0edcec39" exitCode=1 Dec 05 12:49:08.124993 master-0 kubenswrapper[8731]: I1205 12:49:08.124834 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"03d7ab51-31d5-4ee7-9262-38dc86e5cb77","Type":"ContainerDied","Data":"b936b34c5ae006b163925aadc6ebb2148bb55b89c45f480ecf3ab65f0edcec39"} Dec 05 12:49:08.124993 master-0 kubenswrapper[8731]: I1205 12:49:08.124876 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"03d7ab51-31d5-4ee7-9262-38dc86e5cb77","Type":"ContainerDied","Data":"98bd4438caffed0c526c52d31971c736bf48b2ab011b662b4597f37de3a58ded"} Dec 05 12:49:08.124993 master-0 kubenswrapper[8731]: I1205 12:49:08.124889 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Dec 05 12:49:08.125198 master-0 kubenswrapper[8731]: I1205 12:49:08.124901 8731 scope.go:117] "RemoveContainer" containerID="b936b34c5ae006b163925aadc6ebb2148bb55b89c45f480ecf3ab65f0edcec39" Dec 05 12:49:08.149274 master-0 kubenswrapper[8731]: I1205 12:49:08.149220 8731 scope.go:117] "RemoveContainer" containerID="b936b34c5ae006b163925aadc6ebb2148bb55b89c45f480ecf3ab65f0edcec39" Dec 05 12:49:08.149903 master-0 kubenswrapper[8731]: E1205 12:49:08.149835 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b936b34c5ae006b163925aadc6ebb2148bb55b89c45f480ecf3ab65f0edcec39\": container with ID starting with b936b34c5ae006b163925aadc6ebb2148bb55b89c45f480ecf3ab65f0edcec39 not found: ID does not exist" containerID="b936b34c5ae006b163925aadc6ebb2148bb55b89c45f480ecf3ab65f0edcec39" Dec 05 12:49:08.150009 master-0 kubenswrapper[8731]: I1205 12:49:08.149966 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b936b34c5ae006b163925aadc6ebb2148bb55b89c45f480ecf3ab65f0edcec39"} err="failed to get container status \"b936b34c5ae006b163925aadc6ebb2148bb55b89c45f480ecf3ab65f0edcec39\": rpc error: code = NotFound desc = could not find container \"b936b34c5ae006b163925aadc6ebb2148bb55b89c45f480ecf3ab65f0edcec39\": container with ID starting with b936b34c5ae006b163925aadc6ebb2148bb55b89c45f480ecf3ab65f0edcec39 not found: ID does not exist" Dec 05 12:49:08.215518 master-0 kubenswrapper[8731]: I1205 12:49:08.215373 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-kubelet-dir\") pod \"03d7ab51-31d5-4ee7-9262-38dc86e5cb77\" (UID: \"03d7ab51-31d5-4ee7-9262-38dc86e5cb77\") " Dec 05 12:49:08.215518 master-0 kubenswrapper[8731]: I1205 12:49:08.215478 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-kube-api-access\") pod \"03d7ab51-31d5-4ee7-9262-38dc86e5cb77\" (UID: \"03d7ab51-31d5-4ee7-9262-38dc86e5cb77\") " Dec 05 12:49:08.215518 master-0 kubenswrapper[8731]: I1205 12:49:08.215523 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-var-lock\") pod \"03d7ab51-31d5-4ee7-9262-38dc86e5cb77\" (UID: \"03d7ab51-31d5-4ee7-9262-38dc86e5cb77\") " Dec 05 12:49:08.215868 master-0 kubenswrapper[8731]: I1205 12:49:08.215565 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "03d7ab51-31d5-4ee7-9262-38dc86e5cb77" (UID: "03d7ab51-31d5-4ee7-9262-38dc86e5cb77"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:49:08.215868 master-0 kubenswrapper[8731]: I1205 12:49:08.215720 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-var-lock" (OuterVolumeSpecName: "var-lock") pod "03d7ab51-31d5-4ee7-9262-38dc86e5cb77" (UID: "03d7ab51-31d5-4ee7-9262-38dc86e5cb77"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:49:08.216015 master-0 kubenswrapper[8731]: I1205 12:49:08.215980 8731 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:08.216015 master-0 kubenswrapper[8731]: I1205 12:49:08.216009 8731 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:08.219628 master-0 kubenswrapper[8731]: I1205 12:49:08.219559 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "03d7ab51-31d5-4ee7-9262-38dc86e5cb77" (UID: "03d7ab51-31d5-4ee7-9262-38dc86e5cb77"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:49:08.319918 master-0 kubenswrapper[8731]: I1205 12:49:08.317979 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03d7ab51-31d5-4ee7-9262-38dc86e5cb77-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:08.477932 master-0 kubenswrapper[8731]: I1205 12:49:08.477846 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 05 12:49:08.487725 master-0 kubenswrapper[8731]: I1205 12:49:08.487595 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 05 12:49:08.585533 master-0 kubenswrapper[8731]: I1205 12:49:08.585296 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:08.585533 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:08.585533 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:08.585533 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:08.585533 master-0 kubenswrapper[8731]: I1205 12:49:08.585402 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:49:08.932845 master-0 kubenswrapper[8731]: E1205 12:49:08.932624 8731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 12:49:08.934132 master-0 kubenswrapper[8731]: E1205 12:49:08.934036 8731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 12:49:08.937110 master-0 kubenswrapper[8731]: E1205 12:49:08.936010 8731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 12:49:08.937110 master-0 kubenswrapper[8731]: E1205 12:49:08.936060 8731 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" podUID="46b72a36-ef75-4fa3-a6ec-c277b2f43140" containerName="kube-multus-additional-cni-plugins" Dec 05 12:49:09.591312 master-0 kubenswrapper[8731]: I1205 12:49:09.591241 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:09.591312 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:09.591312 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:09.591312 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:09.592015 master-0 kubenswrapper[8731]: I1205 12:49:09.591975 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:49:09.944374 master-0 kubenswrapper[8731]: I1205 12:49:09.944310 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03d7ab51-31d5-4ee7-9262-38dc86e5cb77" path="/var/lib/kubelet/pods/03d7ab51-31d5-4ee7-9262-38dc86e5cb77/volumes" Dec 05 12:49:10.585249 master-0 kubenswrapper[8731]: I1205 12:49:10.585189 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:10.585249 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:10.585249 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:10.585249 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:10.585758 master-0 kubenswrapper[8731]: I1205 12:49:10.585259 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:49:11.586591 master-0 kubenswrapper[8731]: I1205 12:49:11.586437 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:11.586591 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:11.586591 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:11.586591 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:11.586591 master-0 kubenswrapper[8731]: I1205 12:49:11.586570 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Dec 05 12:49:12.584373 master-0 kubenswrapper[8731]: I1205 12:49:12.584319 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:12.584373 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:12.584373 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:12.584373 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:12.584373 master-0 kubenswrapper[8731]: I1205 12:49:12.584376 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:49:13.583598 master-0 kubenswrapper[8731]: I1205 12:49:13.583500 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:13.583598 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:13.583598 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:13.583598 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:13.584453 master-0 kubenswrapper[8731]: I1205 12:49:13.583609 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:49:13.886479 master-0 kubenswrapper[8731]: I1205 12:49:13.886256 8731 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Dec 05 12:49:13.886851 master-0 kubenswrapper[8731]: I1205 12:49:13.886533 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="cluster-policy-controller" containerID="cri-o://8fbf247ef3f15fe005ee46e673fbe0b71698dcc9f2759966a03a8cd2730f623b" gracePeriod=30 Dec 05 12:49:13.886851 master-0 kubenswrapper[8731]: I1205 12:49:13.886707 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" containerID="cri-o://3287f56a58ec6df79eb961042eccb67f5309daab6cc145e4e1caa74cca9833e8" gracePeriod=30 Dec 05 12:49:13.887925 master-0 kubenswrapper[8731]: I1205 12:49:13.887627 8731 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 05 12:49:13.888073 master-0 kubenswrapper[8731]: E1205 12:49:13.888044 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="cluster-policy-controller" Dec 05 12:49:13.888073 master-0 kubenswrapper[8731]: I1205 12:49:13.888073 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="cluster-policy-controller" Dec 05 12:49:13.888294 master-0 kubenswrapper[8731]: E1205 12:49:13.888097 8731 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.888294 master-0 kubenswrapper[8731]: I1205 12:49:13.888107 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.888294 master-0 kubenswrapper[8731]: E1205 12:49:13.888127 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0d8237-edfb-46b6-ad94-8aa3048ffa18" containerName="extract-content" Dec 05 12:49:13.888294 master-0 kubenswrapper[8731]: I1205 12:49:13.888138 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0d8237-edfb-46b6-ad94-8aa3048ffa18" containerName="extract-content" Dec 05 12:49:13.888294 master-0 kubenswrapper[8731]: E1205 12:49:13.888159 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21de9318-06b4-42ba-8791-6d22055a04f2" containerName="installer" Dec 05 12:49:13.888294 master-0 kubenswrapper[8731]: I1205 12:49:13.888168 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="21de9318-06b4-42ba-8791-6d22055a04f2" containerName="installer" Dec 05 12:49:13.888294 master-0 kubenswrapper[8731]: E1205 12:49:13.888260 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.888294 master-0 kubenswrapper[8731]: I1205 12:49:13.888280 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.888294 master-0 kubenswrapper[8731]: E1205 12:49:13.888300 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d7ab51-31d5-4ee7-9262-38dc86e5cb77" containerName="installer" Dec 05 12:49:13.888654 master-0 kubenswrapper[8731]: I1205 12:49:13.888313 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d7ab51-31d5-4ee7-9262-38dc86e5cb77" containerName="installer" Dec 05 12:49:13.888654 master-0 kubenswrapper[8731]: E1205 12:49:13.888332 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.888654 master-0 kubenswrapper[8731]: I1205 12:49:13.888343 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.888654 master-0 kubenswrapper[8731]: E1205 12:49:13.888360 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.888654 master-0 kubenswrapper[8731]: I1205 12:49:13.888369 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.888654 master-0 kubenswrapper[8731]: E1205 12:49:13.888384 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0d8237-edfb-46b6-ad94-8aa3048ffa18" containerName="registry-server" Dec 05 12:49:13.888654 master-0 kubenswrapper[8731]: I1205 12:49:13.888394 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0d8237-edfb-46b6-ad94-8aa3048ffa18" containerName="registry-server" Dec 05 12:49:13.888654 master-0 kubenswrapper[8731]: E1205 12:49:13.888413 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.888654 master-0 kubenswrapper[8731]: 
I1205 12:49:13.888425 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.888654 master-0 kubenswrapper[8731]: E1205 12:49:13.888443 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.888654 master-0 kubenswrapper[8731]: I1205 12:49:13.888454 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.888654 master-0 kubenswrapper[8731]: E1205 12:49:13.888472 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0d8237-edfb-46b6-ad94-8aa3048ffa18" containerName="extract-utilities" Dec 05 12:49:13.888654 master-0 kubenswrapper[8731]: I1205 12:49:13.888484 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0d8237-edfb-46b6-ad94-8aa3048ffa18" containerName="extract-utilities" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: I1205 12:49:13.888685 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: I1205 12:49:13.888703 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: I1205 12:49:13.888713 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: I1205 12:49:13.888728 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="cluster-policy-controller" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: I1205 12:49:13.888748 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="cluster-policy-controller" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: I1205 12:49:13.888771 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: I1205 12:49:13.888790 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a0d8237-edfb-46b6-ad94-8aa3048ffa18" containerName="registry-server" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: I1205 12:49:13.888803 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="03d7ab51-31d5-4ee7-9262-38dc86e5cb77" containerName="installer" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: I1205 12:49:13.888818 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="21de9318-06b4-42ba-8791-6d22055a04f2" containerName="installer" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: I1205 12:49:13.888835 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: E1205 12:49:13.889027 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: I1205 12:49:13.889038 8731 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: E1205 12:49:13.889057 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: I1205 12:49:13.889065 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: E1205 12:49:13.889081 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="cluster-policy-controller" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: I1205 12:49:13.889091 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="cluster-policy-controller" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: I1205 12:49:13.889350 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.889357 master-0 kubenswrapper[8731]: I1205 12:49:13.889365 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.890100 master-0 kubenswrapper[8731]: I1205 12:49:13.889733 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:49:13.891382 master-0 kubenswrapper[8731]: I1205 12:49:13.890969 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:49:13.993123 master-0 kubenswrapper[8731]: I1205 12:49:13.993012 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 05 12:49:14.006718 master-0 kubenswrapper[8731]: I1205 12:49:14.006607 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ab1992e269496bc39c1df6084e6e60fd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:49:14.007006 master-0 kubenswrapper[8731]: I1205 12:49:14.006803 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ab1992e269496bc39c1df6084e6e60fd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:49:14.108307 master-0 kubenswrapper[8731]: I1205 12:49:14.107901 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ab1992e269496bc39c1df6084e6e60fd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:49:14.108307 master-0 kubenswrapper[8731]: I1205 12:49:14.107988 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ab1992e269496bc39c1df6084e6e60fd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:49:14.108307 master-0 kubenswrapper[8731]: I1205 12:49:14.108035 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ab1992e269496bc39c1df6084e6e60fd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:49:14.108307 master-0 kubenswrapper[8731]: I1205 12:49:14.108002 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ab1992e269496bc39c1df6084e6e60fd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:49:14.175290 master-0 kubenswrapper[8731]: I1205 12:49:14.175098 8731 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="3287f56a58ec6df79eb961042eccb67f5309daab6cc145e4e1caa74cca9833e8" exitCode=0 Dec 05 12:49:14.175290 master-0 kubenswrapper[8731]: I1205 12:49:14.175148 8731 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="8fbf247ef3f15fe005ee46e673fbe0b71698dcc9f2759966a03a8cd2730f623b" exitCode=0 Dec 05 12:49:14.175578 master-0 kubenswrapper[8731]: I1205 12:49:14.175279 8731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34418050489e8b48781fa5128a0548228f5bdb58f7e6a5f88226bbd7dacf7bb5" Dec 05 12:49:14.175578 master-0 kubenswrapper[8731]: I1205 12:49:14.175334 8731 scope.go:117] "RemoveContainer" containerID="18611e0e5dc9f18cd497caa4c1aa34e5a3613a1a4c04005e82d9c0e5aa492ed1" Dec 05 12:49:14.177856 master-0 kubenswrapper[8731]: I1205 12:49:14.177820 8731 generic.go:334] "Generic (PLEG): container finished" podID="4957e218-f580-41a9-866a-fd4f92a3c007" containerID="eed2e77d9f832d089463e6b1b5c8775d3273e95a2de91d82d1ec20f52035753f" exitCode=0 Dec 05 12:49:14.178008 master-0 kubenswrapper[8731]: I1205 12:49:14.177922 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"4957e218-f580-41a9-866a-fd4f92a3c007","Type":"ContainerDied","Data":"eed2e77d9f832d089463e6b1b5c8775d3273e95a2de91d82d1ec20f52035753f"} Dec 05 12:49:14.186614 master-0 kubenswrapper[8731]: I1205 12:49:14.186578 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:49:14.198154 master-0 kubenswrapper[8731]: I1205 12:49:14.198085 8731 scope.go:117] "RemoveContainer" containerID="878914476f342bbe09935d11750836541a3cd256e73418d2dbee280993c5f191" Dec 05 12:49:14.289827 master-0 kubenswrapper[8731]: I1205 12:49:14.289764 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:49:14.310717 master-0 kubenswrapper[8731]: I1205 12:49:14.310656 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud\") pod \"8b47694fcc32464ab24d09c23d6efb57\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " Dec 05 12:49:14.311224 master-0 kubenswrapper[8731]: I1205 12:49:14.310783 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host\") pod \"8b47694fcc32464ab24d09c23d6efb57\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " Dec 05 12:49:14.311224 master-0 kubenswrapper[8731]: I1205 12:49:14.310827 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs\") pod \"8b47694fcc32464ab24d09c23d6efb57\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " Dec 05 12:49:14.311224 master-0 kubenswrapper[8731]: I1205 12:49:14.310832 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "8b47694fcc32464ab24d09c23d6efb57" (UID: "8b47694fcc32464ab24d09c23d6efb57"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:49:14.311224 master-0 kubenswrapper[8731]: I1205 12:49:14.310850 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets\") pod \"8b47694fcc32464ab24d09c23d6efb57\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " Dec 05 12:49:14.311224 master-0 kubenswrapper[8731]: I1205 12:49:14.310907 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets" (OuterVolumeSpecName: "secrets") pod "8b47694fcc32464ab24d09c23d6efb57" (UID: "8b47694fcc32464ab24d09c23d6efb57"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:49:14.311224 master-0 kubenswrapper[8731]: I1205 12:49:14.310948 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs" (OuterVolumeSpecName: "logs") pod "8b47694fcc32464ab24d09c23d6efb57" (UID: "8b47694fcc32464ab24d09c23d6efb57"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:49:14.311224 master-0 kubenswrapper[8731]: I1205 12:49:14.310985 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config\") pod \"8b47694fcc32464ab24d09c23d6efb57\" (UID: \"8b47694fcc32464ab24d09c23d6efb57\") " Dec 05 12:49:14.311224 master-0 kubenswrapper[8731]: I1205 12:49:14.310938 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "8b47694fcc32464ab24d09c23d6efb57" (UID: "8b47694fcc32464ab24d09c23d6efb57"). InnerVolumeSpecName "ssl-certs-host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:49:14.311224 master-0 kubenswrapper[8731]: I1205 12:49:14.311010 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config" (OuterVolumeSpecName: "config") pod "8b47694fcc32464ab24d09c23d6efb57" (UID: "8b47694fcc32464ab24d09c23d6efb57"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:49:14.311816 master-0 kubenswrapper[8731]: I1205 12:49:14.311546 8731 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:14.311816 master-0 kubenswrapper[8731]: I1205 12:49:14.311566 8731 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:14.311816 master-0 kubenswrapper[8731]: I1205 12:49:14.311581 8731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:14.311816 master-0 kubenswrapper[8731]: I1205 12:49:14.311594 8731 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-secrets\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:14.311816 master-0 kubenswrapper[8731]: I1205 12:49:14.311604 8731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b47694fcc32464ab24d09c23d6efb57-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:14.586160 master-0 kubenswrapper[8731]: I1205 12:49:14.584821 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:14.586160 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:14.586160 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:14.586160 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:14.586160 master-0 kubenswrapper[8731]: I1205 12:49:14.584902 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:49:15.195567 master-0 kubenswrapper[8731]: I1205 12:49:15.195468 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"ab1992e269496bc39c1df6084e6e60fd","Type":"ContainerStarted","Data":"1b3283d0fac22ca78f337b1d5e3afe8d01431a02a7bb6f2fb90c61b14196aefb"} Dec 05 12:49:15.195567 master-0 kubenswrapper[8731]: I1205 12:49:15.195555 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"ab1992e269496bc39c1df6084e6e60fd","Type":"ContainerStarted","Data":"8d14f1413c8e8a2ef6cd9ab523725814ba9ff7a6021dd1c6a68ef759cfabfdf3"} Dec 05 12:49:15.195567 master-0 kubenswrapper[8731]: I1205 12:49:15.195573 8731 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"ab1992e269496bc39c1df6084e6e60fd","Type":"ContainerStarted","Data":"91dbe5959251acff62db45931eb5a5e1e4e7af9bb363ef308eee803d4237a389"} Dec 05 12:49:15.195567 master-0 kubenswrapper[8731]: I1205 12:49:15.195586 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"ab1992e269496bc39c1df6084e6e60fd","Type":"ContainerStarted","Data":"78eb0a378ee87ec426723278f27c3f8944db139eff4ee08e81e705d48c517d58"} Dec 05 12:49:15.197883 master-0 kubenswrapper[8731]: I1205 12:49:15.197838 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 05 12:49:15.568299 master-0 kubenswrapper[8731]: I1205 12:49:15.568244 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 05 12:49:15.587737 master-0 kubenswrapper[8731]: I1205 12:49:15.587595 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:15.587737 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:15.587737 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:15.587737 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:15.587737 master-0 kubenswrapper[8731]: I1205 12:49:15.587670 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:49:15.753717 master-0 kubenswrapper[8731]: I1205 12:49:15.753669 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4957e218-f580-41a9-866a-fd4f92a3c007-kubelet-dir\") pod \"4957e218-f580-41a9-866a-fd4f92a3c007\" (UID: \"4957e218-f580-41a9-866a-fd4f92a3c007\") " Dec 05 12:49:15.754142 master-0 kubenswrapper[8731]: I1205 12:49:15.753842 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4957e218-f580-41a9-866a-fd4f92a3c007-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4957e218-f580-41a9-866a-fd4f92a3c007" (UID: "4957e218-f580-41a9-866a-fd4f92a3c007"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:49:15.754287 master-0 kubenswrapper[8731]: I1205 12:49:15.754268 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4957e218-f580-41a9-866a-fd4f92a3c007-var-lock\") pod \"4957e218-f580-41a9-866a-fd4f92a3c007\" (UID: \"4957e218-f580-41a9-866a-fd4f92a3c007\") " Dec 05 12:49:15.754443 master-0 kubenswrapper[8731]: I1205 12:49:15.754428 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4957e218-f580-41a9-866a-fd4f92a3c007-kube-api-access\") pod \"4957e218-f580-41a9-866a-fd4f92a3c007\" (UID: \"4957e218-f580-41a9-866a-fd4f92a3c007\") " Dec 05 12:49:15.755468 master-0 kubenswrapper[8731]: I1205 12:49:15.754430 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4957e218-f580-41a9-866a-fd4f92a3c007-var-lock" (OuterVolumeSpecName: "var-lock") pod "4957e218-f580-41a9-866a-fd4f92a3c007" (UID: "4957e218-f580-41a9-866a-fd4f92a3c007"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:49:15.755746 master-0 kubenswrapper[8731]: I1205 12:49:15.755712 8731 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4957e218-f580-41a9-866a-fd4f92a3c007-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:15.755746 master-0 kubenswrapper[8731]: I1205 12:49:15.755740 8731 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4957e218-f580-41a9-866a-fd4f92a3c007-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:15.758496 master-0 kubenswrapper[8731]: I1205 12:49:15.758423 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4957e218-f580-41a9-866a-fd4f92a3c007-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4957e218-f580-41a9-866a-fd4f92a3c007" (UID: "4957e218-f580-41a9-866a-fd4f92a3c007"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:49:15.857358 master-0 kubenswrapper[8731]: I1205 12:49:15.857194 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4957e218-f580-41a9-866a-fd4f92a3c007-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:15.943220 master-0 kubenswrapper[8731]: I1205 12:49:15.943140 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b47694fcc32464ab24d09c23d6efb57" path="/var/lib/kubelet/pods/8b47694fcc32464ab24d09c23d6efb57/volumes" Dec 05 12:49:15.943748 master-0 kubenswrapper[8731]: I1205 12:49:15.943718 8731 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="" Dec 05 12:49:16.051683 master-0 kubenswrapper[8731]: I1205 12:49:16.051623 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Dec 05 12:49:16.051683 master-0 kubenswrapper[8731]: I1205 12:49:16.051676 8731 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="a96da0ec-36e9-4b94-8721-a73a8473d5ff" Dec 05 12:49:16.062955 master-0 kubenswrapper[8731]: I1205 12:49:16.062889 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Dec 05 12:49:16.062955 master-0 kubenswrapper[8731]: I1205 12:49:16.062942 8731 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="a96da0ec-36e9-4b94-8721-a73a8473d5ff" Dec 05 12:49:16.206637 master-0 kubenswrapper[8731]: I1205 12:49:16.206568 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"4957e218-f580-41a9-866a-fd4f92a3c007","Type":"ContainerDied","Data":"a533643c1db6301ec0bbc4a1c931a3217a3a4fe2dc4e69292c8e2163d1c11951"} Dec 05 12:49:16.206637 master-0 kubenswrapper[8731]: I1205 12:49:16.206619 8731 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a533643c1db6301ec0bbc4a1c931a3217a3a4fe2dc4e69292c8e2163d1c11951" Dec 05 12:49:16.206974 master-0 kubenswrapper[8731]: I1205 12:49:16.206705 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 05 12:49:16.212132 master-0 kubenswrapper[8731]: I1205 12:49:16.212073 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"ab1992e269496bc39c1df6084e6e60fd","Type":"ContainerStarted","Data":"0a16bc5dbf4947d3592d7a160d069d5ae407c8eecca6478799c03089401c073c"} Dec 05 12:49:16.246081 master-0 kubenswrapper[8731]: I1205 12:49:16.245979 8731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=3.2459300620000002 podStartE2EDuration="3.245930062s" podCreationTimestamp="2025-12-05 12:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:49:16.243393633 +0000 UTC m=+1054.547377800" watchObservedRunningTime="2025-12-05 12:49:16.245930062 +0000 UTC m=+1054.549914229" Dec 05 12:49:16.586017 master-0 kubenswrapper[8731]: I1205 12:49:16.585789 8731 patch_prober.go:28] interesting pod/router-default-5465c8b4db-dzlmb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 12:49:16.586017 master-0 kubenswrapper[8731]: [-]has-synced failed: reason withheld Dec 05 12:49:16.586017 master-0 kubenswrapper[8731]: [+]process-running ok Dec 05 12:49:16.586017 master-0 kubenswrapper[8731]: healthz check failed Dec 05 12:49:16.586017 master-0 kubenswrapper[8731]: I1205 12:49:16.585938 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 12:49:16.586534 master-0 kubenswrapper[8731]: I1205 12:49:16.586052 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:49:16.587192 master-0 kubenswrapper[8731]: I1205 12:49:16.587137 8731 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"b7483a678d691fbf8a3207dd7d6ed1c739a3647a4a630049897502326cc17230"} pod="openshift-ingress/router-default-5465c8b4db-dzlmb" containerMessage="Container router failed startup probe, will be restarted" Dec 05 12:49:16.587262 master-0 kubenswrapper[8731]: I1205 12:49:16.587211 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" podUID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerName="router" containerID="cri-o://b7483a678d691fbf8a3207dd7d6ed1c739a3647a4a630049897502326cc17230" gracePeriod=3600 Dec 05 12:49:18.932889 master-0 kubenswrapper[8731]: E1205 12:49:18.932790 8731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 12:49:18.934635 master-0 kubenswrapper[8731]: E1205 12:49:18.934590 8731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 12:49:18.936994 master-0 kubenswrapper[8731]: E1205 12:49:18.936909 8731 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 05 12:49:18.937088 master-0 kubenswrapper[8731]: E1205 12:49:18.937018 8731 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" podUID="46b72a36-ef75-4fa3-a6ec-c277b2f43140" containerName="kube-multus-additional-cni-plugins" Dec 05 12:49:22.438212 master-0 kubenswrapper[8731]: I1205 12:49:20.243824 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-7dfc5b745f-xlrzq_cfc37275-4e59-4f73-8b08-c8ca8ec28bbb/multus-admission-controller/0.log" Dec 05 12:49:22.438212 master-0 kubenswrapper[8731]: I1205 12:49:20.243910 8731 generic.go:334] "Generic (PLEG): container finished" podID="cfc37275-4e59-4f73-8b08-c8ca8ec28bbb" containerID="a6d8ffe90701aad701ac1d29ce8f42eac206024de7e62e03f130cba9a76b048e" exitCode=137 Dec 05 12:49:22.438212 master-0 kubenswrapper[8731]: I1205 12:49:20.243960 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" event={"ID":"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb","Type":"ContainerDied","Data":"a6d8ffe90701aad701ac1d29ce8f42eac206024de7e62e03f130cba9a76b048e"} Dec 05 12:49:22.496101 master-0 kubenswrapper[8731]: I1205 12:49:22.495940 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-7dfc5b745f-xlrzq_cfc37275-4e59-4f73-8b08-c8ca8ec28bbb/multus-admission-controller/0.log" Dec 05 12:49:22.496101 master-0 kubenswrapper[8731]: I1205 12:49:22.496073 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:49:22.580721 master-0 kubenswrapper[8731]: I1205 12:49:22.580622 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6mb6\" (UniqueName: \"kubernetes.io/projected/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-kube-api-access-z6mb6\") pod \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " Dec 05 12:49:22.580721 master-0 kubenswrapper[8731]: I1205 12:49:22.580728 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") pod \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\" (UID: \"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb\") " Dec 05 12:49:22.584146 master-0 kubenswrapper[8731]: I1205 12:49:22.584075 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "cfc37275-4e59-4f73-8b08-c8ca8ec28bbb" (UID: "cfc37275-4e59-4f73-8b08-c8ca8ec28bbb"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:49:22.585958 master-0 kubenswrapper[8731]: I1205 12:49:22.585887 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-kube-api-access-z6mb6" (OuterVolumeSpecName: "kube-api-access-z6mb6") pod "cfc37275-4e59-4f73-8b08-c8ca8ec28bbb" (UID: "cfc37275-4e59-4f73-8b08-c8ca8ec28bbb"). InnerVolumeSpecName "kube-api-access-z6mb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:49:22.682894 master-0 kubenswrapper[8731]: I1205 12:49:22.682806 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6mb6\" (UniqueName: \"kubernetes.io/projected/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-kube-api-access-z6mb6\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:22.682894 master-0 kubenswrapper[8731]: I1205 12:49:22.682871 8731 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb-webhook-certs\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:23.277216 master-0 kubenswrapper[8731]: I1205 12:49:23.277086 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-7dfc5b745f-xlrzq_cfc37275-4e59-4f73-8b08-c8ca8ec28bbb/multus-admission-controller/0.log" Dec 05 12:49:23.277520 master-0 kubenswrapper[8731]: I1205 12:49:23.277239 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" event={"ID":"cfc37275-4e59-4f73-8b08-c8ca8ec28bbb","Type":"ContainerDied","Data":"323592a10d8975a94a7a25bad1c995c5959062afe0321ce857efdc2c6ccb6ebc"} Dec 05 12:49:23.277520 master-0 kubenswrapper[8731]: I1205 12:49:23.277316 8731 scope.go:117] "RemoveContainer" containerID="ec7cd7b19e08539b7cab80696c72c19f718ae2a85d4adde460623354d34db0e3" Dec 05 12:49:23.277520 master-0 kubenswrapper[8731]: I1205 12:49:23.277323 8731 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq" Dec 05 12:49:23.299399 master-0 kubenswrapper[8731]: I1205 12:49:23.299337 8731 scope.go:117] "RemoveContainer" containerID="a6d8ffe90701aad701ac1d29ce8f42eac206024de7e62e03f130cba9a76b048e" Dec 05 12:49:24.237349 master-0 kubenswrapper[8731]: I1205 12:49:24.237274 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-49k2k_46b72a36-ef75-4fa3-a6ec-c277b2f43140/kube-multus-additional-cni-plugins/0.log" Dec 05 12:49:24.237969 master-0 kubenswrapper[8731]: I1205 12:49:24.237513 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:49:24.251250 master-0 kubenswrapper[8731]: I1205 12:49:24.251139 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq"] Dec 05 12:49:24.291370 master-0 kubenswrapper[8731]: I1205 12:49:24.291299 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:49:24.291370 master-0 kubenswrapper[8731]: I1205 12:49:24.291348 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:49:24.291370 master-0 kubenswrapper[8731]: I1205 12:49:24.291364 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:49:24.292561 master-0 kubenswrapper[8731]: I1205 12:49:24.292528 8731 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Dec 05 12:49:24.292656 master-0 kubenswrapper[8731]: I1205 12:49:24.292572 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 05 12:49:24.292964 master-0 kubenswrapper[8731]: I1205 12:49:24.292937 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:49:24.296093 master-0 kubenswrapper[8731]: I1205 12:49:24.295738 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-49k2k_46b72a36-ef75-4fa3-a6ec-c277b2f43140/kube-multus-additional-cni-plugins/0.log" Dec 05 12:49:24.296093 master-0 kubenswrapper[8731]: I1205 12:49:24.295799 8731 generic.go:334] "Generic (PLEG): container finished" podID="46b72a36-ef75-4fa3-a6ec-c277b2f43140" containerID="bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c" exitCode=137 Dec 05 12:49:24.296093 master-0 kubenswrapper[8731]: I1205 12:49:24.295837 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" event={"ID":"46b72a36-ef75-4fa3-a6ec-c277b2f43140","Type":"ContainerDied","Data":"bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c"} Dec 05 12:49:24.296093 master-0 kubenswrapper[8731]: I1205 12:49:24.295866 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" event={"ID":"46b72a36-ef75-4fa3-a6ec-c277b2f43140","Type":"ContainerDied","Data":"2dcf709812bbe55e1243ca294179f5013ad6b318697998f0c8d459ef812875a2"} Dec 05 12:49:24.296093 master-0 kubenswrapper[8731]: I1205 12:49:24.295888 8731 scope.go:117] "RemoveContainer" containerID="bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c" Dec 05 12:49:24.296093 master-0 kubenswrapper[8731]: I1205 12:49:24.296062 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-49k2k" Dec 05 12:49:24.297989 master-0 kubenswrapper[8731]: I1205 12:49:24.297953 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:49:24.300238 master-0 kubenswrapper[8731]: I1205 12:49:24.300168 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-7dfc5b745f-xlrzq"] Dec 05 12:49:24.311262 master-0 kubenswrapper[8731]: I1205 12:49:24.311169 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/46b72a36-ef75-4fa3-a6ec-c277b2f43140-ready\") pod \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\" (UID: \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\") " Dec 05 12:49:24.311486 master-0 kubenswrapper[8731]: I1205 12:49:24.311305 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pr7q\" (UniqueName: \"kubernetes.io/projected/46b72a36-ef75-4fa3-a6ec-c277b2f43140-kube-api-access-9pr7q\") pod \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\" (UID: \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\") " Dec 05 12:49:24.311486 master-0 kubenswrapper[8731]: I1205 12:49:24.311358 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46b72a36-ef75-4fa3-a6ec-c277b2f43140-tuning-conf-dir\") pod \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\" (UID: \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\") " Dec 05 12:49:24.311486 master-0 kubenswrapper[8731]: I1205 12:49:24.311421 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46b72a36-ef75-4fa3-a6ec-c277b2f43140-cni-sysctl-allowlist\") pod \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\" (UID: \"46b72a36-ef75-4fa3-a6ec-c277b2f43140\") " Dec 05 12:49:24.311601 master-0 kubenswrapper[8731]: I1205 12:49:24.311474 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46b72a36-ef75-4fa3-a6ec-c277b2f43140-ready" (OuterVolumeSpecName: "ready") pod "46b72a36-ef75-4fa3-a6ec-c277b2f43140" (UID: "46b72a36-ef75-4fa3-a6ec-c277b2f43140"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:49:24.311777 master-0 kubenswrapper[8731]: I1205 12:49:24.311734 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46b72a36-ef75-4fa3-a6ec-c277b2f43140-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "46b72a36-ef75-4fa3-a6ec-c277b2f43140" (UID: "46b72a36-ef75-4fa3-a6ec-c277b2f43140"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:49:24.311910 master-0 kubenswrapper[8731]: I1205 12:49:24.311754 8731 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/46b72a36-ef75-4fa3-a6ec-c277b2f43140-ready\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:24.312453 master-0 kubenswrapper[8731]: I1205 12:49:24.312405 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46b72a36-ef75-4fa3-a6ec-c277b2f43140-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "46b72a36-ef75-4fa3-a6ec-c277b2f43140" (UID: "46b72a36-ef75-4fa3-a6ec-c277b2f43140"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:49:24.314407 master-0 kubenswrapper[8731]: I1205 12:49:24.314352 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46b72a36-ef75-4fa3-a6ec-c277b2f43140-kube-api-access-9pr7q" (OuterVolumeSpecName: "kube-api-access-9pr7q") pod "46b72a36-ef75-4fa3-a6ec-c277b2f43140" (UID: "46b72a36-ef75-4fa3-a6ec-c277b2f43140"). InnerVolumeSpecName "kube-api-access-9pr7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:49:24.316653 master-0 kubenswrapper[8731]: I1205 12:49:24.316432 8731 scope.go:117] "RemoveContainer" containerID="bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c" Dec 05 12:49:24.317042 master-0 kubenswrapper[8731]: E1205 12:49:24.316970 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c\": container with ID starting with bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c not found: ID does not exist" containerID="bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c" Dec 05 12:49:24.317117 master-0 kubenswrapper[8731]: I1205 12:49:24.317030 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c"} err="failed to get container status \"bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c\": rpc error: code = NotFound desc = could not find container \"bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c\": container with ID starting with bb18a9428e3393b19c73d76047f493b77d248d71f18738520af523b677e9632c not found: ID does not exist" Dec 05 12:49:24.413378 master-0 kubenswrapper[8731]: I1205 12:49:24.413211 8731 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pr7q\" (UniqueName: \"kubernetes.io/projected/46b72a36-ef75-4fa3-a6ec-c277b2f43140-kube-api-access-9pr7q\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:24.413378 master-0 kubenswrapper[8731]: I1205 12:49:24.413275 8731 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46b72a36-ef75-4fa3-a6ec-c277b2f43140-tuning-conf-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:24.413378 master-0 kubenswrapper[8731]: I1205 12:49:24.413290 8731 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46b72a36-ef75-4fa3-a6ec-c277b2f43140-cni-sysctl-allowlist\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:24.907739 master-0 kubenswrapper[8731]: I1205 12:49:24.907679 8731 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-49k2k"] Dec 05 12:49:24.919544 master-0 kubenswrapper[8731]: I1205 12:49:24.919074 8731 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-49k2k"] Dec 05 12:49:25.312432 master-0 kubenswrapper[8731]: I1205 12:49:25.312360 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:49:25.944171 master-0 kubenswrapper[8731]: I1205 12:49:25.944078 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46b72a36-ef75-4fa3-a6ec-c277b2f43140" path="/var/lib/kubelet/pods/46b72a36-ef75-4fa3-a6ec-c277b2f43140/volumes" Dec 05 
12:49:25.944684 master-0 kubenswrapper[8731]: I1205 12:49:25.944669 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfc37275-4e59-4f73-8b08-c8ca8ec28bbb" path="/var/lib/kubelet/pods/cfc37275-4e59-4f73-8b08-c8ca8ec28bbb/volumes" Dec 05 12:49:34.290931 master-0 kubenswrapper[8731]: I1205 12:49:34.290829 8731 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Dec 05 12:49:34.290931 master-0 kubenswrapper[8731]: I1205 12:49:34.290914 8731 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 05 12:49:40.427542 master-0 kubenswrapper[8731]: I1205 12:49:40.427453 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/4.log" Dec 05 12:49:40.428286 master-0 kubenswrapper[8731]: I1205 12:49:40.428235 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/3.log" Dec 05 12:49:40.428945 master-0 kubenswrapper[8731]: I1205 12:49:40.428895 8731 generic.go:334] "Generic (PLEG): container finished" podID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" containerID="a62572546062b2df435bc85f27bda94544b75d65580e59f21beaef134a43b821" exitCode=1 Dec 05 12:49:40.428990 master-0 kubenswrapper[8731]: I1205 12:49:40.428962 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" event={"ID":"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7","Type":"ContainerDied","Data":"a62572546062b2df435bc85f27bda94544b75d65580e59f21beaef134a43b821"} Dec 05 12:49:40.429022 master-0 kubenswrapper[8731]: I1205 12:49:40.429007 8731 scope.go:117] "RemoveContainer" containerID="9a04a03647acf84231bb505b6acd2c588670eb0bd70e0221386d9b53a3261e61" Dec 05 12:49:40.429870 master-0 kubenswrapper[8731]: I1205 12:49:40.429830 8731 scope.go:117] "RemoveContainer" containerID="a62572546062b2df435bc85f27bda94544b75d65580e59f21beaef134a43b821" Dec 05 12:49:40.430295 master-0 kubenswrapper[8731]: E1205 12:49:40.430260 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-7xrk6_openshift-ingress-operator(a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7)\"" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" podUID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" Dec 05 12:49:41.437445 master-0 kubenswrapper[8731]: I1205 12:49:41.437383 8731 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/4.log" Dec 05 12:49:44.297183 master-0 kubenswrapper[8731]: I1205 12:49:44.297111 8731 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
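The startup-probe traffic above (repeated `Get "https://192.168.32.10:10257/healthz": dial tcp 192.168.32.10:10257: connect: connection refused` at 12:49:24 and 12:49:34, then `probe="startup" status="started"` at 12:49:44) comes from a startup probe declared on the kube-controller-manager static pod. The log shows only the probe URL and its outcomes, not the probe's thresholds or period; the roughly ten-second spacing of the messages suggests a ten-second period, but that is an inference. The sketch below is therefore only illustrative: it builds a probe of the same shape with the Kubernetes Go API types, and every numeric value in it is an assumption rather than something read from this log.

    // Illustrative sketch only: a startup probe shaped like the one the kubelet
    // is reporting on above (HTTPS GET against the controller-manager /healthz
    // endpoint on port 10257). All numeric settings are assumptions for the
    // example, not values taken from this log.
    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    	"k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
    	startup := &corev1.Probe{
    		ProbeHandler: corev1.ProbeHandler{
    			HTTPGet: &corev1.HTTPGetAction{
    				Path:   "/healthz",
    				Port:   intstr.FromInt(10257), // port seen in the probe URL above
    				Scheme: corev1.URISchemeHTTPS,
    			},
    		},
    		// Assumed numbers: while the probe keeps returning "connection
    		// refused", the kubelet does not restart the container until
    		// FailureThreshold consecutive failures have accumulated.
    		InitialDelaySeconds: 10,
    		PeriodSeconds:       10,
    		TimeoutSeconds:      5,
    		FailureThreshold:    18,
    	}

    	container := corev1.Container{
    		Name:         "kube-controller-manager",
    		StartupProbe: startup,
    	}
    	fmt.Printf("%s startup probe: %+v\n", container.Name, container.StartupProbe.HTTPGet)
    }

A probe declared this way produces exactly the kind of alternation visible here: failure lines while the endpoint is not yet listening, then a single `status="started"` transition once a check succeeds, after which readiness probes take over (as the `probe="readiness" status="ready"` entry below shows).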
Dec 05 12:49:44.303363 master-0 kubenswrapper[8731]: I1205 12:49:44.303316 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:49:47.594766 master-0 kubenswrapper[8731]: I1205 12:49:47.594653 8731 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:49:53.128673 master-0 kubenswrapper[8731]: I1205 12:49:53.128597 8731 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 05 12:49:53.130295 master-0 kubenswrapper[8731]: E1205 12:49:53.129724 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b72a36-ef75-4fa3-a6ec-c277b2f43140" containerName="kube-multus-additional-cni-plugins" Dec 05 12:49:53.130295 master-0 kubenswrapper[8731]: I1205 12:49:53.129778 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b72a36-ef75-4fa3-a6ec-c277b2f43140" containerName="kube-multus-additional-cni-plugins" Dec 05 12:49:53.130295 master-0 kubenswrapper[8731]: E1205 12:49:53.129796 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc37275-4e59-4f73-8b08-c8ca8ec28bbb" containerName="multus-admission-controller" Dec 05 12:49:53.130295 master-0 kubenswrapper[8731]: I1205 12:49:53.129809 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc37275-4e59-4f73-8b08-c8ca8ec28bbb" containerName="multus-admission-controller" Dec 05 12:49:53.130295 master-0 kubenswrapper[8731]: E1205 12:49:53.129827 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4957e218-f580-41a9-866a-fd4f92a3c007" containerName="installer" Dec 05 12:49:53.130295 master-0 kubenswrapper[8731]: I1205 12:49:53.129835 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="4957e218-f580-41a9-866a-fd4f92a3c007" containerName="installer" Dec 05 12:49:53.130295 master-0 kubenswrapper[8731]: E1205 12:49:53.129858 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc37275-4e59-4f73-8b08-c8ca8ec28bbb" containerName="kube-rbac-proxy" Dec 05 12:49:53.130295 master-0 kubenswrapper[8731]: I1205 12:49:53.129871 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc37275-4e59-4f73-8b08-c8ca8ec28bbb" containerName="kube-rbac-proxy" Dec 05 12:49:53.130295 master-0 kubenswrapper[8731]: I1205 12:49:53.130021 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="46b72a36-ef75-4fa3-a6ec-c277b2f43140" containerName="kube-multus-additional-cni-plugins" Dec 05 12:49:53.130295 master-0 kubenswrapper[8731]: I1205 12:49:53.130048 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="4957e218-f580-41a9-866a-fd4f92a3c007" containerName="installer" Dec 05 12:49:53.130295 master-0 kubenswrapper[8731]: I1205 12:49:53.130069 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc37275-4e59-4f73-8b08-c8ca8ec28bbb" containerName="multus-admission-controller" Dec 05 12:49:53.130295 master-0 kubenswrapper[8731]: I1205 12:49:53.130082 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc37275-4e59-4f73-8b08-c8ca8ec28bbb" containerName="kube-rbac-proxy" Dec 05 12:49:53.130837 master-0 kubenswrapper[8731]: I1205 12:49:53.130609 8731 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Dec 05 12:49:53.130922 master-0 kubenswrapper[8731]: I1205 12:49:53.130876 8731 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="kube-apiserver" containerID="cri-o://8afe0da63d99f2297054afe39b61890ca549453e8d197ef5a9c1c3976a1f2afc" gracePeriod=15 Dec 05 12:49:53.131130 master-0 kubenswrapper[8731]: I1205 12:49:53.131078 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.132728 master-0 kubenswrapper[8731]: I1205 12:49:53.131592 8731 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://17618b3a98b21ba173e16cc99dae400fa4afea110eb46e1cd0bececa0e704d0d" gracePeriod=15 Dec 05 12:49:53.132728 master-0 kubenswrapper[8731]: I1205 12:49:53.132215 8731 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 05 12:49:53.132728 master-0 kubenswrapper[8731]: E1205 12:49:53.132614 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="setup" Dec 05 12:49:53.132728 master-0 kubenswrapper[8731]: I1205 12:49:53.132637 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="setup" Dec 05 12:49:53.132728 master-0 kubenswrapper[8731]: E1205 12:49:53.132656 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="kube-apiserver" Dec 05 12:49:53.132728 master-0 kubenswrapper[8731]: I1205 12:49:53.132665 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="kube-apiserver" Dec 05 12:49:53.132728 master-0 kubenswrapper[8731]: E1205 12:49:53.132681 8731 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="kube-apiserver-insecure-readyz" Dec 05 12:49:53.132728 master-0 kubenswrapper[8731]: I1205 12:49:53.132693 8731 state_mem.go:107] "Deleted CPUSet assignment" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="kube-apiserver-insecure-readyz" Dec 05 12:49:53.134316 master-0 kubenswrapper[8731]: I1205 12:49:53.134281 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="setup" Dec 05 12:49:53.134521 master-0 kubenswrapper[8731]: I1205 12:49:53.134452 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="kube-apiserver" Dec 05 12:49:53.134671 master-0 kubenswrapper[8731]: I1205 12:49:53.134657 8731 memory_manager.go:354] "RemoveStaleState removing state" podUID="d75143d9bc4a2dc15781dc51ccff632a" containerName="kube-apiserver-insecure-readyz" Dec 05 12:49:53.138570 master-0 kubenswrapper[8731]: I1205 12:49:53.138507 8731 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:49:53.211326 master-0 kubenswrapper[8731]: I1205 12:49:53.211253 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 05 12:49:53.216976 master-0 kubenswrapper[8731]: I1205 12:49:53.216917 8731 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 05 12:49:53.234403 master-0 kubenswrapper[8731]: I1205 12:49:53.234341 8731 scope.go:117] "RemoveContainer" containerID="8fbf247ef3f15fe005ee46e673fbe0b71698dcc9f2759966a03a8cd2730f623b" Dec 05 12:49:53.245280 master-0 kubenswrapper[8731]: I1205 12:49:53.245235 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.249207 master-0 kubenswrapper[8731]: I1205 12:49:53.246332 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.249207 master-0 kubenswrapper[8731]: I1205 12:49:53.246397 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:49:53.249207 master-0 kubenswrapper[8731]: I1205 12:49:53.246437 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.249207 master-0 kubenswrapper[8731]: I1205 12:49:53.246457 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:49:53.249207 master-0 kubenswrapper[8731]: I1205 12:49:53.246525 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:49:53.249207 master-0 kubenswrapper[8731]: I1205 12:49:53.246558 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: 
\"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.249207 master-0 kubenswrapper[8731]: I1205 12:49:53.246590 8731 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.347432 master-0 kubenswrapper[8731]: I1205 12:49:53.347392 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.347613 master-0 kubenswrapper[8731]: I1205 12:49:53.347562 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.347669 master-0 kubenswrapper[8731]: I1205 12:49:53.347578 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.347761 master-0 kubenswrapper[8731]: I1205 12:49:53.347746 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.347846 master-0 kubenswrapper[8731]: I1205 12:49:53.347818 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:49:53.347923 master-0 kubenswrapper[8731]: I1205 12:49:53.347891 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.347973 master-0 kubenswrapper[8731]: I1205 12:49:53.347923 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:49:53.348011 master-0 kubenswrapper[8731]: I1205 12:49:53.347967 8731 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:49:53.348112 master-0 kubenswrapper[8731]: I1205 12:49:53.348064 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.348112 master-0 kubenswrapper[8731]: I1205 12:49:53.348104 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:49:53.348215 master-0 kubenswrapper[8731]: I1205 12:49:53.348076 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:49:53.348215 master-0 kubenswrapper[8731]: I1205 12:49:53.348164 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:49:53.348290 master-0 kubenswrapper[8731]: I1205 12:49:53.348232 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.348326 master-0 kubenswrapper[8731]: I1205 12:49:53.348285 8731 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.348361 master-0 kubenswrapper[8731]: I1205 12:49:53.348338 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.348424 master-0 kubenswrapper[8731]: I1205 12:49:53.348369 8731 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.509878 master-0 
kubenswrapper[8731]: I1205 12:49:53.509783 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:49:53.517389 master-0 kubenswrapper[8731]: I1205 12:49:53.517239 8731 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:49:53.531619 master-0 kubenswrapper[8731]: I1205 12:49:53.531572 8731 generic.go:334] "Generic (PLEG): container finished" podID="d75143d9bc4a2dc15781dc51ccff632a" containerID="17618b3a98b21ba173e16cc99dae400fa4afea110eb46e1cd0bececa0e704d0d" exitCode=0 Dec 05 12:49:53.545602 master-0 kubenswrapper[8731]: W1205 12:49:53.545563 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb89698aa356a3bc32694e2b098f9a900.slice/crio-915ebf41ba29fa9d0d989a762295214f873dc379b870a92fba77c1d033c014f1 WatchSource:0}: Error finding container 915ebf41ba29fa9d0d989a762295214f873dc379b870a92fba77c1d033c014f1: Status 404 returned error can't find the container with id 915ebf41ba29fa9d0d989a762295214f873dc379b870a92fba77c1d033c014f1 Dec 05 12:49:53.549935 master-0 kubenswrapper[8731]: E1205 12:49:53.549727 8731 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.187e52a9e0b61c41 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:b89698aa356a3bc32694e2b098f9a900,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:49:53.548794945 +0000 UTC m=+1091.852779122,LastTimestamp:2025-12-05 12:49:53.548794945 +0000 UTC m=+1091.852779122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:49:53.550317 master-0 kubenswrapper[8731]: W1205 12:49:53.550288 8731 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda906debd0c35952850935aee2d607cce.slice/crio-02429253d825f84b6e8d3688d755956761f60c5751e2535a15ebb536be6b1f94 WatchSource:0}: Error finding container 02429253d825f84b6e8d3688d755956761f60c5751e2535a15ebb536be6b1f94: Status 404 returned error can't find the container with id 02429253d825f84b6e8d3688d755956761f60c5751e2535a15ebb536be6b1f94 Dec 05 12:49:53.935494 master-0 kubenswrapper[8731]: I1205 12:49:53.935422 8731 scope.go:117] "RemoveContainer" containerID="a62572546062b2df435bc85f27bda94544b75d65580e59f21beaef134a43b821" Dec 05 12:49:53.935846 master-0 kubenswrapper[8731]: E1205 12:49:53.935817 8731 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-8649c48786-7xrk6_openshift-ingress-operator(a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7)\"" 
pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" podUID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" Dec 05 12:49:53.939657 master-0 kubenswrapper[8731]: I1205 12:49:53.939387 8731 status_manager.go:851] "Failed to get status for pod" podUID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ingress-operator/pods/ingress-operator-8649c48786-7xrk6\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:49:53.940556 master-0 kubenswrapper[8731]: I1205 12:49:53.940496 8731 status_manager.go:851] "Failed to get status for pod" podUID="a906debd0c35952850935aee2d607cce" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:49:53.941447 master-0 kubenswrapper[8731]: I1205 12:49:53.941388 8731 status_manager.go:851] "Failed to get status for pod" podUID="b89698aa356a3bc32694e2b098f9a900" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:49:54.548359 master-0 kubenswrapper[8731]: I1205 12:49:54.548261 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a906debd0c35952850935aee2d607cce","Type":"ContainerStarted","Data":"02429253d825f84b6e8d3688d755956761f60c5751e2535a15ebb536be6b1f94"} Dec 05 12:49:54.550930 master-0 kubenswrapper[8731]: I1205 12:49:54.550827 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"915ebf41ba29fa9d0d989a762295214f873dc379b870a92fba77c1d033c014f1"} Dec 05 12:49:54.869625 master-0 kubenswrapper[8731]: E1205 12:49:54.869376 8731 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.187e52a9e0b61c41 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:b89698aa356a3bc32694e2b098f9a900,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:49:53.548794945 +0000 UTC m=+1091.852779122,LastTimestamp:2025-12-05 12:49:53.548794945 +0000 UTC m=+1091.852779122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 05 12:49:55.560884 master-0 kubenswrapper[8731]: I1205 12:49:55.560836 8731 generic.go:334] "Generic (PLEG): container finished" podID="b89698aa356a3bc32694e2b098f9a900" containerID="1bbd4f368bad5edbbd435da376ff1fe1a1eb948351d43f8a86c24d7830ed7a2a" 
exitCode=0 Dec 05 12:49:55.561461 master-0 kubenswrapper[8731]: I1205 12:49:55.560914 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerDied","Data":"1bbd4f368bad5edbbd435da376ff1fe1a1eb948351d43f8a86c24d7830ed7a2a"} Dec 05 12:49:55.562997 master-0 kubenswrapper[8731]: I1205 12:49:55.562927 8731 status_manager.go:851] "Failed to get status for pod" podUID="a906debd0c35952850935aee2d607cce" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:49:55.563207 master-0 kubenswrapper[8731]: I1205 12:49:55.563139 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a906debd0c35952850935aee2d607cce","Type":"ContainerStarted","Data":"4d7c7fd9f6be698bd81fc9eb6c8b4d1eab76e44ec95ef9874a47a2596768ed58"} Dec 05 12:49:55.563875 master-0 kubenswrapper[8731]: I1205 12:49:55.563828 8731 status_manager.go:851] "Failed to get status for pod" podUID="b89698aa356a3bc32694e2b098f9a900" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:49:55.564531 master-0 kubenswrapper[8731]: I1205 12:49:55.564494 8731 status_manager.go:851] "Failed to get status for pod" podUID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ingress-operator/pods/ingress-operator-8649c48786-7xrk6\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:49:55.565405 master-0 kubenswrapper[8731]: I1205 12:49:55.565306 8731 status_manager.go:851] "Failed to get status for pod" podUID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ingress-operator/pods/ingress-operator-8649c48786-7xrk6\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:49:55.566073 master-0 kubenswrapper[8731]: I1205 12:49:55.566027 8731 status_manager.go:851] "Failed to get status for pod" podUID="a906debd0c35952850935aee2d607cce" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:49:55.566762 master-0 kubenswrapper[8731]: I1205 12:49:55.566694 8731 status_manager.go:851] "Failed to get status for pod" podUID="b89698aa356a3bc32694e2b098f9a900" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:49:56.406845 master-0 kubenswrapper[8731]: I1205 12:49:56.406781 8731 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.494998 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs\") pod \"d75143d9bc4a2dc15781dc51ccff632a\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.495056 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir\") pod \"d75143d9bc4a2dc15781dc51ccff632a\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.495076 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config\") pod \"d75143d9bc4a2dc15781dc51ccff632a\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.495100 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud\") pod \"d75143d9bc4a2dc15781dc51ccff632a\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.495201 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host\") pod \"d75143d9bc4a2dc15781dc51ccff632a\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.495229 8731 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets\") pod \"d75143d9bc4a2dc15781dc51ccff632a\" (UID: \"d75143d9bc4a2dc15781dc51ccff632a\") " Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.495343 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "d75143d9bc4a2dc15781dc51ccff632a" (UID: "d75143d9bc4a2dc15781dc51ccff632a"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.495359 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config" (OuterVolumeSpecName: "config") pod "d75143d9bc4a2dc15781dc51ccff632a" (UID: "d75143d9bc4a2dc15781dc51ccff632a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.495449 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs" (OuterVolumeSpecName: "logs") pod "d75143d9bc4a2dc15781dc51ccff632a" (UID: "d75143d9bc4a2dc15781dc51ccff632a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.495403 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "d75143d9bc4a2dc15781dc51ccff632a" (UID: "d75143d9bc4a2dc15781dc51ccff632a"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.495481 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets" (OuterVolumeSpecName: "secrets") pod "d75143d9bc4a2dc15781dc51ccff632a" (UID: "d75143d9bc4a2dc15781dc51ccff632a"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.495475 8731 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d75143d9bc4a2dc15781dc51ccff632a" (UID: "d75143d9bc4a2dc15781dc51ccff632a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.496004 8731 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.496022 8731 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.496035 8731 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.496045 8731 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.496060 8731 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:56.496283 master-0 kubenswrapper[8731]: I1205 12:49:56.496070 8731 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d75143d9bc4a2dc15781dc51ccff632a-secrets\") on node \"master-0\" DevicePath \"\"" Dec 05 12:49:56.573978 master-0 kubenswrapper[8731]: I1205 12:49:56.573737 8731 generic.go:334] "Generic (PLEG): container finished" podID="d75143d9bc4a2dc15781dc51ccff632a" containerID="8afe0da63d99f2297054afe39b61890ca549453e8d197ef5a9c1c3976a1f2afc" exitCode=0 Dec 05 12:49:56.573978 master-0 kubenswrapper[8731]: I1205 12:49:56.573833 8731 scope.go:117] "RemoveContainer" containerID="17618b3a98b21ba173e16cc99dae400fa4afea110eb46e1cd0bececa0e704d0d" Dec 05 12:49:56.573978 master-0 kubenswrapper[8731]: I1205 12:49:56.573970 8731 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 05 12:49:56.577919 master-0 kubenswrapper[8731]: I1205 12:49:56.577864 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"e6673b2b755060cf16e87e6c6406cf444f3e60221e1299617149fc286f9cbbb4"} Dec 05 12:49:56.596517 master-0 kubenswrapper[8731]: I1205 12:49:56.594433 8731 scope.go:117] "RemoveContainer" containerID="8afe0da63d99f2297054afe39b61890ca549453e8d197ef5a9c1c3976a1f2afc" Dec 05 12:49:56.633024 master-0 kubenswrapper[8731]: I1205 12:49:56.632345 8731 scope.go:117] "RemoveContainer" containerID="697d3c24c504f4edabe923e2993cba7e7017b70ed34b4cb71d455e86377b9334" Dec 05 12:49:56.673770 master-0 kubenswrapper[8731]: I1205 12:49:56.673724 8731 scope.go:117] "RemoveContainer" containerID="17618b3a98b21ba173e16cc99dae400fa4afea110eb46e1cd0bececa0e704d0d" Dec 05 12:49:56.674411 master-0 kubenswrapper[8731]: E1205 12:49:56.674334 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17618b3a98b21ba173e16cc99dae400fa4afea110eb46e1cd0bececa0e704d0d\": container with ID starting with 17618b3a98b21ba173e16cc99dae400fa4afea110eb46e1cd0bececa0e704d0d not found: ID does not exist" containerID="17618b3a98b21ba173e16cc99dae400fa4afea110eb46e1cd0bececa0e704d0d" Dec 05 12:49:56.674485 master-0 kubenswrapper[8731]: I1205 12:49:56.674435 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17618b3a98b21ba173e16cc99dae400fa4afea110eb46e1cd0bececa0e704d0d"} err="failed to get container status \"17618b3a98b21ba173e16cc99dae400fa4afea110eb46e1cd0bececa0e704d0d\": rpc error: code = NotFound desc = could not find container \"17618b3a98b21ba173e16cc99dae400fa4afea110eb46e1cd0bececa0e704d0d\": container with ID starting with 17618b3a98b21ba173e16cc99dae400fa4afea110eb46e1cd0bececa0e704d0d not found: ID does not exist" Dec 05 12:49:56.674531 master-0 kubenswrapper[8731]: I1205 12:49:56.674494 8731 scope.go:117] "RemoveContainer" containerID="8afe0da63d99f2297054afe39b61890ca549453e8d197ef5a9c1c3976a1f2afc" Dec 05 12:49:56.675048 master-0 kubenswrapper[8731]: E1205 12:49:56.675006 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8afe0da63d99f2297054afe39b61890ca549453e8d197ef5a9c1c3976a1f2afc\": container with ID starting with 8afe0da63d99f2297054afe39b61890ca549453e8d197ef5a9c1c3976a1f2afc not found: ID does not exist" containerID="8afe0da63d99f2297054afe39b61890ca549453e8d197ef5a9c1c3976a1f2afc" Dec 05 12:49:56.675098 master-0 kubenswrapper[8731]: I1205 12:49:56.675046 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8afe0da63d99f2297054afe39b61890ca549453e8d197ef5a9c1c3976a1f2afc"} err="failed to get container status \"8afe0da63d99f2297054afe39b61890ca549453e8d197ef5a9c1c3976a1f2afc\": rpc error: code = NotFound desc = could not find container \"8afe0da63d99f2297054afe39b61890ca549453e8d197ef5a9c1c3976a1f2afc\": container with ID starting with 8afe0da63d99f2297054afe39b61890ca549453e8d197ef5a9c1c3976a1f2afc not found: ID does not exist" Dec 05 12:49:56.675098 master-0 kubenswrapper[8731]: I1205 12:49:56.675067 8731 scope.go:117] "RemoveContainer" 
containerID="697d3c24c504f4edabe923e2993cba7e7017b70ed34b4cb71d455e86377b9334" Dec 05 12:49:56.676019 master-0 kubenswrapper[8731]: E1205 12:49:56.675709 8731 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"697d3c24c504f4edabe923e2993cba7e7017b70ed34b4cb71d455e86377b9334\": container with ID starting with 697d3c24c504f4edabe923e2993cba7e7017b70ed34b4cb71d455e86377b9334 not found: ID does not exist" containerID="697d3c24c504f4edabe923e2993cba7e7017b70ed34b4cb71d455e86377b9334" Dec 05 12:49:56.676083 master-0 kubenswrapper[8731]: I1205 12:49:56.676029 8731 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"697d3c24c504f4edabe923e2993cba7e7017b70ed34b4cb71d455e86377b9334"} err="failed to get container status \"697d3c24c504f4edabe923e2993cba7e7017b70ed34b4cb71d455e86377b9334\": rpc error: code = NotFound desc = could not find container \"697d3c24c504f4edabe923e2993cba7e7017b70ed34b4cb71d455e86377b9334\": container with ID starting with 697d3c24c504f4edabe923e2993cba7e7017b70ed34b4cb71d455e86377b9334 not found: ID does not exist" Dec 05 12:49:57.662805 master-0 kubenswrapper[8731]: I1205 12:49:57.661010 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"9fe4c6502fdc4a5ad38e4d8943f30fb0dc815742902f56611365ebade961c543"} Dec 05 12:49:57.662805 master-0 kubenswrapper[8731]: I1205 12:49:57.661086 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"b4c300d20c451ceb48a4dd631fddc00299b2cc310864a10d077327af520fb571"} Dec 05 12:49:57.662805 master-0 kubenswrapper[8731]: I1205 12:49:57.661099 8731 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"bb7873e2599e8ac76df6ef4a55a0c5149a92c7a11857e0aa4c586472148fc658"} Dec 05 12:49:57.941558 master-0 kubenswrapper[8731]: I1205 12:49:57.941500 8731 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d75143d9bc4a2dc15781dc51ccff632a" path="/var/lib/kubelet/pods/d75143d9bc4a2dc15781dc51ccff632a/volumes" Dec 05 12:49:57.941912 master-0 kubenswrapper[8731]: I1205 12:49:57.941874 8731 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Dec 05 12:49:58.680010 master-0 kubenswrapper[8731]: I1205 12:49:58.679568 8731 generic.go:334] "Generic (PLEG): container finished" podID="4d215811-6210-4ec2-8356-f1533dc43f65" containerID="419f6f30a7830337f1a96ed401ad15741b6815b1dc5b3d9cd59d5f9c8beb4aa8" exitCode=0 Dec 05 12:50:02.717985 master-0 kubenswrapper[8731]: I1205 12:50:02.717912 8731 generic.go:334] "Generic (PLEG): container finished" podID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerID="b7483a678d691fbf8a3207dd7d6ed1c739a3647a4a630049897502326cc17230" exitCode=0 Dec 05 12:50:02.721657 master-0 systemd[1]: Stopping Kubernetes Kubelet... Dec 05 12:50:02.755733 master-0 systemd[1]: kubelet.service: Deactivated successfully. Dec 05 12:50:02.756008 master-0 systemd[1]: Stopped Kubernetes Kubelet. Dec 05 12:50:02.769021 master-0 systemd[1]: kubelet.service: Consumed 2min 39.345s CPU time. Dec 05 12:50:02.806365 master-0 systemd[1]: Starting Kubernetes Kubelet... 
Dec 05 12:50:02.957993 master-0 kubenswrapper[29936]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 12:50:02.957993 master-0 kubenswrapper[29936]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 05 12:50:02.957993 master-0 kubenswrapper[29936]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 12:50:02.957993 master-0 kubenswrapper[29936]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 12:50:02.957993 master-0 kubenswrapper[29936]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 05 12:50:02.957993 master-0 kubenswrapper[29936]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 12:50:02.958877 master-0 kubenswrapper[29936]: I1205 12:50:02.958109 29936 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979124 29936 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979161 29936 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979166 29936 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979172 29936 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979194 29936 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979200 29936 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979205 29936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979211 29936 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979218 29936 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979224 29936 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979228 29936 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979232 29936 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979236 29936 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979240 29936 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979244 29936 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979247 29936 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979251 29936 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979255 29936 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 12:50:02.979889 master-0 kubenswrapper[29936]: W1205 12:50:02.979258 29936 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979262 29936 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979265 29936 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979272 29936 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979277 29936 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979280 29936 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979295 29936 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979298 29936 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979302 29936 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979327 29936 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
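Editor's note: the restarted kubelet emits the same long series of feature_gate.go:330 "unrecognized feature gate" warnings seen at the earlier start, and the list continues below both before and after the flag dump. The names appear to be OpenShift platform gates that the upstream kubelet's gate registry does not include, so the warnings are noisy but repetitive. A small sketch (hypothetical file name) that deduplicates and counts them from a saved copy of the journal:

```python
# Tally feature_gate.go:330 "unrecognized feature gate" warnings from a
# saved kubelet journal. The file name is hypothetical; the pattern
# matches the warning format shown in the excerpt.
import re
from collections import Counter

GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

def count_unrecognized_gates(path="kubelet-journal.log"):
    counts = Counter()
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            counts.update(GATE_RE.findall(line))
    return counts

if __name__ == "__main__":
    counts = count_unrecognized_gates()
    print(f"{len(counts)} distinct unrecognized gates")
    for gate, n in counts.most_common(10):
        print(f"{n:4d}  {gate}")
```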
Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979340 29936 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979344 29936 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979348 29936 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979352 29936 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979357 29936 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979360 29936 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979364 29936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979368 29936 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979372 29936 feature_gate.go:330] unrecognized feature gate: Example Dec 05 12:50:02.981257 master-0 kubenswrapper[29936]: W1205 12:50:02.979376 29936 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979380 29936 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979383 29936 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979387 29936 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979390 29936 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979393 29936 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979397 29936 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979400 29936 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979404 29936 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979407 29936 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979411 29936 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979415 29936 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979418 29936 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979421 29936 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 12:50:02.982001 master-0 
kubenswrapper[29936]: W1205 12:50:02.979425 29936 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979428 29936 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979433 29936 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979437 29936 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979441 29936 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979445 29936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 12:50:02.982001 master-0 kubenswrapper[29936]: W1205 12:50:02.979448 29936 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: W1205 12:50:02.979453 29936 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: W1205 12:50:02.979458 29936 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: W1205 12:50:02.979464 29936 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: W1205 12:50:02.979468 29936 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: W1205 12:50:02.979471 29936 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: W1205 12:50:02.979475 29936 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: W1205 12:50:02.979478 29936 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: W1205 12:50:02.979482 29936 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: W1205 12:50:02.979485 29936 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: W1205 12:50:02.979489 29936 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: W1205 12:50:02.979492 29936 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: W1205 12:50:02.979496 29936 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: W1205 12:50:02.979499 29936 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: W1205 12:50:02.979502 29936 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: I1205 12:50:02.979626 29936 flags.go:64] FLAG: --address="0.0.0.0" Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: I1205 12:50:02.979641 29936 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: I1205 12:50:02.979650 29936 flags.go:64] FLAG: --anonymous-auth="true" Dec 05 12:50:02.982949 
master-0 kubenswrapper[29936]: I1205 12:50:02.979656 29936 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: I1205 12:50:02.979662 29936 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: I1205 12:50:02.979667 29936 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 05 12:50:02.982949 master-0 kubenswrapper[29936]: I1205 12:50:02.979673 29936 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979691 29936 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979696 29936 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979701 29936 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979706 29936 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979711 29936 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979715 29936 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979720 29936 flags.go:64] FLAG: --cgroup-root="" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979724 29936 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979729 29936 flags.go:64] FLAG: --client-ca-file="" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979733 29936 flags.go:64] FLAG: --cloud-config="" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979737 29936 flags.go:64] FLAG: --cloud-provider="" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979742 29936 flags.go:64] FLAG: --cluster-dns="[]" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979748 29936 flags.go:64] FLAG: --cluster-domain="" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979754 29936 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979759 29936 flags.go:64] FLAG: --config-dir="" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979763 29936 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979768 29936 flags.go:64] FLAG: --container-log-max-files="5" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979775 29936 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979779 29936 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979784 29936 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979789 29936 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979794 29936 flags.go:64] FLAG: --contention-profiling="false" Dec 05 12:50:02.983745 master-0 
kubenswrapper[29936]: I1205 12:50:02.979799 29936 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979804 29936 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 05 12:50:02.983745 master-0 kubenswrapper[29936]: I1205 12:50:02.979809 29936 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979815 29936 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979823 29936 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979829 29936 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979834 29936 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979840 29936 flags.go:64] FLAG: --enable-load-reader="false" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979845 29936 flags.go:64] FLAG: --enable-server="true" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979850 29936 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979865 29936 flags.go:64] FLAG: --event-burst="100" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979869 29936 flags.go:64] FLAG: --event-qps="50" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979874 29936 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979879 29936 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979884 29936 flags.go:64] FLAG: --eviction-hard="" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979891 29936 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979895 29936 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979900 29936 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979904 29936 flags.go:64] FLAG: --eviction-soft="" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979909 29936 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979914 29936 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979922 29936 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979927 29936 flags.go:64] FLAG: --experimental-mounter-path="" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979932 29936 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979937 29936 flags.go:64] FLAG: --fail-swap-on="true" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979943 29936 flags.go:64] FLAG: --feature-gates="" Dec 05 12:50:02.984899 master-0 kubenswrapper[29936]: I1205 12:50:02.979950 29936 flags.go:64] FLAG: --file-check-frequency="20s" Dec 05 12:50:02.984899 master-0 
kubenswrapper[29936]: I1205 12:50:02.979955 29936 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.979959 29936 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.979964 29936 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.979968 29936 flags.go:64] FLAG: --healthz-port="10248" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.979973 29936 flags.go:64] FLAG: --help="false" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.979978 29936 flags.go:64] FLAG: --hostname-override="" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.979982 29936 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.979987 29936 flags.go:64] FLAG: --http-check-frequency="20s" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.979991 29936 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.979995 29936 flags.go:64] FLAG: --image-credential-provider-config="" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.980000 29936 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.980004 29936 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.980008 29936 flags.go:64] FLAG: --image-service-endpoint="" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.980012 29936 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.980016 29936 flags.go:64] FLAG: --kube-api-burst="100" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.980023 29936 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.980028 29936 flags.go:64] FLAG: --kube-api-qps="50" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.980032 29936 flags.go:64] FLAG: --kube-reserved="" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.980037 29936 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.980041 29936 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.980045 29936 flags.go:64] FLAG: --kubelet-cgroups="" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.980049 29936 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.980054 29936 flags.go:64] FLAG: --lock-file="" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.980058 29936 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.980062 29936 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 05 12:50:02.986169 master-0 kubenswrapper[29936]: I1205 12:50:02.980067 29936 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980076 29936 flags.go:64] FLAG: --log-json-split-stream="false" Dec 05 12:50:02.994172 master-0 
kubenswrapper[29936]: I1205 12:50:02.980080 29936 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980085 29936 flags.go:64] FLAG: --log-text-split-stream="false" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980090 29936 flags.go:64] FLAG: --logging-format="text" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980096 29936 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980102 29936 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980106 29936 flags.go:64] FLAG: --manifest-url="" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980111 29936 flags.go:64] FLAG: --manifest-url-header="" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980118 29936 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980124 29936 flags.go:64] FLAG: --max-open-files="1000000" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980129 29936 flags.go:64] FLAG: --max-pods="110" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980134 29936 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980138 29936 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980143 29936 flags.go:64] FLAG: --memory-manager-policy="None" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980148 29936 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980152 29936 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980157 29936 flags.go:64] FLAG: --node-ip="192.168.32.10" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980162 29936 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980200 29936 flags.go:64] FLAG: --node-status-max-images="50" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980205 29936 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980210 29936 flags.go:64] FLAG: --oom-score-adj="-999" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980217 29936 flags.go:64] FLAG: --pod-cidr="" Dec 05 12:50:02.994172 master-0 kubenswrapper[29936]: I1205 12:50:02.980221 29936 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a70b2a95140d1e90978f36cc9889013ae34bd232662c5424002274385669ed9" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980232 29936 flags.go:64] FLAG: --pod-manifest-path="" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980237 29936 flags.go:64] FLAG: --pod-max-pids="-1" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980242 29936 flags.go:64] FLAG: --pods-per-core="0" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980248 29936 flags.go:64] FLAG: --port="10250" 
Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980254 29936 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980259 29936 flags.go:64] FLAG: --provider-id="" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980265 29936 flags.go:64] FLAG: --qos-reserved="" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980271 29936 flags.go:64] FLAG: --read-only-port="10255" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980276 29936 flags.go:64] FLAG: --register-node="true" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980285 29936 flags.go:64] FLAG: --register-schedulable="true" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980290 29936 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980301 29936 flags.go:64] FLAG: --registry-burst="10" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980307 29936 flags.go:64] FLAG: --registry-qps="5" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980313 29936 flags.go:64] FLAG: --reserved-cpus="" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980318 29936 flags.go:64] FLAG: --reserved-memory="" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980326 29936 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980331 29936 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980337 29936 flags.go:64] FLAG: --rotate-certificates="false" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980342 29936 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980346 29936 flags.go:64] FLAG: --runonce="false" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980351 29936 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980356 29936 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980361 29936 flags.go:64] FLAG: --seccomp-default="false" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980366 29936 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980370 29936 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 05 12:50:02.995049 master-0 kubenswrapper[29936]: I1205 12:50:02.980375 29936 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980379 29936 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980384 29936 flags.go:64] FLAG: --storage-driver-password="root" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980388 29936 flags.go:64] FLAG: --storage-driver-secure="false" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980392 29936 flags.go:64] FLAG: --storage-driver-table="stats" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980397 29936 flags.go:64] FLAG: --storage-driver-user="root" Dec 05 
12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980403 29936 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980408 29936 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980413 29936 flags.go:64] FLAG: --system-cgroups="" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980417 29936 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980423 29936 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980427 29936 flags.go:64] FLAG: --tls-cert-file="" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980432 29936 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980438 29936 flags.go:64] FLAG: --tls-min-version="" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980442 29936 flags.go:64] FLAG: --tls-private-key-file="" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980446 29936 flags.go:64] FLAG: --topology-manager-policy="none" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980453 29936 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980458 29936 flags.go:64] FLAG: --topology-manager-scope="container" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980473 29936 flags.go:64] FLAG: --v="2" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980481 29936 flags.go:64] FLAG: --version="false" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980488 29936 flags.go:64] FLAG: --vmodule="" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980493 29936 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: I1205 12:50:02.980498 29936 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: W1205 12:50:02.980617 29936 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: W1205 12:50:02.980623 29936 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 12:50:02.996083 master-0 kubenswrapper[29936]: W1205 12:50:02.980628 29936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980632 29936 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980636 29936 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980640 29936 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980645 29936 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980649 29936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980653 29936 feature_gate.go:330] 
unrecognized feature gate: MachineAPIMigration Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980657 29936 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980661 29936 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980665 29936 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980669 29936 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980672 29936 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980676 29936 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980683 29936 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980687 29936 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980691 29936 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980695 29936 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980698 29936 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980702 29936 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980705 29936 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 12:50:03.001410 master-0 kubenswrapper[29936]: W1205 12:50:02.980709 29936 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980713 29936 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980716 29936 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980722 29936 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980726 29936 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980730 29936 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
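Editor's note: between the two batches of gate warnings, the restarted kubelet dumps every command-line flag it was invoked with as flags.go:64 "FLAG: --name=\"value\"" records, including --config="/etc/kubernetes/kubelet.conf", --node-ip="192.168.32.10", --register-with-taints="node-role.kubernetes.io/master=:NoSchedule", and --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi". A minimal sketch that collects such a dump into a dictionary so the values can be compared across restarts (file name again hypothetical):

```python
# Collect a kubelet startup flag dump (flags.go:64 FLAG: --name="value"
# records) into a dict. The file name below is hypothetical; values are
# logged wrapped in double quotes, as in the excerpt above.
import re

FLAG_RE = re.compile(r'flags\.go:\d+\] FLAG: --([A-Za-z0-9-]+)="(.*?)"')

def parse_flag_dump(path="kubelet-journal.log"):
    flags = {}
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            for name, value in FLAG_RE.findall(line):
                flags[name] = value  # a later dump overwrites an earlier one
    return flags

if __name__ == "__main__":
    flags = parse_flag_dump()
    for key in ("config", "node-ip", "register-with-taints", "system-reserved"):
        print(f"--{key} = {flags.get(key)!r}")
```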
Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980735 29936 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980739 29936 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980742 29936 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980746 29936 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980750 29936 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980753 29936 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980758 29936 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980763 29936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980777 29936 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980781 29936 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980785 29936 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980789 29936 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980792 29936 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 12:50:03.002267 master-0 kubenswrapper[29936]: W1205 12:50:02.980796 29936 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980800 29936 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980804 29936 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980807 29936 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980811 29936 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980815 29936 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980821 29936 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980825 29936 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980828 29936 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980832 29936 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 
05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980837 29936 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980841 29936 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980846 29936 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980850 29936 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980855 29936 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980858 29936 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980864 29936 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980869 29936 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980873 29936 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980876 29936 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 12:50:03.005154 master-0 kubenswrapper[29936]: W1205 12:50:02.980880 29936 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 12:50:03.006482 master-0 kubenswrapper[29936]: W1205 12:50:02.980884 29936 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 12:50:03.006482 master-0 kubenswrapper[29936]: W1205 12:50:02.980888 29936 feature_gate.go:330] unrecognized feature gate: Example Dec 05 12:50:03.006482 master-0 kubenswrapper[29936]: W1205 12:50:02.980891 29936 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 12:50:03.006482 master-0 kubenswrapper[29936]: W1205 12:50:02.980895 29936 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 12:50:03.006482 master-0 kubenswrapper[29936]: W1205 12:50:02.980899 29936 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 12:50:03.006482 master-0 kubenswrapper[29936]: W1205 12:50:02.980904 29936 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 12:50:03.006482 master-0 kubenswrapper[29936]: W1205 12:50:02.980908 29936 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 12:50:03.006482 master-0 kubenswrapper[29936]: W1205 12:50:02.980911 29936 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 12:50:03.006482 master-0 kubenswrapper[29936]: W1205 12:50:02.980915 29936 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 12:50:03.006482 master-0 kubenswrapper[29936]: W1205 12:50:02.980919 29936 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 05 12:50:03.006482 master-0 kubenswrapper[29936]: I1205 12:50:02.980934 29936 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 12:50:03.006482 master-0 kubenswrapper[29936]: I1205 12:50:03.004869 29936 server.go:491] "Kubelet version" kubeletVersion="v1.31.13" Dec 05 12:50:03.006482 master-0 kubenswrapper[29936]: I1205 12:50:03.004920 29936 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 05 12:50:03.006482 master-0 kubenswrapper[29936]: W1205 12:50:03.004996 29936 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 12:50:03.006482 master-0 kubenswrapper[29936]: W1205 12:50:03.005003 29936 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005007 29936 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005012 29936 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005016 29936 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005020 29936 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005024 29936 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005029 29936 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005034 29936 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005039 29936 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005044 29936 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005048 29936 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005052 29936 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005056 29936 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005060 29936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005064 29936 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005067 29936 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005072 29936 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005075 29936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005079 29936 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 12:50:03.008624 master-0 kubenswrapper[29936]: W1205 12:50:03.005083 29936 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005086 29936 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005090 29936 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005094 29936 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005100 29936 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005104 29936 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005108 29936 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005112 29936 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005116 29936 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005119 29936 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005123 29936 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005127 29936 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 
12:50:03.005131 29936 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005134 29936 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005140 29936 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005143 29936 feature_gate.go:330] unrecognized feature gate: Example Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005147 29936 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005151 29936 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005155 29936 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005159 29936 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 12:50:03.009995 master-0 kubenswrapper[29936]: W1205 12:50:03.005162 29936 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005167 29936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005170 29936 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005174 29936 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005193 29936 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005197 29936 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005201 29936 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005204 29936 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005208 29936 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005213 29936 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005217 29936 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005222 29936 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005228 29936 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005232 29936 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005236 29936 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005240 29936 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005244 29936 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005247 29936 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005251 29936 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005254 29936 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 12:50:03.011411 master-0 kubenswrapper[29936]: W1205 12:50:03.005258 29936 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 12:50:03.012431 master-0 kubenswrapper[29936]: W1205 12:50:03.005262 29936 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 12:50:03.012431 master-0 kubenswrapper[29936]: W1205 12:50:03.005265 29936 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 12:50:03.012431 master-0 kubenswrapper[29936]: W1205 12:50:03.005282 29936 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 12:50:03.012431 master-0 kubenswrapper[29936]: W1205 12:50:03.005287 29936 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 12:50:03.012431 master-0 kubenswrapper[29936]: W1205 12:50:03.005293 29936 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 12:50:03.012431 master-0 kubenswrapper[29936]: W1205 12:50:03.005298 29936 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 05 12:50:03.012431 master-0 kubenswrapper[29936]: W1205 12:50:03.005303 29936 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 12:50:03.012431 master-0 kubenswrapper[29936]: W1205 12:50:03.005307 29936 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 12:50:03.012431 master-0 kubenswrapper[29936]: W1205 12:50:03.005311 29936 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 12:50:03.012431 master-0 kubenswrapper[29936]: W1205 12:50:03.005315 29936 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 12:50:03.012431 master-0 kubenswrapper[29936]: W1205 12:50:03.005319 29936 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 12:50:03.012431 master-0 kubenswrapper[29936]: I1205 12:50:03.005326 29936 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 12:50:03.012431 master-0 kubenswrapper[29936]: W1205 12:50:03.005456 29936 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 12:50:03.012431 master-0 kubenswrapper[29936]: W1205 12:50:03.005464 29936 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005468 29936 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005472 29936 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005477 29936 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005481 29936 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005486 29936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005490 29936 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005494 29936 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005498 29936 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005501 29936 feature_gate.go:330] unrecognized feature gate: Example Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005505 29936 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005508 29936 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005512 29936 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 
12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005515 29936 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005519 29936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005523 29936 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005527 29936 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005530 29936 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005534 29936 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005537 29936 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 12:50:03.013018 master-0 kubenswrapper[29936]: W1205 12:50:03.005541 29936 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005544 29936 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005548 29936 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005552 29936 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005557 29936 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005562 29936 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005566 29936 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005570 29936 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005573 29936 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005577 29936 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005581 29936 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005585 29936 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005589 29936 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005593 29936 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005597 29936 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005600 29936 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 12:50:03.013813 master-0 
kubenswrapper[29936]: W1205 12:50:03.005604 29936 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005607 29936 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005611 29936 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005615 29936 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 12:50:03.013813 master-0 kubenswrapper[29936]: W1205 12:50:03.005619 29936 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005623 29936 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005626 29936 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005630 29936 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005633 29936 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005637 29936 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005640 29936 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005644 29936 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005647 29936 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005651 29936 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005655 29936 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005658 29936 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005662 29936 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005665 29936 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005669 29936 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005672 29936 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005676 29936 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005679 29936 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005683 29936 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 12:50:03.014732 master-0 
kubenswrapper[29936]: W1205 12:50:03.005687 29936 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 12:50:03.014732 master-0 kubenswrapper[29936]: W1205 12:50:03.005690 29936 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 12:50:03.015602 master-0 kubenswrapper[29936]: W1205 12:50:03.005695 29936 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 05 12:50:03.015602 master-0 kubenswrapper[29936]: W1205 12:50:03.005700 29936 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 12:50:03.015602 master-0 kubenswrapper[29936]: W1205 12:50:03.005704 29936 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 12:50:03.015602 master-0 kubenswrapper[29936]: W1205 12:50:03.005707 29936 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 12:50:03.015602 master-0 kubenswrapper[29936]: W1205 12:50:03.005712 29936 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 12:50:03.015602 master-0 kubenswrapper[29936]: W1205 12:50:03.005716 29936 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 12:50:03.015602 master-0 kubenswrapper[29936]: W1205 12:50:03.005721 29936 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 12:50:03.015602 master-0 kubenswrapper[29936]: W1205 12:50:03.005726 29936 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 12:50:03.015602 master-0 kubenswrapper[29936]: W1205 12:50:03.005729 29936 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 12:50:03.015602 master-0 kubenswrapper[29936]: W1205 12:50:03.005733 29936 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 12:50:03.015602 master-0 kubenswrapper[29936]: I1205 12:50:03.005739 29936 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 12:50:03.015602 master-0 kubenswrapper[29936]: I1205 12:50:03.005907 29936 server.go:940] "Client rotation is on, will bootstrap in background" Dec 05 12:50:03.016734 master-0 kubenswrapper[29936]: I1205 12:50:03.016504 29936 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 05 12:50:03.021165 master-0 kubenswrapper[29936]: I1205 12:50:03.021092 29936 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
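(Illustrative aside, not part of the journal capture.) The feature_gate.go:386 "feature gates" entries above record the kubelet's effective gate map once the gate names it does not recognize (apparently OpenShift-level gates such as GatewayAPI or MachineConfigNodes, each logged at feature_gate.go:330 as a warning and then ignored) have been skipped, while explicitly set GA or deprecated gates (ValidatingAdmissionPolicy, CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, KMSv1) are accepted with a removal warning. As a minimal sketch only, the following stdlib-Go program pulls that Name:bool map back out of journal text like the entries above; the file name, function names, and the journalctl invocation in the comments are assumptions for illustration, not anything the kubelet itself provides.

    // parsegates.go - illustrative helper (hypothetical, not part of the kubelet):
    // extract the effective feature-gate map from kubelet journal output, i.e. the
    // "feature gates: {map[Name:bool ...]}" summary emitted at feature_gate.go:386.
    package main

    import (
    	"bufio"
    	"fmt"
    	"os"
    	"strconv"
    	"strings"
    )

    // parseGates returns the Name:bool pairs from one summary line, or nil if the
    // line is not a feature-gate summary.
    func parseGates(line string) map[string]bool {
    	const marker = "feature gates: {map["
    	start := strings.Index(line, marker)
    	if start < 0 {
    		return nil
    	}
    	rest := line[start+len(marker):]
    	end := strings.Index(rest, "]}")
    	if end < 0 {
    		return nil
    	}
    	gates := make(map[string]bool)
    	for _, tok := range strings.Fields(rest[:end]) {
    		name, val, ok := strings.Cut(tok, ":")
    		if !ok {
    			continue
    		}
    		if b, err := strconv.ParseBool(val); err == nil {
    			gates[name] = b
    		}
    	}
    	return gates
    }

    func main() {
    	// Read journal text on stdin, e.g. journalctl -u kubelet | go run parsegates.go
    	// (assuming the unit is named kubelet, as the "Starting Kubernetes Kubelet"
    	// entry suggests).
    	sc := bufio.NewScanner(os.Stdin)
    	// Raise the default 64 KiB token limit; these journal lines run to several KB.
    	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
    	for sc.Scan() {
    		if gates := parseGates(sc.Text()); gates != nil {
    			fmt.Println(gates)
    		}
    	}
    }

Run against a saved capture of this journal, such a helper would print one effective-gate map per summary entry, which is a quick way to confirm which gates the kubelet actually applied versus the ones it warned about and dropped.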
Dec 05 12:50:03.021589 master-0 kubenswrapper[29936]: I1205 12:50:03.021553 29936 server.go:997] "Starting client certificate rotation" Dec 05 12:50:03.021589 master-0 kubenswrapper[29936]: I1205 12:50:03.021577 29936 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 05 12:50:03.021973 master-0 kubenswrapper[29936]: I1205 12:50:03.021755 29936 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2025-12-06 12:20:41 +0000 UTC, rotation deadline is 2025-12-06 06:40:05.577808421 +0000 UTC Dec 05 12:50:03.021973 master-0 kubenswrapper[29936]: I1205 12:50:03.021864 29936 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 17h50m2.555946581s for next certificate rotation Dec 05 12:50:03.032324 master-0 kubenswrapper[29936]: I1205 12:50:03.022368 29936 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 12:50:03.032324 master-0 kubenswrapper[29936]: I1205 12:50:03.023631 29936 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 12:50:03.032324 master-0 kubenswrapper[29936]: I1205 12:50:03.027703 29936 log.go:25] "Validated CRI v1 runtime API" Dec 05 12:50:03.036848 master-0 kubenswrapper[29936]: I1205 12:50:03.036516 29936 log.go:25] "Validated CRI v1 image API" Dec 05 12:50:03.038268 master-0 kubenswrapper[29936]: I1205 12:50:03.038233 29936 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 05 12:50:03.059289 master-0 kubenswrapper[29936]: I1205 12:50:03.059201 29936 fs.go:135] Filesystem UUIDs: map[4623d87d-4611-48ee-a0ce-68b00f5d84bd:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Dec 05 12:50:03.060406 master-0 kubenswrapper[29936]: I1205 12:50:03.059249 29936 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/010fcb3fd705e5d750eedd1adb06872aa524c08fbc85d2a921261129ee9bc96b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/010fcb3fd705e5d750eedd1adb06872aa524c08fbc85d2a921261129ee9bc96b/userdata/shm major:0 minor:1100 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/02429253d825f84b6e8d3688d755956761f60c5751e2535a15ebb536be6b1f94/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/02429253d825f84b6e8d3688d755956761f60c5751e2535a15ebb536be6b1f94/userdata/shm major:0 minor:112 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/029b733e2c6ad9f0e336ec7c4af189bd8388fe0d1d5f30c3280c2f24f4c1e475/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/029b733e2c6ad9f0e336ec7c4af189bd8388fe0d1d5f30c3280c2f24f4c1e475/userdata/shm major:0 minor:596 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/04a1540e033fc0d53be3a8dfa10cb49b28b11738b911cb185f8d919660d6db47/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/04a1540e033fc0d53be3a8dfa10cb49b28b11738b911cb185f8d919660d6db47/userdata/shm major:0 minor:913 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/04f451fea9668a794e9e554df0005ce70f405943bf1c6d084959d7f333152fc6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/04f451fea9668a794e9e554df0005ce70f405943bf1c6d084959d7f333152fc6/userdata/shm major:0 minor:598 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/065b5ff0754f03af8b21df75fad6ff50fe29b9c92ca5f839b6b57c232043c975/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/065b5ff0754f03af8b21df75fad6ff50fe29b9c92ca5f839b6b57c232043c975/userdata/shm major:0 minor:716 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/07bd9adb3dd2a54b1348564cac3ab912144772686d957ab49d9bf60d68718f5e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/07bd9adb3dd2a54b1348564cac3ab912144772686d957ab49d9bf60d68718f5e/userdata/shm major:0 minor:562 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0b836f01dcb43b6af667ba219b4059e3935a66980e122a92a279a33e963cb964/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0b836f01dcb43b6af667ba219b4059e3935a66980e122a92a279a33e963cb964/userdata/shm major:0 minor:910 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0b9e8ef8efad8c6e16cd6e6a39269d9f5b02a38a45cb5b422afaa90713381fcb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0b9e8ef8efad8c6e16cd6e6a39269d9f5b02a38a45cb5b422afaa90713381fcb/userdata/shm major:0 minor:706 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0f8a1e4d8de6a06f67857b43e08d70d6ce0e19926ff25b49cbea007cf56e4e61/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0f8a1e4d8de6a06f67857b43e08d70d6ce0e19926ff25b49cbea007cf56e4e61/userdata/shm major:0 minor:1103 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1234ab8fb98aae2372aaa8236a21f36a20e417c28feeae32f634a7022c473171/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1234ab8fb98aae2372aaa8236a21f36a20e417c28feeae32f634a7022c473171/userdata/shm major:0 minor:1548 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/12b2377bacbd62ee93e11591af977d559716347304347ca9deca90451df150b7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/12b2377bacbd62ee93e11591af977d559716347304347ca9deca90451df150b7/userdata/shm major:0 minor:976 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/12c707b6a686095bb6b918fa64b447ec88e080a7e32878fed57fd6822470f9a2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/12c707b6a686095bb6b918fa64b447ec88e080a7e32878fed57fd6822470f9a2/userdata/shm major:0 minor:955 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/130205999d123cc10c914ecc3cb22cde267becfbe33db09ccb0559c952bdc40f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/130205999d123cc10c914ecc3cb22cde267becfbe33db09ccb0559c952bdc40f/userdata/shm major:0 minor:1273 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/164d69c4a697b3689889d3ab2e5a66ca6c9ed1089292b441ab9282cdde612925/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/164d69c4a697b3689889d3ab2e5a66ca6c9ed1089292b441ab9282cdde612925/userdata/shm major:0 minor:1399 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/19edfec7b5dad95038c7d84a7af049f95270320317e900ea90d94c12477f0556/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/19edfec7b5dad95038c7d84a7af049f95270320317e900ea90d94c12477f0556/userdata/shm major:0 minor:709 
fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1e7e859b537def1a21239198a62664ddf26773c1c6f156f411606722ed8cb4e6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1e7e859b537def1a21239198a62664ddf26773c1c6f156f411606722ed8cb4e6/userdata/shm major:0 minor:986 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/24b6227b14f227965d3702a28c5ff0f7f572415f72495d988769ab39d10c0d8f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/24b6227b14f227965d3702a28c5ff0f7f572415f72495d988769ab39d10c0d8f/userdata/shm major:0 minor:916 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2a325da0f7b2c285fc4bf3a467e693950dfc8948d49a5740a004f6101e748cc4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2a325da0f7b2c285fc4bf3a467e693950dfc8948d49a5740a004f6101e748cc4/userdata/shm major:0 minor:328 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/38012800baf13255ee676c8bd3688f9cc8eb6dcd0e296ee14ea80782e75670a8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/38012800baf13255ee676c8bd3688f9cc8eb6dcd0e296ee14ea80782e75670a8/userdata/shm major:0 minor:122 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3a9d8373a41ae93e2045d1c0300d43339b0c915de4cad9048741918269853b51/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3a9d8373a41ae93e2045d1c0300d43339b0c915de4cad9048741918269853b51/userdata/shm major:0 minor:344 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3d66257a9a5cc16c308a04623948fb3eceefd2f34694e08267e4f17ec43d3782/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3d66257a9a5cc16c308a04623948fb3eceefd2f34694e08267e4f17ec43d3782/userdata/shm major:0 minor:320 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3fc55735af7e7e6d6e15c1fa34cd05fc0468a74822467cb4ea7df9c2efc6cd2f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3fc55735af7e7e6d6e15c1fa34cd05fc0468a74822467cb4ea7df9c2efc6cd2f/userdata/shm major:0 minor:1267 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/44e741be030df14b7e9e415d32f4095c562d693609b8dc4bd8ec51c21503bbca/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/44e741be030df14b7e9e415d32f4095c562d693609b8dc4bd8ec51c21503bbca/userdata/shm major:0 minor:334 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/46252d0271f63a839c2cf8d137d190a08ca8c85ab8a7cd49fe478dd080504839/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/46252d0271f63a839c2cf8d137d190a08ca8c85ab8a7cd49fe478dd080504839/userdata/shm major:0 minor:1098 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/47731386c0cb9aab3894731b6143775966f36286ae6b54927bb926129b389c33/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/47731386c0cb9aab3894731b6143775966f36286ae6b54927bb926129b389c33/userdata/shm major:0 minor:316 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/49f2f301b501743d7a4254bc3eeb040151fb199e2a4d9ec64ddce3a74ce66f5b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/49f2f301b501743d7a4254bc3eeb040151fb199e2a4d9ec64ddce3a74ce66f5b/userdata/shm major:0 minor:325 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/4ed24c6b6f900a1eeba45b567c2d9336f6c8e081eea3b175ce81e0e583f37f25/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4ed24c6b6f900a1eeba45b567c2d9336f6c8e081eea3b175ce81e0e583f37f25/userdata/shm major:0 minor:981 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5404e1e33c358f139ce43aadf9014fd74254490d058389642b99e6aa71216243/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5404e1e33c358f139ce43aadf9014fd74254490d058389642b99e6aa71216243/userdata/shm major:0 minor:970 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/54d1c55b3ab43714c6f9d30fae64742364176327ec7be4503594ab7c679b2007/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/54d1c55b3ab43714c6f9d30fae64742364176327ec7be4503594ab7c679b2007/userdata/shm major:0 minor:454 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/570d4cae37b4f398ab8be13ab3899c325813f0073ace4d7fbe1d38d0fbd654b9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/570d4cae37b4f398ab8be13ab3899c325813f0073ace4d7fbe1d38d0fbd654b9/userdata/shm major:0 minor:1434 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5a89fdcb31a57b509eb73373840f305ff5d3039dc4adac822b9b40350179af76/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5a89fdcb31a57b509eb73373840f305ff5d3039dc4adac822b9b40350179af76/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/62b006cd51c7d10f8e6f8e36ec2fbd7c2b472a5db5854f2056fdbe13f97f07e2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/62b006cd51c7d10f8e6f8e36ec2fbd7c2b472a5db5854f2056fdbe13f97f07e2/userdata/shm major:0 minor:1004 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6a7ef281a34ccfa6602eba73eaa73316754fbb0bb6c1935a7c44c597fce5d077/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6a7ef281a34ccfa6602eba73eaa73316754fbb0bb6c1935a7c44c597fce5d077/userdata/shm major:0 minor:1301 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6beaecf0540643cd8682361578d468ced3e3fd0c3495c281547ab1933154b6de/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6beaecf0540643cd8682361578d468ced3e3fd0c3495c281547ab1933154b6de/userdata/shm major:0 minor:1104 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6d7e84b5ce96cc743bb3392588c9efdf14f4afe467d9a7be36705ddbb090197e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6d7e84b5ce96cc743bb3392588c9efdf14f4afe467d9a7be36705ddbb090197e/userdata/shm major:0 minor:360 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7306701b7f1e349175a899928ef136fbd77aaa68bd4675a9b0f16eeeda9ca379/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7306701b7f1e349175a899928ef136fbd77aaa68bd4675a9b0f16eeeda9ca379/userdata/shm major:0 minor:1629 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/743ece8bb6e404056a2fb9957949cb0a30330d99bb6dbc633553c08d0fb45759/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/743ece8bb6e404056a2fb9957949cb0a30330d99bb6dbc633553c08d0fb45759/userdata/shm major:0 minor:704 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/77da36c6bf5d09d68dbf2de017a655a5a15b25fda32cba3288a3d8b2cc4b44c0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/77da36c6bf5d09d68dbf2de017a655a5a15b25fda32cba3288a3d8b2cc4b44c0/userdata/shm major:0 minor:376 
fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/78eb0a378ee87ec426723278f27c3f8944db139eff4ee08e81e705d48c517d58/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/78eb0a378ee87ec426723278f27c3f8944db139eff4ee08e81e705d48c517d58/userdata/shm major:0 minor:46 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7fe4976a702070d88ebc0b91a8c147521b2f0d81e1e2131e752211b96529d448/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7fe4976a702070d88ebc0b91a8c147521b2f0d81e1e2131e752211b96529d448/userdata/shm major:0 minor:142 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/86aa525c2c153f5cbd8c5b3603c3c0fdcde107672a7bd7aeacc117267683bb33/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/86aa525c2c153f5cbd8c5b3603c3c0fdcde107672a7bd7aeacc117267683bb33/userdata/shm major:0 minor:1464 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/915ebf41ba29fa9d0d989a762295214f873dc379b870a92fba77c1d033c014f1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/915ebf41ba29fa9d0d989a762295214f873dc379b870a92fba77c1d033c014f1/userdata/shm major:0 minor:113 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/92eddccae7e06f02f48401d5d5f367dae7b9b78b2bbc84b00b68ec03e90321c1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/92eddccae7e06f02f48401d5d5f367dae7b9b78b2bbc84b00b68ec03e90321c1/userdata/shm major:0 minor:717 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/95fb5697edafbf4a316d98f995b9941dad32b61de9fdb2705dcb30f672d4ab5b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/95fb5697edafbf4a316d98f995b9941dad32b61de9fdb2705dcb30f672d4ab5b/userdata/shm major:0 minor:1334 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9a083a2de33da77d47cd60a3708aaf6bb8591ce81eba8d8e42788e2c8c58ecd3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9a083a2de33da77d47cd60a3708aaf6bb8591ce81eba8d8e42788e2c8c58ecd3/userdata/shm major:0 minor:707 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9abf289d98169b2aa959495298e72df522e02a710723a8c85b99355af8b7eae3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9abf289d98169b2aa959495298e72df522e02a710723a8c85b99355af8b7eae3/userdata/shm major:0 minor:632 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9ca3179bcac9021f22c3e7255b372820926d29356fd67cac276625618bd240a6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9ca3179bcac9021f22c3e7255b372820926d29356fd67cac276625618bd240a6/userdata/shm major:0 minor:963 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9cdc542e09a2b9f60d00a132f1101f8d7a3bb737b3bcc4086c2409ef17b05c7e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9cdc542e09a2b9f60d00a132f1101f8d7a3bb737b3bcc4086c2409ef17b05c7e/userdata/shm major:0 minor:715 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9cfdae6ccb167d4a6f250b34ce3b8d4ec56326be1aca0a0b497bcb1caa6ac3cf/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9cfdae6ccb167d4a6f250b34ce3b8d4ec56326be1aca0a0b497bcb1caa6ac3cf/userdata/shm major:0 minor:451 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/9f1e76d4f58fcd22a9b3bb1871e5fda992687c0e5181ed09e4aadbd1b7953465/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9f1e76d4f58fcd22a9b3bb1871e5fda992687c0e5181ed09e4aadbd1b7953465/userdata/shm major:0 minor:546 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9fd6db41eb8dc90e6efffc25bb3c93739722e6824dad0dcb9a786720bc6514c4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9fd6db41eb8dc90e6efffc25bb3c93739722e6824dad0dcb9a786720bc6514c4/userdata/shm major:0 minor:337 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a34af96221abd2b9bf387305f2624222004ffa4b53496a2a4e5584e580bd9733/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a34af96221abd2b9bf387305f2624222004ffa4b53496a2a4e5584e580bd9733/userdata/shm major:0 minor:160 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a484ee5e7b41d00e01ba54d4ad8789422ba018cb058ac26feb10517be87018de/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a484ee5e7b41d00e01ba54d4ad8789422ba018cb058ac26feb10517be87018de/userdata/shm major:0 minor:428 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a8ddc41afaf0c618d55e894f2ce13b792424c9105a66a883a048089812798f25/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a8ddc41afaf0c618d55e894f2ce13b792424c9105a66a883a048089812798f25/userdata/shm major:0 minor:1635 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/abe43915cc1089507c40de3eaceadf732ca7d07e2f0e1b5a070959328db4199f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/abe43915cc1089507c40de3eaceadf732ca7d07e2f0e1b5a070959328db4199f/userdata/shm major:0 minor:980 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ae3644549c6caccb0e5b76cf093dd16f97c66829b7bc2c724be0d4328e24c56e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ae3644549c6caccb0e5b76cf093dd16f97c66829b7bc2c724be0d4328e24c56e/userdata/shm major:0 minor:131 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b0475df4d5336da05f2cdbc3f74e49ad376be174c9b01bb8c74b713bd60e7ac6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b0475df4d5336da05f2cdbc3f74e49ad376be174c9b01bb8c74b713bd60e7ac6/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b36190e4cf6d5a6244899784eca2665872c2f9d60ae3d454ea48fd9aa2aa3bab/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b36190e4cf6d5a6244899784eca2665872c2f9d60ae3d454ea48fd9aa2aa3bab/userdata/shm major:0 minor:327 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b852dfb0ed7374453aa61f11c0df40cc142ce70b6943ce06b264cc249753a13b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b852dfb0ed7374453aa61f11c0df40cc142ce70b6943ce06b264cc249753a13b/userdata/shm major:0 minor:149 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bba3aa271baddd92ed5881d6af79fb82b3a45fce07083a5cd051cbeeb1a01428/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bba3aa271baddd92ed5881d6af79fb82b3a45fce07083a5cd051cbeeb1a01428/userdata/shm major:0 minor:321 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bd884dd8fbf0cb13a01d3369dc09dbcaf952157e210620f5c83187eab601232c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bd884dd8fbf0cb13a01d3369dc09dbcaf952157e210620f5c83187eab601232c/userdata/shm major:0 minor:961 
fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c13f876ed14f7005d250ab3203aedc5ac3d9bddbbff7570300b321a40f59bd5c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c13f876ed14f7005d250ab3203aedc5ac3d9bddbbff7570300b321a40f59bd5c/userdata/shm major:0 minor:1222 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c44264ca51ad61ed3b05ffa4c975691fd7debf64dbafd9a640308d225a077e0b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c44264ca51ad61ed3b05ffa4c975691fd7debf64dbafd9a640308d225a077e0b/userdata/shm major:0 minor:713 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c5997a9e57f36847e6cb187afed936a398d9d89f0a3c5fbdaa0cdcf0b16bbffd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c5997a9e57f36847e6cb187afed936a398d9d89f0a3c5fbdaa0cdcf0b16bbffd/userdata/shm major:0 minor:1262 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c9238078b14a694c40b63db5c3f18b28faafcb8ecbd14ef862a7acac34f2ffa6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c9238078b14a694c40b63db5c3f18b28faafcb8ecbd14ef862a7acac34f2ffa6/userdata/shm major:0 minor:350 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ce6d6f50d1ea16153d0bcd0e4641d90ef903c01636f33ef60f26b9dcbbaecad8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ce6d6f50d1ea16153d0bcd0e4641d90ef903c01636f33ef60f26b9dcbbaecad8/userdata/shm major:0 minor:1101 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/dac2262b7105102ce37a8db95766fbd5753d50bed12fb86441b8247f4653fc04/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/dac2262b7105102ce37a8db95766fbd5753d50bed12fb86441b8247f4653fc04/userdata/shm major:0 minor:956 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/df3031001bb8ce6924d98db7ed12f84815ddd5de33ab7d2a19bcefd503d510dd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/df3031001bb8ce6924d98db7ed12f84815ddd5de33ab7d2a19bcefd503d510dd/userdata/shm major:0 minor:311 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e67f95f822c645d6f2dd2098e7e055983609569dd0acfdc0e0bea037bf8d6c03/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e67f95f822c645d6f2dd2098e7e055983609569dd0acfdc0e0bea037bf8d6c03/userdata/shm major:0 minor:719 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ecdffd0c2fc8d747077d4ca5dcb541da82682f6d035455ac42566e8514bfadc3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ecdffd0c2fc8d747077d4ca5dcb541da82682f6d035455ac42566e8514bfadc3/userdata/shm major:0 minor:332 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ed1704f4a6522faa5c439c3ffd85686d7bb1d3595d2d60cd653dbed071367134/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ed1704f4a6522faa5c439c3ffd85686d7bb1d3595d2d60cd653dbed071367134/userdata/shm major:0 minor:164 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ed538f41551e0e7b372ee4dcc843f84e56fe8d6677fe847816efda02bfd61218/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ed538f41551e0e7b372ee4dcc843f84e56fe8d6677fe847816efda02bfd61218/userdata/shm major:0 minor:945 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/ee402b16b01951f980b833d7daf2d0304b91018363304b2cfe0e79874029cf9d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ee402b16b01951f980b833d7daf2d0304b91018363304b2cfe0e79874029cf9d/userdata/shm major:0 minor:187 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fa27a4561538d102c835ff1b231e3510011f63fe691f54410ca3547822dc8742/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fa27a4561538d102c835ff1b231e3510011f63fe691f54410ca3547822dc8742/userdata/shm major:0 minor:1466 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fafe50d6690c2fbac658b4db9e7e7d0a871a9941f8ee2fd5f2fce340df7fd5f6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fafe50d6690c2fbac658b4db9e7e7d0a871a9941f8ee2fd5f2fce340df7fd5f6/userdata/shm major:0 minor:1470 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fb396b2885c697fc62cb75681d56dacee81e32f235fe9f427b2f065f721f39f2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fb396b2885c697fc62cb75681d56dacee81e32f235fe9f427b2f065f721f39f2/userdata/shm major:0 minor:1102 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0dda6d9b-cb3a-413a-85af-ef08f15ea42e/volumes/kubernetes.io~projected/kube-api-access-62nqj:{mountpoint:/var/lib/kubelet/pods/0dda6d9b-cb3a-413a-85af-ef08f15ea42e/volumes/kubernetes.io~projected/kube-api-access-62nqj major:0 minor:300 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0dda6d9b-cb3a-413a-85af-ef08f15ea42e/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/0dda6d9b-cb3a-413a-85af-ef08f15ea42e/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:693 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1478a21e-b6ac-46fb-ad01-805ac71f0a79/volumes/kubernetes.io~projected/kube-api-access-fz4q6:{mountpoint:/var/lib/kubelet/pods/1478a21e-b6ac-46fb-ad01-805ac71f0a79/volumes/kubernetes.io~projected/kube-api-access-fz4q6 major:0 minor:1295 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1478a21e-b6ac-46fb-ad01-805ac71f0a79/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/1478a21e-b6ac-46fb-ad01-805ac71f0a79/volumes/kubernetes.io~secret/proxy-tls major:0 minor:1291 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/153fec1f-a10b-4c6c-a997-60fa80c13a86/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/153fec1f-a10b-4c6c-a997-60fa80c13a86/volumes/kubernetes.io~projected/ca-certs major:0 minor:561 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/153fec1f-a10b-4c6c-a997-60fa80c13a86/volumes/kubernetes.io~projected/kube-api-access-dr2r9:{mountpoint:/var/lib/kubelet/pods/153fec1f-a10b-4c6c-a997-60fa80c13a86/volumes/kubernetes.io~projected/kube-api-access-dr2r9 major:0 minor:544 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1871a9d6-6369-4d08-816f-9c6310b61ddf/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/1871a9d6-6369-4d08-816f-9c6310b61ddf/volumes/kubernetes.io~projected/kube-api-access major:0 minor:324 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1871a9d6-6369-4d08-816f-9c6310b61ddf/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/1871a9d6-6369-4d08-816f-9c6310b61ddf/volumes/kubernetes.io~secret/serving-cert major:0 minor:305 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/1e6babfe-724a-4eab-bb3b-bc318bf57b70/volumes/kubernetes.io~projected/kube-api-access-c2gd8:{mountpoint:/var/lib/kubelet/pods/1e6babfe-724a-4eab-bb3b-bc318bf57b70/volumes/kubernetes.io~projected/kube-api-access-c2gd8 major:0 minor:318 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1e6babfe-724a-4eab-bb3b-bc318bf57b70/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/1e6babfe-724a-4eab-bb3b-bc318bf57b70/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:694 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1ee7a76b-cf1d-4513-b314-5aa314da818d/volumes/kubernetes.io~projected/kube-api-access-lkdtr:{mountpoint:/var/lib/kubelet/pods/1ee7a76b-cf1d-4513-b314-5aa314da818d/volumes/kubernetes.io~projected/kube-api-access-lkdtr major:0 minor:943 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1ee7a76b-cf1d-4513-b314-5aa314da818d/volumes/kubernetes.io~secret/machine-api-operator-tls:{mountpoint:/var/lib/kubelet/pods/1ee7a76b-cf1d-4513-b314-5aa314da818d/volumes/kubernetes.io~secret/machine-api-operator-tls major:0 minor:934 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/20a72c8b-0f12-446b-8a42-53d98864c8f8/volumes/kubernetes.io~projected/kube-api-access-6dwm5:{mountpoint:/var/lib/kubelet/pods/20a72c8b-0f12-446b-8a42-53d98864c8f8/volumes/kubernetes.io~projected/kube-api-access-6dwm5 major:0 minor:1220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/20a72c8b-0f12-446b-8a42-53d98864c8f8/volumes/kubernetes.io~secret/default-certificate:{mountpoint:/var/lib/kubelet/pods/20a72c8b-0f12-446b-8a42-53d98864c8f8/volumes/kubernetes.io~secret/default-certificate major:0 minor:1211 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/20a72c8b-0f12-446b-8a42-53d98864c8f8/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/20a72c8b-0f12-446b-8a42-53d98864c8f8/volumes/kubernetes.io~secret/metrics-certs major:0 minor:1213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/20a72c8b-0f12-446b-8a42-53d98864c8f8/volumes/kubernetes.io~secret/stats-auth:{mountpoint:/var/lib/kubelet/pods/20a72c8b-0f12-446b-8a42-53d98864c8f8/volumes/kubernetes.io~secret/stats-auth major:0 minor:1219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/365bf663-fd5b-44df-a327-0438995c015d/volumes/kubernetes.io~projected/kube-api-access-lqjgb:{mountpoint:/var/lib/kubelet/pods/365bf663-fd5b-44df-a327-0438995c015d/volumes/kubernetes.io~projected/kube-api-access-lqjgb major:0 minor:957 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/365bf663-fd5b-44df-a327-0438995c015d/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/365bf663-fd5b-44df-a327-0438995c015d/volumes/kubernetes.io~secret/proxy-tls major:0 minor:954 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/38941513-e968-45f1-9cb2-b63d40338f36/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/38941513-e968-45f1-9cb2-b63d40338f36/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:341 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/38941513-e968-45f1-9cb2-b63d40338f36/volumes/kubernetes.io~projected/kube-api-access-t5hdg:{mountpoint:/var/lib/kubelet/pods/38941513-e968-45f1-9cb2-b63d40338f36/volumes/kubernetes.io~projected/kube-api-access-t5hdg major:0 minor:309 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/38941513-e968-45f1-9cb2-b63d40338f36/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/38941513-e968-45f1-9cb2-b63d40338f36/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:701 
fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3b741029-0eb5-409b-b7f1-95e8385dc400/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/3b741029-0eb5-409b-b7f1-95e8385dc400/volumes/kubernetes.io~projected/ca-certs major:0 minor:492 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3b741029-0eb5-409b-b7f1-95e8385dc400/volumes/kubernetes.io~projected/kube-api-access-5g7mj:{mountpoint:/var/lib/kubelet/pods/3b741029-0eb5-409b-b7f1-95e8385dc400/volumes/kubernetes.io~projected/kube-api-access-5g7mj major:0 minor:464 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3b741029-0eb5-409b-b7f1-95e8385dc400/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/3b741029-0eb5-409b-b7f1-95e8385dc400/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:595 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3d96c85a-fc88-46af-83d5-6c71ec6e2c23/volumes/kubernetes.io~projected/kube-api-access-ss5kh:{mountpoint:/var/lib/kubelet/pods/3d96c85a-fc88-46af-83d5-6c71ec6e2c23/volumes/kubernetes.io~projected/kube-api-access-ss5kh major:0 minor:907 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3d96c85a-fc88-46af-83d5-6c71ec6e2c23/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/3d96c85a-fc88-46af-83d5-6c71ec6e2c23/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:794 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volumes/kubernetes.io~projected/kube-api-access-dmq98:{mountpoint:/var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volumes/kubernetes.io~projected/kube-api-access-dmq98 major:0 minor:159 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:158 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/480c1f6e-0e13-49f9-bc4e-07350842f16c/volumes/kubernetes.io~projected/kube-api-access-48ns8:{mountpoint:/var/lib/kubelet/pods/480c1f6e-0e13-49f9-bc4e-07350842f16c/volumes/kubernetes.io~projected/kube-api-access-48ns8 major:0 minor:453 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/49760d62-02e5-4882-b47f-663102b04946/volumes/kubernetes.io~projected/kube-api-access-26x2z:{mountpoint:/var/lib/kubelet/pods/49760d62-02e5-4882-b47f-663102b04946/volumes/kubernetes.io~projected/kube-api-access-26x2z major:0 minor:335 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/volumes/kubernetes.io~projected/kube-api-access-xtjln:{mountpoint:/var/lib/kubelet/pods/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/volumes/kubernetes.io~projected/kube-api-access-xtjln major:0 minor:310 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/volumes/kubernetes.io~secret/serving-cert major:0 minor:308 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/4d215811-6210-4ec2-8356-f1533dc43f65/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/4d215811-6210-4ec2-8356-f1533dc43f65/volumes/kubernetes.io~projected/kube-api-access major:0 minor:1634 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2/volumes/kubernetes.io~projected/kube-api-access-4bjs8:{mountpoint:/var/lib/kubelet/pods/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2/volumes/kubernetes.io~projected/kube-api-access-4bjs8 major:0 minor:1458 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config major:0 minor:1456 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2/volumes/kubernetes.io~secret/kube-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2/volumes/kubernetes.io~secret/kube-state-metrics-tls major:0 minor:1462 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/531b8927-92db-4e9d-9a0a-12ff948cdaad/volumes/kubernetes.io~projected/kube-api-access-xqblj:{mountpoint:/var/lib/kubelet/pods/531b8927-92db-4e9d-9a0a-12ff948cdaad/volumes/kubernetes.io~projected/kube-api-access-xqblj major:0 minor:953 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/531b8927-92db-4e9d-9a0a-12ff948cdaad/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls:{mountpoint:/var/lib/kubelet/pods/531b8927-92db-4e9d-9a0a-12ff948cdaad/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls major:0 minor:942 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58187662-b502-4d90-95ce-2aa91a81d256/volumes/kubernetes.io~projected/kube-api-access-ps4ws:{mountpoint:/var/lib/kubelet/pods/58187662-b502-4d90-95ce-2aa91a81d256/volumes/kubernetes.io~projected/kube-api-access-ps4ws major:0 minor:312 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58187662-b502-4d90-95ce-2aa91a81d256/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/58187662-b502-4d90-95ce-2aa91a81d256/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:700 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/594aaded-5615-4bed-87ee-6173059a73be/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/594aaded-5615-4bed-87ee-6173059a73be/volumes/kubernetes.io~projected/kube-api-access major:0 minor:315 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/594aaded-5615-4bed-87ee-6173059a73be/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/594aaded-5615-4bed-87ee-6173059a73be/volumes/kubernetes.io~secret/serving-cert major:0 minor:306 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76/volumes/kubernetes.io~projected/kube-api-access-nqvfm:{mountpoint:/var/lib/kubelet/pods/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76/volumes/kubernetes.io~projected/kube-api-access-nqvfm major:0 minor:1459 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config major:0 minor:1455 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76/volumes/kubernetes.io~secret/openshift-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76/volumes/kubernetes.io~secret/openshift-state-metrics-tls major:0 minor:1461 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5efad170-c154-42ec-a7c0-b36a98d2bfcc/volumes/kubernetes.io~projected/kube-api-access-996h9:{mountpoint:/var/lib/kubelet/pods/5efad170-c154-42ec-a7c0-b36a98d2bfcc/volumes/kubernetes.io~projected/kube-api-access-996h9 major:0 minor:76 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5efad170-c154-42ec-a7c0-b36a98d2bfcc/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/5efad170-c154-42ec-a7c0-b36a98d2bfcc/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f0c6889-0739-48a3-99cd-6db9d1f83242/volumes/kubernetes.io~projected/kube-api-access-p5p5d:{mountpoint:/var/lib/kubelet/pods/5f0c6889-0739-48a3-99cd-6db9d1f83242/volumes/kubernetes.io~projected/kube-api-access-p5p5d major:0 minor:329 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f0c6889-0739-48a3-99cd-6db9d1f83242/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/5f0c6889-0739-48a3-99cd-6db9d1f83242/volumes/kubernetes.io~secret/metrics-tls major:0 minor:695 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/60327040-f782-4cda-a32d-52a4f183073c/volumes/kubernetes.io~projected/kube-api-access-zp957:{mountpoint:/var/lib/kubelet/pods/60327040-f782-4cda-a32d-52a4f183073c/volumes/kubernetes.io~projected/kube-api-access-zp957 major:0 minor:1460 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/60327040-f782-4cda-a32d-52a4f183073c/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/60327040-f782-4cda-a32d-52a4f183073c/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config major:0 minor:1457 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/60327040-f782-4cda-a32d-52a4f183073c/volumes/kubernetes.io~secret/node-exporter-tls:{mountpoint:/var/lib/kubelet/pods/60327040-f782-4cda-a32d-52a4f183073c/volumes/kubernetes.io~secret/node-exporter-tls major:0 minor:1463 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/665c4362-e2e5-4f96-92c0-1746c63c7422/volumes/kubernetes.io~projected/kube-api-access-lckv7:{mountpoint:/var/lib/kubelet/pods/665c4362-e2e5-4f96-92c0-1746c63c7422/volumes/kubernetes.io~projected/kube-api-access-lckv7 major:0 minor:796 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/665c4362-e2e5-4f96-92c0-1746c63c7422/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/665c4362-e2e5-4f96-92c0-1746c63c7422/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert major:0 minor:705 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/708bf629-9949-4b79-a88a-c73ba033475b/volumes/kubernetes.io~projected/kube-api-access-6vx2z:{mountpoint:/var/lib/kubelet/pods/708bf629-9949-4b79-a88a-c73ba033475b/volumes/kubernetes.io~projected/kube-api-access-6vx2z major:0 minor:148 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/volumes/kubernetes.io~projected/kube-api-access-dtvzs:{mountpoint:/var/lib/kubelet/pods/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/volumes/kubernetes.io~projected/kube-api-access-dtvzs major:0 minor:295 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/volumes/kubernetes.io~secret/serving-cert major:0 minor:289 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f/volumes/kubernetes.io~projected/kube-api-access-b6wsq:{mountpoint:/var/lib/kubelet/pods/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f/volumes/kubernetes.io~projected/kube-api-access-b6wsq major:0 minor:906 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f/volumes/kubernetes.io~secret/samples-operator-tls:{mountpoint:/var/lib/kubelet/pods/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f/volumes/kubernetes.io~secret/samples-operator-tls major:0 minor:795 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7d0792bf-e2da-4ee7-91fe-032299cea42f/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/7d0792bf-e2da-4ee7-91fe-032299cea42f/volumes/kubernetes.io~projected/kube-api-access major:0 minor:993 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7d0792bf-e2da-4ee7-91fe-032299cea42f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/7d0792bf-e2da-4ee7-91fe-032299cea42f/volumes/kubernetes.io~secret/serving-cert major:0 minor:988 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/807d9093-aa67-4840-b5be-7f3abcc1beed/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/807d9093-aa67-4840-b5be-7f3abcc1beed/volumes/kubernetes.io~projected/kube-api-access major:0 minor:297 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/807d9093-aa67-4840-b5be-7f3abcc1beed/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/807d9093-aa67-4840-b5be-7f3abcc1beed/volumes/kubernetes.io~secret/serving-cert major:0 minor:288 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8defe125-1529-4091-adff-e9d17a2b298f/volumes/kubernetes.io~projected/kube-api-access-jpxqg:{mountpoint:/var/lib/kubelet/pods/8defe125-1529-4091-adff-e9d17a2b298f/volumes/kubernetes.io~projected/kube-api-access-jpxqg major:0 minor:1090 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/909ed395-8ad3-4350-95e3-b4b19c682f92/volumes/kubernetes.io~secret/tls-certificates:{mountpoint:/var/lib/kubelet/pods/909ed395-8ad3-4350-95e3-b4b19c682f92/volumes/kubernetes.io~secret/tls-certificates major:0 minor:1212 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/95d8fb27-8b2b-4749-add3-9e9b16edb693/volumes/kubernetes.io~projected/kube-api-access-fb42t:{mountpoint:/var/lib/kubelet/pods/95d8fb27-8b2b-4749-add3-9e9b16edb693/volumes/kubernetes.io~projected/kube-api-access-fb42t major:0 minor:1096 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/95d8fb27-8b2b-4749-add3-9e9b16edb693/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/95d8fb27-8b2b-4749-add3-9e9b16edb693/volumes/kubernetes.io~secret/proxy-tls major:0 minor:1095 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/99996137-2621-458b-980d-584b3640d4ad/volumes/kubernetes.io~projected/kube-api-access-c69rc:{mountpoint:/var/lib/kubelet/pods/99996137-2621-458b-980d-584b3640d4ad/volumes/kubernetes.io~projected/kube-api-access-c69rc major:0 minor:1433 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/99996137-2621-458b-980d-584b3640d4ad/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/99996137-2621-458b-980d-584b3640d4ad/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config major:0 minor:1431 fsType:tmpfs 
blockSize:0} /var/lib/kubelet/pods/99996137-2621-458b-980d-584b3640d4ad/volumes/kubernetes.io~secret/prometheus-operator-tls:{mountpoint:/var/lib/kubelet/pods/99996137-2621-458b-980d-584b3640d4ad/volumes/kubernetes.io~secret/prometheus-operator-tls major:0 minor:1432 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9c31f89c-b01b-4853-a901-bccc25441a46/volumes/kubernetes.io~projected/kube-api-access-czcmr:{mountpoint:/var/lib/kubelet/pods/9c31f89c-b01b-4853-a901-bccc25441a46/volumes/kubernetes.io~projected/kube-api-access-czcmr major:0 minor:1097 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a14df948-1ec4-4785-ad33-28d1e7063959/volumes/kubernetes.io~projected/kube-api-access-2g7n7:{mountpoint:/var/lib/kubelet/pods/a14df948-1ec4-4785-ad33-28d1e7063959/volumes/kubernetes.io~projected/kube-api-access-2g7n7 major:0 minor:915 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a14df948-1ec4-4785-ad33-28d1e7063959/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a14df948-1ec4-4785-ad33-28d1e7063959/volumes/kubernetes.io~secret/serving-cert major:0 minor:912 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a280c582-685e-47ac-bf6b-248aa0c129a9/volumes/kubernetes.io~projected/kube-api-access-xkqq7:{mountpoint:/var/lib/kubelet/pods/a280c582-685e-47ac-bf6b-248aa0c129a9/volumes/kubernetes.io~projected/kube-api-access-xkqq7 major:0 minor:951 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a280c582-685e-47ac-bf6b-248aa0c129a9/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/a280c582-685e-47ac-bf6b-248aa0c129a9/volumes/kubernetes.io~secret/cert major:0 minor:944 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a280c582-685e-47ac-bf6b-248aa0c129a9/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls:{mountpoint:/var/lib/kubelet/pods/a280c582-685e-47ac-bf6b-248aa0c129a9/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls major:0 minor:937 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a2acba71-b9dc-4b85-be35-c995b8be2f19/volumes/kubernetes.io~projected/kube-api-access-nml2g:{mountpoint:/var/lib/kubelet/pods/a2acba71-b9dc-4b85-be35-c995b8be2f19/volumes/kubernetes.io~projected/kube-api-access-nml2g major:0 minor:299 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a2acba71-b9dc-4b85-be35-c995b8be2f19/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/a2acba71-b9dc-4b85-be35-c995b8be2f19/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:697 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a2acba71-b9dc-4b85-be35-c995b8be2f19/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/a2acba71-b9dc-4b85-be35-c995b8be2f19/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:702 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a45f340c-0eca-4460-8961-4ca360467eeb/volumes/kubernetes.io~projected/kube-api-access-r7ftf:{mountpoint:/var/lib/kubelet/pods/a45f340c-0eca-4460-8961-4ca360467eeb/volumes/kubernetes.io~projected/kube-api-access-r7ftf major:0 minor:929 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a45f340c-0eca-4460-8961-4ca360467eeb/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a45f340c-0eca-4460-8961-4ca360467eeb/volumes/kubernetes.io~secret/serving-cert major:0 minor:925 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:296 
fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/volumes/kubernetes.io~projected/kube-api-access-fxxw7:{mountpoint:/var/lib/kubelet/pods/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/volumes/kubernetes.io~projected/kube-api-access-fxxw7 major:0 minor:294 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/volumes/kubernetes.io~secret/metrics-tls major:0 minor:692 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a5338041-f213-46ef-9d81-248567ba958d/volumes/kubernetes.io~projected/kube-api-access-bnwdh:{mountpoint:/var/lib/kubelet/pods/a5338041-f213-46ef-9d81-248567ba958d/volumes/kubernetes.io~projected/kube-api-access-bnwdh major:0 minor:1547 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a5338041-f213-46ef-9d81-248567ba958d/volumes/kubernetes.io~secret/client-ca-bundle:{mountpoint:/var/lib/kubelet/pods/a5338041-f213-46ef-9d81-248567ba958d/volumes/kubernetes.io~secret/client-ca-bundle major:0 minor:1544 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a5338041-f213-46ef-9d81-248567ba958d/volumes/kubernetes.io~secret/secret-metrics-client-certs:{mountpoint:/var/lib/kubelet/pods/a5338041-f213-46ef-9d81-248567ba958d/volumes/kubernetes.io~secret/secret-metrics-client-certs major:0 minor:1545 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a5338041-f213-46ef-9d81-248567ba958d/volumes/kubernetes.io~secret/secret-metrics-server-tls:{mountpoint:/var/lib/kubelet/pods/a5338041-f213-46ef-9d81-248567ba958d/volumes/kubernetes.io~secret/secret-metrics-server-tls major:0 minor:1546 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a757f807-e1bf-4f1e-9787-6b4acc8d09cf/volumes/kubernetes.io~projected/kube-api-access-9z8h9:{mountpoint:/var/lib/kubelet/pods/a757f807-e1bf-4f1e-9787-6b4acc8d09cf/volumes/kubernetes.io~projected/kube-api-access-9z8h9 major:0 minor:157 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a757f807-e1bf-4f1e-9787-6b4acc8d09cf/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/a757f807-e1bf-4f1e-9787-6b4acc8d09cf/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:156 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b13885ef-d2b5-4591-825d-446cf8729bc1/volumes/kubernetes.io~projected/kube-api-access-xmjkp:{mountpoint:/var/lib/kubelet/pods/b13885ef-d2b5-4591-825d-446cf8729bc1/volumes/kubernetes.io~projected/kube-api-access-xmjkp major:0 minor:1091 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b13885ef-d2b5-4591-825d-446cf8729bc1/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/b13885ef-d2b5-4591-825d-446cf8729bc1/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:1094 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b13885ef-d2b5-4591-825d-446cf8729bc1/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/b13885ef-d2b5-4591-825d-446cf8729bc1/volumes/kubernetes.io~secret/webhook-cert major:0 minor:1093 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b74e0607-6ed0-4119-8870-895b7d336830/volumes/kubernetes.io~projected/kube-api-access-72wst:{mountpoint:/var/lib/kubelet/pods/b74e0607-6ed0-4119-8870-895b7d336830/volumes/kubernetes.io~projected/kube-api-access-72wst major:0 minor:1089 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b8233dad-bd19-4842-a4d5-cfa84f1feb83/volumes/kubernetes.io~projected/kube-api-access-mvbfq:{mountpoint:/var/lib/kubelet/pods/b8233dad-bd19-4842-a4d5-cfa84f1feb83/volumes/kubernetes.io~projected/kube-api-access-mvbfq major:0 minor:185 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b8233dad-bd19-4842-a4d5-cfa84f1feb83/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/b8233dad-bd19-4842-a4d5-cfa84f1feb83/volumes/kubernetes.io~secret/webhook-cert major:0 minor:186 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/volumes/kubernetes.io~projected/kube-api-access-hfl8f:{mountpoint:/var/lib/kubelet/pods/b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/volumes/kubernetes.io~projected/kube-api-access-hfl8f major:0 minor:450 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ba095394-1873-4793-969d-3be979fa0771/volumes/kubernetes.io~projected/kube-api-access-55qpg:{mountpoint:/var/lib/kubelet/pods/ba095394-1873-4793-969d-3be979fa0771/volumes/kubernetes.io~projected/kube-api-access-55qpg major:0 minor:314 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ba095394-1873-4793-969d-3be979fa0771/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ba095394-1873-4793-969d-3be979fa0771/volumes/kubernetes.io~secret/serving-cert major:0 minor:303 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bc18a83a-998e-458e-87f0-d5368da52e1b/volumes/kubernetes.io~projected/kube-api-access-bmjn7:{mountpoint:/var/lib/kubelet/pods/bc18a83a-998e-458e-87f0-d5368da52e1b/volumes/kubernetes.io~projected/kube-api-access-bmjn7 major:0 minor:798 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c39d2089-d3bf-4556-b6ef-c362a08c21a2/volumes/kubernetes.io~projected/kube-api-access-mr9jd:{mountpoint:/var/lib/kubelet/pods/c39d2089-d3bf-4556-b6ef-c362a08c21a2/volumes/kubernetes.io~projected/kube-api-access-mr9jd major:0 minor:98 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c39d2089-d3bf-4556-b6ef-c362a08c21a2/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/c39d2089-d3bf-4556-b6ef-c362a08c21a2/volumes/kubernetes.io~secret/serving-cert major:0 minor:70 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6/volumes/kubernetes.io~projected/kube-api-access-mlnqb:{mountpoint:/var/lib/kubelet/pods/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6/volumes/kubernetes.io~projected/kube-api-access-mlnqb major:0 minor:343 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ce3d73c1-f4bd-4c91-936a-086dfa5e3460/volumes/kubernetes.io~projected/kube-api-access-ph9w6:{mountpoint:/var/lib/kubelet/pods/ce3d73c1-f4bd-4c91-936a-086dfa5e3460/volumes/kubernetes.io~projected/kube-api-access-ph9w6 major:0 minor:302 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ce3d73c1-f4bd-4c91-936a-086dfa5e3460/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/ce3d73c1-f4bd-4c91-936a-086dfa5e3460/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:292 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ce9e2a6b-8ce7-477c-8bc7-24033243eabe/volumes/kubernetes.io~projected/kube-api-access-422c9:{mountpoint:/var/lib/kubelet/pods/ce9e2a6b-8ce7-477c-8bc7-24033243eabe/volumes/kubernetes.io~projected/kube-api-access-422c9 major:0 minor:903 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ce9e2a6b-8ce7-477c-8bc7-24033243eabe/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/ce9e2a6b-8ce7-477c-8bc7-24033243eabe/volumes/kubernetes.io~secret/metrics-tls major:0 
minor:902 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cf8247a1-703a-46b3-9a33-25a73b27ab99/volumes/kubernetes.io~projected/kube-api-access-fqdxl:{mountpoint:/var/lib/kubelet/pods/cf8247a1-703a-46b3-9a33-25a73b27ab99/volumes/kubernetes.io~projected/kube-api-access-fqdxl major:0 minor:531 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cf8247a1-703a-46b3-9a33-25a73b27ab99/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/cf8247a1-703a-46b3-9a33-25a73b27ab99/volumes/kubernetes.io~secret/signing-key major:0 minor:530 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d3e283fe-a474-4f83-ad66-62971945060a/volumes/kubernetes.io~projected/kube-api-access-pjbwh:{mountpoint:/var/lib/kubelet/pods/d3e283fe-a474-4f83-ad66-62971945060a/volumes/kubernetes.io~projected/kube-api-access-pjbwh major:0 minor:1613 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d3e283fe-a474-4f83-ad66-62971945060a/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/d3e283fe-a474-4f83-ad66-62971945060a/volumes/kubernetes.io~secret/webhook-certs major:0 minor:1588 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d53a4886-db25-43a1-825a-66a9a9a58590/volumes/kubernetes.io~projected/kube-api-access-2tngh:{mountpoint:/var/lib/kubelet/pods/d53a4886-db25-43a1-825a-66a9a9a58590/volumes/kubernetes.io~projected/kube-api-access-2tngh major:0 minor:293 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d53a4886-db25-43a1-825a-66a9a9a58590/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/d53a4886-db25-43a1-825a-66a9a9a58590/volumes/kubernetes.io~secret/serving-cert major:0 minor:291 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d72b2b71-27b2-4aff-bf69-7054a9556318/volumes/kubernetes.io~projected/kube-api-access-wjp62:{mountpoint:/var/lib/kubelet/pods/d72b2b71-27b2-4aff-bf69-7054a9556318/volumes/kubernetes.io~projected/kube-api-access-wjp62 major:0 minor:626 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d72b2b71-27b2-4aff-bf69-7054a9556318/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/d72b2b71-27b2-4aff-bf69-7054a9556318/volumes/kubernetes.io~secret/encryption-config major:0 minor:623 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d72b2b71-27b2-4aff-bf69-7054a9556318/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/d72b2b71-27b2-4aff-bf69-7054a9556318/volumes/kubernetes.io~secret/etcd-client major:0 minor:621 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d72b2b71-27b2-4aff-bf69-7054a9556318/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/d72b2b71-27b2-4aff-bf69-7054a9556318/volumes/kubernetes.io~secret/serving-cert major:0 minor:622 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/db27bee9-3d33-4c4a-b38b-72f7cec77c7a/volumes/kubernetes.io~projected/kube-api-access-2nbxt:{mountpoint:/var/lib/kubelet/pods/db27bee9-3d33-4c4a-b38b-72f7cec77c7a/volumes/kubernetes.io~projected/kube-api-access-2nbxt major:0 minor:1265 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/db27bee9-3d33-4c4a-b38b-72f7cec77c7a/volumes/kubernetes.io~secret/machine-approver-tls:{mountpoint:/var/lib/kubelet/pods/db27bee9-3d33-4c4a-b38b-72f7cec77c7a/volumes/kubernetes.io~secret/machine-approver-tls major:0 minor:1263 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/db2e54b6-4879-40f4-9359-a8b0c31e76c2/volumes/kubernetes.io~projected/kube-api-access-nwz29:{mountpoint:/var/lib/kubelet/pods/db2e54b6-4879-40f4-9359-a8b0c31e76c2/volumes/kubernetes.io~projected/kube-api-access-nwz29 major:0 minor:947 fsType:tmpfs 
blockSize:0} /var/lib/kubelet/pods/db2e54b6-4879-40f4-9359-a8b0c31e76c2/volumes/kubernetes.io~secret/profile-collector-cert:{mountpoint:/var/lib/kubelet/pods/db2e54b6-4879-40f4-9359-a8b0c31e76c2/volumes/kubernetes.io~secret/profile-collector-cert major:0 minor:936 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/db2e54b6-4879-40f4-9359-a8b0c31e76c2/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/db2e54b6-4879-40f4-9359-a8b0c31e76c2/volumes/kubernetes.io~secret/srv-cert major:0 minor:940 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dbe144b5-3b78-4946-bbf9-b825b0e47b07/volumes/kubernetes.io~projected/kube-api-access-mbg7w:{mountpoint:/var/lib/kubelet/pods/dbe144b5-3b78-4946-bbf9-b825b0e47b07/volumes/kubernetes.io~projected/kube-api-access-mbg7w major:0 minor:1321 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dbe144b5-3b78-4946-bbf9-b825b0e47b07/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls:{mountpoint:/var/lib/kubelet/pods/dbe144b5-3b78-4946-bbf9-b825b0e47b07/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls major:0 minor:1316 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dc5db54b-094f-4c36-a0ad-042e9fc2b61d/volumes/kubernetes.io~projected/kube-api-access-tjkjz:{mountpoint:/var/lib/kubelet/pods/dc5db54b-094f-4c36-a0ad-042e9fc2b61d/volumes/kubernetes.io~projected/kube-api-access-tjkjz major:0 minor:1392 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dc5db54b-094f-4c36-a0ad-042e9fc2b61d/volumes/kubernetes.io~secret/certs:{mountpoint:/var/lib/kubelet/pods/dc5db54b-094f-4c36-a0ad-042e9fc2b61d/volumes/kubernetes.io~secret/certs major:0 minor:1391 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dc5db54b-094f-4c36-a0ad-042e9fc2b61d/volumes/kubernetes.io~secret/node-bootstrap-token:{mountpoint:/var/lib/kubelet/pods/dc5db54b-094f-4c36-a0ad-042e9fc2b61d/volumes/kubernetes.io~secret/node-bootstrap-token major:0 minor:1390 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e2e2d968-9946-4711-aaf0-3e3a03bff415/volumes/kubernetes.io~projected/kube-api-access-pxwwh:{mountpoint:/var/lib/kubelet/pods/e2e2d968-9946-4711-aaf0-3e3a03bff415/volumes/kubernetes.io~projected/kube-api-access-pxwwh major:0 minor:1221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e5dfcb1e-1231-4f07-8c21-748965718729/volumes/kubernetes.io~projected/kube-api-access-pb46q:{mountpoint:/var/lib/kubelet/pods/e5dfcb1e-1231-4f07-8c21-748965718729/volumes/kubernetes.io~projected/kube-api-access-pb46q major:0 minor:922 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e5dfcb1e-1231-4f07-8c21-748965718729/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/e5dfcb1e-1231-4f07-8c21-748965718729/volumes/kubernetes.io~secret/cert major:0 minor:77 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e943438b-1de8-435c-8a19-accd6a6292a4/volumes/kubernetes.io~projected/kube-api-access-lfknz:{mountpoint:/var/lib/kubelet/pods/e943438b-1de8-435c-8a19-accd6a6292a4/volumes/kubernetes.io~projected/kube-api-access-lfknz major:0 minor:101 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e943438b-1de8-435c-8a19-accd6a6292a4/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e943438b-1de8-435c-8a19-accd6a6292a4/volumes/kubernetes.io~secret/serving-cert major:0 minor:74 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ebfbe878-1796-4a20-b3f0-76165038252e/volumes/kubernetes.io~projected/kube-api-access-tncxt:{mountpoint:/var/lib/kubelet/pods/ebfbe878-1796-4a20-b3f0-76165038252e/volumes/kubernetes.io~projected/kube-api-access-tncxt major:0 minor:1092 fsType:tmpfs 
blockSize:0} /var/lib/kubelet/pods/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb/volumes/kubernetes.io~projected/kube-api-access-dh58c:{mountpoint:/var/lib/kubelet/pods/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb/volumes/kubernetes.io~projected/kube-api-access-dh58c major:0 minor:585 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb/volumes/kubernetes.io~secret/encryption-config major:0 minor:576 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb/volumes/kubernetes.io~secret/etcd-client major:0 minor:577 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb/volumes/kubernetes.io~secret/serving-cert major:0 minor:594 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~projected/kube-api-access-wwcr9:{mountpoint:/var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~projected/kube-api-access-wwcr9 major:0 minor:319 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~secret/etcd-client major:0 minor:307 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~secret/serving-cert major:0 minor:304 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f2635f9f-219b-4d03-b5b3-496c0c836fae/volumes/kubernetes.io~empty-dir/etc-tuned:{mountpoint:/var/lib/kubelet/pods/f2635f9f-219b-4d03-b5b3-496c0c836fae/volumes/kubernetes.io~empty-dir/etc-tuned major:0 minor:905 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f2635f9f-219b-4d03-b5b3-496c0c836fae/volumes/kubernetes.io~empty-dir/tmp:{mountpoint:/var/lib/kubelet/pods/f2635f9f-219b-4d03-b5b3-496c0c836fae/volumes/kubernetes.io~empty-dir/tmp major:0 minor:904 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f2635f9f-219b-4d03-b5b3-496c0c836fae/volumes/kubernetes.io~projected/kube-api-access-fwrwm:{mountpoint:/var/lib/kubelet/pods/f2635f9f-219b-4d03-b5b3-496c0c836fae/volumes/kubernetes.io~projected/kube-api-access-fwrwm major:0 minor:703 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f3792522-fec6-4022-90ac-0b8467fcd625/volumes/kubernetes.io~projected/kube-api-access-flxbg:{mountpoint:/var/lib/kubelet/pods/f3792522-fec6-4022-90ac-0b8467fcd625/volumes/kubernetes.io~projected/kube-api-access-flxbg major:0 minor:301 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f3792522-fec6-4022-90ac-0b8467fcd625/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f3792522-fec6-4022-90ac-0b8467fcd625/volumes/kubernetes.io~secret/serving-cert major:0 minor:290 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f4a70855-80b5-4d6a-bed1-b42364940de0/volumes/kubernetes.io~projected/kube-api-access-69z2l:{mountpoint:/var/lib/kubelet/pods/f4a70855-80b5-4d6a-bed1-b42364940de0/volumes/kubernetes.io~projected/kube-api-access-69z2l major:0 minor:342 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1/volumes/kubernetes.io~projected/kube-api-access-x59kd:{mountpoint:/var/lib/kubelet/pods/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1/volumes/kubernetes.io~projected/kube-api-access-x59kd major:0 minor:141 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb/volumes/kubernetes.io~projected/kube-api-access-ht5kr:{mountpoint:/var/lib/kubelet/pods/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb/volumes/kubernetes.io~projected/kube-api-access-ht5kr major:0 minor:941 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb/volumes/kubernetes.io~secret/profile-collector-cert:{mountpoint:/var/lib/kubelet/pods/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb/volumes/kubernetes.io~secret/profile-collector-cert major:0 minor:933 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb/volumes/kubernetes.io~secret/srv-cert major:0 minor:935 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fb7003a6-4341-49eb-bec3-76ba8610fa12/volumes/kubernetes.io~projected/kube-api-access-69n5s:{mountpoint:/var/lib/kubelet/pods/fb7003a6-4341-49eb-bec3-76ba8610fa12/volumes/kubernetes.io~projected/kube-api-access-69n5s major:0 minor:153 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fb7003a6-4341-49eb-bec3-76ba8610fa12/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/fb7003a6-4341-49eb-bec3-76ba8610fa12/volumes/kubernetes.io~secret/metrics-certs major:0 minor:699 fsType:tmpfs blockSize:0} overlay_0-1000:{mountpoint:/var/lib/containers/storage/overlay/bb4d6650cc1c096ac75f133750b04900ac2d8f44f50ddc5c0b536fda8275a5a1/merged major:0 minor:1000 fsType:overlay blockSize:0} overlay_0-1002:{mountpoint:/var/lib/containers/storage/overlay/9dda1f3f34e0303afbae816945588102c4d738a752c5302b23eb88b32e44b960/merged major:0 minor:1002 fsType:overlay blockSize:0} overlay_0-1006:{mountpoint:/var/lib/containers/storage/overlay/6246eba8a051e9b5e2de01050b6047cd7f3441d1ed246efbbf93d6bfae023c70/merged major:0 minor:1006 fsType:overlay blockSize:0} overlay_0-1008:{mountpoint:/var/lib/containers/storage/overlay/ec5422016fb94803f1a3fb459cc4c5d5527eb955bf26c2cfa5f430742463c226/merged major:0 minor:1008 fsType:overlay blockSize:0} overlay_0-1013:{mountpoint:/var/lib/containers/storage/overlay/e7791b6695a00085e6b3e6ce9c5d094e71df1f7573df0d0977c589777bcc88df/merged major:0 minor:1013 fsType:overlay blockSize:0} overlay_0-1016:{mountpoint:/var/lib/containers/storage/overlay/53e7022ccd8695fbd2b338ede0f5b737fc6313bfffce994e711e06c83ba27951/merged major:0 minor:1016 fsType:overlay blockSize:0} overlay_0-102:{mountpoint:/var/lib/containers/storage/overlay/e1c4d5651f2729f41d2001a7c39c91661174691353785a3ed30cdfd14e1f0418/merged major:0 minor:102 fsType:overlay blockSize:0} overlay_0-1028:{mountpoint:/var/lib/containers/storage/overlay/d893d8086a3b647439d6a74d24d713eac6fd1f896964a27d4f59b5dddcbe7798/merged major:0 minor:1028 fsType:overlay blockSize:0} overlay_0-1030:{mountpoint:/var/lib/containers/storage/overlay/3cd5be4df7d34108794f793af50a86ed0ccf80e567131348b22f971a780d8fc7/merged major:0 minor:1030 fsType:overlay blockSize:0} overlay_0-1032:{mountpoint:/var/lib/containers/storage/overlay/90219a7ac10ec24359318b8fb458874f3fbe7c2a6ae594a1420b80cc9c120c31/merged major:0 minor:1032 fsType:overlay blockSize:0} 
overlay_0-1037:{mountpoint:/var/lib/containers/storage/overlay/c562075f858776703056b6b776ba18d02a94c6ad8101fe50526dfcf85e23e390/merged major:0 minor:1037 fsType:overlay blockSize:0} overlay_0-1039:{mountpoint:/var/lib/containers/storage/overlay/99bf4613c89ca214f5f83dc0124b5c25489c849c67e413b523f15b45348be722/merged major:0 minor:1039 fsType:overlay blockSize:0} overlay_0-1041:{mountpoint:/var/lib/containers/storage/overlay/8781af25d3f36398f54f4bcd122199bfdda97f67c89c66170857e68c8a50069e/merged major:0 minor:1041 fsType:overlay blockSize:0} overlay_0-106:{mountpoint:/var/lib/containers/storage/overlay/ced8dc0fc55558cacea6d7f809beb3bf84fd7c9fd9ebc3a453c433bec9c2a84c/merged major:0 minor:106 fsType:overlay blockSize:0} overlay_0-1060:{mountpoint:/var/lib/containers/storage/overlay/603c2cb76bc1f72209f06e2e7b5f40d50178ae8469714cebbcc8005baf1d6b04/merged major:0 minor:1060 fsType:overlay blockSize:0} overlay_0-1065:{mountpoint:/var/lib/containers/storage/overlay/b2de9611956c12b242d880897c61c4bdbb6f68da0c4cff332fe3003a246dc4f1/merged major:0 minor:1065 fsType:overlay blockSize:0} overlay_0-1081:{mountpoint:/var/lib/containers/storage/overlay/e822d09f34244854ebeb1d68c3b2c4961900714f23ec876860aa6f383f68cbf2/merged major:0 minor:1081 fsType:overlay blockSize:0} overlay_0-110:{mountpoint:/var/lib/containers/storage/overlay/d9869d4e5a0676fa0f66e4a7fe16c741b99064d09b572e81e4de00661266fcea/merged major:0 minor:110 fsType:overlay blockSize:0} overlay_0-1110:{mountpoint:/var/lib/containers/storage/overlay/5c46db716295c449c8b016ce9583ef441c14fc5935394cde18cd8e01ac4fdcf9/merged major:0 minor:1110 fsType:overlay blockSize:0} overlay_0-1114:{mountpoint:/var/lib/containers/storage/overlay/e3dbcf71e7d27f20fc1f223f2e113217d7d88c7d959cf8dbf76ac842f9874e23/merged major:0 minor:1114 fsType:overlay blockSize:0} overlay_0-1116:{mountpoint:/var/lib/containers/storage/overlay/8575f6630586a95de827bb1fa592bc3c42af6d032113bf3ebf77542e17921981/merged major:0 minor:1116 fsType:overlay blockSize:0} overlay_0-1118:{mountpoint:/var/lib/containers/storage/overlay/d2906024b93930b1e648ba0b68b1aff48649fe2e8c77168533f346987cb56712/merged major:0 minor:1118 fsType:overlay blockSize:0} overlay_0-1122:{mountpoint:/var/lib/containers/storage/overlay/646643a747b96d5084810571db0f71aacef4e8abe74472ff4138b04a3699f244/merged major:0 minor:1122 fsType:overlay blockSize:0} overlay_0-1123:{mountpoint:/var/lib/containers/storage/overlay/e929dea98a2272533640ef861a6e2c6a48ac1371374d1561b9c226268c191d36/merged major:0 minor:1123 fsType:overlay blockSize:0} overlay_0-1128:{mountpoint:/var/lib/containers/storage/overlay/2c407ee1ba52e8fdc969224d42e05b6ffeede71b1e4a3fc50e1be4aef13b926f/merged major:0 minor:1128 fsType:overlay blockSize:0} overlay_0-1133:{mountpoint:/var/lib/containers/storage/overlay/e05fb5b361fea28f3c74382d48073d9261662382f6dcdcde73f0fc26f71e2492/merged major:0 minor:1133 fsType:overlay blockSize:0} overlay_0-1136:{mountpoint:/var/lib/containers/storage/overlay/86d46a504d0764c8c4d704ab96bf7085c945662620e7ed569423363bab2cc4b3/merged major:0 minor:1136 fsType:overlay blockSize:0} overlay_0-1146:{mountpoint:/var/lib/containers/storage/overlay/bb1b02afca0191c7017e775ac8ff250f4f2e86eb52d4d01e0edb88df2b42ac8e/merged major:0 minor:1146 fsType:overlay blockSize:0} overlay_0-1148:{mountpoint:/var/lib/containers/storage/overlay/d27708a835faa5414258f6d4f6ed44fd4b043061f1cd5df30d66155d9a03d56c/merged major:0 minor:1148 fsType:overlay blockSize:0} 
overlay_0-1154:{mountpoint:/var/lib/containers/storage/overlay/1eee6d6ef0f0d10b24aad23b0b167e65df73ae1da22ae65385dfa623b16c861f/merged major:0 minor:1154 fsType:overlay blockSize:0} overlay_0-116:{mountpoint:/var/lib/containers/storage/overlay/7b812ccde5b79b08b2dca803239e216748ceac39303bbb94be5fb32eb091a7bf/merged major:0 minor:116 fsType:overlay blockSize:0} overlay_0-1162:{mountpoint:/var/lib/containers/storage/overlay/d1a5f189cf187efbc7727a90facf4b521d37fa2d51a8a08069c0d918cfa63932/merged major:0 minor:1162 fsType:overlay blockSize:0} overlay_0-1168:{mountpoint:/var/lib/containers/storage/overlay/6100d00274f0504cfb14150b19e3848634229a2e94808b4463d2f20817be17c7/merged major:0 minor:1168 fsType:overlay blockSize:0} overlay_0-1174:{mountpoint:/var/lib/containers/storage/overlay/2f724a2459f5dd8a73a4646463d2c287af1cd7b8521ec605152e9ee5632b53ec/merged major:0 minor:1174 fsType:overlay blockSize:0} overlay_0-118:{mountpoint:/var/lib/containers/storage/overlay/6a95826d9e1b3cbe089ccac69132deb7bfe8bf4c3344dfad5f9c0e1a97ee26bf/merged major:0 minor:118 fsType:overlay blockSize:0} overlay_0-1186:{mountpoint:/var/lib/containers/storage/overlay/0d996573e2b070402f7376b15153a8ef62f4bceafd2035a9c3c83c390088e75c/merged major:0 minor:1186 fsType:overlay blockSize:0} overlay_0-1188:{mountpoint:/var/lib/containers/storage/overlay/58e75cd35636bdc539571c455194edc66bd5e3a9c422ed8088aec5041e2d78e0/merged major:0 minor:1188 fsType:overlay blockSize:0} overlay_0-1190:{mountpoint:/var/lib/containers/storage/overlay/c152d4caffc7b68abb0072db946c4dd3b31804f6137c28b02c0b74ca56603be4/merged major:0 minor:1190 fsType:overlay blockSize:0} overlay_0-1192:{mountpoint:/var/lib/containers/storage/overlay/831fa043ba38e323729dc1cc6f19dd07113863df615a4fe095f650df35073532/merged major:0 minor:1192 fsType:overlay blockSize:0} overlay_0-1194:{mountpoint:/var/lib/containers/storage/overlay/7420a84e8224af0afc34807200ffbac2e3f89069b99dc80a15e8446fdad4d7e9/merged major:0 minor:1194 fsType:overlay blockSize:0} overlay_0-1196:{mountpoint:/var/lib/containers/storage/overlay/81cbc8dba9a775f61acb8739e97fec73a3cd32b8d362c12fab2eed1d094d8793/merged major:0 minor:1196 fsType:overlay blockSize:0} overlay_0-1198:{mountpoint:/var/lib/containers/storage/overlay/a2842bf206eb5ef04ea10c937d1393b5eabfc5c5882ffac0f0ea8015a848a80e/merged major:0 minor:1198 fsType:overlay blockSize:0} overlay_0-120:{mountpoint:/var/lib/containers/storage/overlay/78dd4e03cf7b36ee1c3c9c4f3ca9b9425e1567b6fee5c91e1890cff7148f32f4/merged major:0 minor:120 fsType:overlay blockSize:0} overlay_0-1200:{mountpoint:/var/lib/containers/storage/overlay/4041727b12e3491f392a31f1ef0d515cb9ba11db10a7bcbad4e951d7304919d4/merged major:0 minor:1200 fsType:overlay blockSize:0} overlay_0-1202:{mountpoint:/var/lib/containers/storage/overlay/f3936fee10bcd62b4ebd42741355ac23ca50154ddf26b90c2d49ff5ca8665ede/merged major:0 minor:1202 fsType:overlay blockSize:0} overlay_0-1204:{mountpoint:/var/lib/containers/storage/overlay/292293edc9238d3b90160e92864cdd8a6041dbc9be5015c9e37804d1f0b3b778/merged major:0 minor:1204 fsType:overlay blockSize:0} overlay_0-1209:{mountpoint:/var/lib/containers/storage/overlay/ce192173dfd9c2a5937d54550ead36f00a2ff07a3eb748261991e1ed1cee0ba8/merged major:0 minor:1209 fsType:overlay blockSize:0} overlay_0-1215:{mountpoint:/var/lib/containers/storage/overlay/3fa7439690cd2c3a525cdec674d3fdce54cade7ac518d482c30c3ddb9e8a4d6a/merged major:0 minor:1215 fsType:overlay blockSize:0} 
overlay_0-124:{mountpoint:/var/lib/containers/storage/overlay/41a63c485b6f01daf711198e34ae3bd8919d0c4b03a87f453aba290d7e667c4b/merged major:0 minor:124 fsType:overlay blockSize:0} overlay_0-1240:{mountpoint:/var/lib/containers/storage/overlay/5ed7142a056b9d71a4df3bc0795aa604b476fbfb33f35b91b84e2abdc38fd813/merged major:0 minor:1240 fsType:overlay blockSize:0} overlay_0-1243:{mountpoint:/var/lib/containers/storage/overlay/b0639d9da383f4926c26bd80508c74fd86f7fce5f128386ab6e8554d4dc139fb/merged major:0 minor:1243 fsType:overlay blockSize:0} overlay_0-1248:{mountpoint:/var/lib/containers/storage/overlay/5faa6b7e6cc963757710dd23b5dedc5f7490389716357586c510a5fa8970dbe5/merged major:0 minor:1248 fsType:overlay blockSize:0} overlay_0-1250:{mountpoint:/var/lib/containers/storage/overlay/005de5d59e67f0f1e0d4b059561bc85b62ea99b8f00a83874b27b6e4d1753ec0/merged major:0 minor:1250 fsType:overlay blockSize:0} overlay_0-1257:{mountpoint:/var/lib/containers/storage/overlay/07d78fa72bac1ae6f9fe5ba4118c2b0effa5a8bdc03dc09f5d1911b6b5a09254/merged major:0 minor:1257 fsType:overlay blockSize:0} overlay_0-1258:{mountpoint:/var/lib/containers/storage/overlay/c62cc6630da77c5a4ef520bc59d4b23fffc6d1a80bc0da18ceca85132257ae0d/merged major:0 minor:1258 fsType:overlay blockSize:0} overlay_0-126:{mountpoint:/var/lib/containers/storage/overlay/c3bb86d274f22c380bfdb563eb386b6ddec88c1c7c4c9e3efd95417f98a20314/merged major:0 minor:126 fsType:overlay blockSize:0} overlay_0-1274:{mountpoint:/var/lib/containers/storage/overlay/c3c1a32ed0a64bb11adbbf9ea1bf85079236da9f0bde0767492b883e7a70041c/merged major:0 minor:1274 fsType:overlay blockSize:0} overlay_0-1276:{mountpoint:/var/lib/containers/storage/overlay/10cc448d977ea639c399c69cc650c7389026794a36ac04bab5e85ffa0e10a05a/merged major:0 minor:1276 fsType:overlay blockSize:0} overlay_0-1282:{mountpoint:/var/lib/containers/storage/overlay/1eb7daaffaeb558c6b59cdaaeb07beda3117c4f736850c75660cced41e60d566/merged major:0 minor:1282 fsType:overlay blockSize:0} overlay_0-1299:{mountpoint:/var/lib/containers/storage/overlay/3d397a49bc8c42dafb687275e08b3c0c7642c0a33a600d8944e475f1f5de72d8/merged major:0 minor:1299 fsType:overlay blockSize:0} overlay_0-1310:{mountpoint:/var/lib/containers/storage/overlay/a352434d99e4cecc6995c4ad7ddc2ded2cd40db64f030208dfb876d4ca756a8c/merged major:0 minor:1310 fsType:overlay blockSize:0} overlay_0-1312:{mountpoint:/var/lib/containers/storage/overlay/7fcff13c0ca98e294731f6f4b55b4d09a2af9e2a716794285d2fe6f6c8632167/merged major:0 minor:1312 fsType:overlay blockSize:0} overlay_0-1314:{mountpoint:/var/lib/containers/storage/overlay/e2b062eb184c71d05088f43be26a712c577d92ebc0c929dbef09c384036009c6/merged major:0 minor:1314 fsType:overlay blockSize:0} overlay_0-1322:{mountpoint:/var/lib/containers/storage/overlay/16ee4f7014f370e3771d1a80792419f7cb068685e85e5a256aa34e098671743d/merged major:0 minor:1322 fsType:overlay blockSize:0} overlay_0-1324:{mountpoint:/var/lib/containers/storage/overlay/55ad8790627afa38a7169cdfccf9c9ea7dd45d2607a1f2f011e448fea04b93c6/merged major:0 minor:1324 fsType:overlay blockSize:0} overlay_0-1330:{mountpoint:/var/lib/containers/storage/overlay/2188dbd431a3a9f1bf0bd6eaaf8c0c3a70a1c1ae960d4c9e29bc947297e4c441/merged major:0 minor:1330 fsType:overlay blockSize:0} overlay_0-1336:{mountpoint:/var/lib/containers/storage/overlay/1ea5fa3b437da4ebfc37671b12f5cc2981eafe90f3728685495bbf15d21b282e/merged major:0 minor:1336 fsType:overlay blockSize:0} 
overlay_0-1338:{mountpoint:/var/lib/containers/storage/overlay/08d0afec45c432bb33a4ae1447ce5cb419bf05205b58c4e4bb08dc958c231c29/merged major:0 minor:1338 fsType:overlay blockSize:0} overlay_0-1369:{mountpoint:/var/lib/containers/storage/overlay/09c999d85e5e7befc392829d736a170c155ee8e7db83a3116c95be11e9b2b015/merged major:0 minor:1369 fsType:overlay blockSize:0} overlay_0-1374:{mountpoint:/var/lib/containers/storage/overlay/a37e7bd356b28b3425c38f4d42f04bfd5b5535494874c38afbbf99b27565ac9f/merged major:0 minor:1374 fsType:overlay blockSize:0} overlay_0-1376:{mountpoint:/var/lib/containers/storage/overlay/c8a8ae06f6259012816fcdf39f2be362a21f6f5d96cfb1b5e93a3b1ca8d49c80/merged major:0 minor:1376 fsType:overlay blockSize:0} overlay_0-138:{mountpoint:/var/lib/containers/storage/overlay/18cfe0eddc75a85df1a3a6677cf37599156d507390609e8632048f87e76485d5/merged major:0 minor:138 fsType:overlay blockSize:0} overlay_0-1381:{mountpoint:/var/lib/containers/storage/overlay/43237102bba4d83c7bb96a60bf3af7b9382c38daa3e66e059ae327d3986a220c/merged major:0 minor:1381 fsType:overlay blockSize:0} overlay_0-1382:{mountpoint:/var/lib/containers/storage/overlay/1c29831c486b8548d25ae08c934e24f7dbdb92dbafbad5a97f1d0a51d9e91202/merged major:0 minor:1382 fsType:overlay blockSize:0} overlay_0-139:{mountpoint:/var/lib/containers/storage/overlay/7c9eed8dbadc7d21bfe259abb0e0b78eb49b24a4ef20fa11f8b16f9f112edcf0/merged major:0 minor:139 fsType:overlay blockSize:0} overlay_0-1393:{mountpoint:/var/lib/containers/storage/overlay/0b50431d5ed35fdcf3fee1a3c6b8bc5a5200ae5bd0c9f61c6b5338268e0d43a7/merged major:0 minor:1393 fsType:overlay blockSize:0} overlay_0-1395:{mountpoint:/var/lib/containers/storage/overlay/6f7aff4611223ff10e0ceecea845475ad1012f952ea2e9ecb377020f93bc5594/merged major:0 minor:1395 fsType:overlay blockSize:0} overlay_0-1401:{mountpoint:/var/lib/containers/storage/overlay/a5e618aa39d2a89708f46c5bf91fbb22f74bc885b76d39f3c50067dd92e557fa/merged major:0 minor:1401 fsType:overlay blockSize:0} overlay_0-1403:{mountpoint:/var/lib/containers/storage/overlay/a23ebac727d1d95ad1365bfbaf9384046d0de0bdebeafdbdf0a9741a308a0039/merged major:0 minor:1403 fsType:overlay blockSize:0} overlay_0-1424:{mountpoint:/var/lib/containers/storage/overlay/00ca85e7c2469a7c986b201c3e4e734deb81c309c24f04807e8383d81f3641ce/merged major:0 minor:1424 fsType:overlay blockSize:0} overlay_0-1436:{mountpoint:/var/lib/containers/storage/overlay/47f676f410e5f0a112d200103ebf5a7e575672b5325944a945991edc74753127/merged major:0 minor:1436 fsType:overlay blockSize:0} overlay_0-1438:{mountpoint:/var/lib/containers/storage/overlay/5e3bd2912455d4270db3e5ff6434cfcc44e45f18bccb296b3f509be264a49bde/merged major:0 minor:1438 fsType:overlay blockSize:0} overlay_0-144:{mountpoint:/var/lib/containers/storage/overlay/cb9b117a6e3f0d02be5147ac9286575ef896871d113013e8381ee3ad2b0b427a/merged major:0 minor:144 fsType:overlay blockSize:0} overlay_0-1447:{mountpoint:/var/lib/containers/storage/overlay/d5eebcfb17f14c918db0bd82bb31694b443584d6b74d69f7aa011427c84cc677/merged major:0 minor:1447 fsType:overlay blockSize:0} overlay_0-146:{mountpoint:/var/lib/containers/storage/overlay/0ab9808f7c0e1763419abec4fd6e94c4b7ed80fd3fd1f54dbaeaf32c853c4339/merged major:0 minor:146 fsType:overlay blockSize:0} overlay_0-1468:{mountpoint:/var/lib/containers/storage/overlay/29ea85555273f6d7f69bbbbed78e13a2b6bf617e3fb39ac1a035cdbac683d70c/merged major:0 minor:1468 fsType:overlay blockSize:0} 
overlay_0-1472:{mountpoint:/var/lib/containers/storage/overlay/af9d208c6b392eb259c86d457efaf7e857d2612431c90ab06c448bb82650421c/merged major:0 minor:1472 fsType:overlay blockSize:0} overlay_0-1474:{mountpoint:/var/lib/containers/storage/overlay/62a5cdecba2e6937e86fc6dc1d239c621b79d62deb5cc4460cdb07f9ee6f5447/merged major:0 minor:1474 fsType:overlay blockSize:0} overlay_0-1476:{mountpoint:/var/lib/containers/storage/overlay/bd9ef174ce00e4934a7c5590957b0b97ff786f1f7aaeb926e57e4ed29eb0153d/merged major:0 minor:1476 fsType:overlay blockSize:0} overlay_0-1485:{mountpoint:/var/lib/containers/storage/overlay/1b6cb7e85d800d2af5f55e1eed3e7ff2bf22600610ba21a79c4b00c9d5a2e80c/merged major:0 minor:1485 fsType:overlay blockSize:0} overlay_0-1493:{mountpoint:/var/lib/containers/storage/overlay/167a8704beb5144d649fef807587accabb21dbbbf18549277969d9675de0a80e/merged major:0 minor:1493 fsType:overlay blockSize:0} overlay_0-1495:{mountpoint:/var/lib/containers/storage/overlay/b354e8807c56585d7bc2528bb9af8a86e41bc3c7b09161501519323b611c2674/merged major:0 minor:1495 fsType:overlay blockSize:0} overlay_0-1497:{mountpoint:/var/lib/containers/storage/overlay/dbb27a257a12dd2c020c594b890de5a4a4c721439775e41ab856a9f839dbf937/merged major:0 minor:1497 fsType:overlay blockSize:0} overlay_0-1507:{mountpoint:/var/lib/containers/storage/overlay/180f8299c5c0a7a73c659eb9b1d28db54214cb9156a34b8f4431a2a6158f2a3e/merged major:0 minor:1507 fsType:overlay blockSize:0} overlay_0-151:{mountpoint:/var/lib/containers/storage/overlay/caca56b5d27ec29990282d0668116750acbba2d7e3d9779738bc16e5c448ab9a/merged major:0 minor:151 fsType:overlay blockSize:0} overlay_0-1520:{mountpoint:/var/lib/containers/storage/overlay/ed946878b5df611d6d12a91ad1ab2492a55264b3903bf143d4b5521da0157531/merged major:0 minor:1520 fsType:overlay blockSize:0} overlay_0-1528:{mountpoint:/var/lib/containers/storage/overlay/49e3ea1dc28b555e7f9f07631f9423675e4424c494d4f66f16e23523654556a7/merged major:0 minor:1528 fsType:overlay blockSize:0} overlay_0-1536:{mountpoint:/var/lib/containers/storage/overlay/988f3a8b87387752cc2dfea2e6a58d1b71c409d4876fb5e91b8b679e77e36c1b/merged major:0 minor:1536 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/9fd902f996c13fe0242574cdafad3b2c5ff3681aa7b0fc7635f8f860481968c3/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-1550:{mountpoint:/var/lib/containers/storage/overlay/022e7c2fcd0d47f673f4050f3b888e88157bc4c5372a6b0f7e29c2b646ad8701/merged major:0 minor:1550 fsType:overlay blockSize:0} overlay_0-1552:{mountpoint:/var/lib/containers/storage/overlay/793befc3a9cea40deaa45c4979df8f1ac70b4c6925c4cc84c463f254d2cf8682/merged major:0 minor:1552 fsType:overlay blockSize:0} overlay_0-1562:{mountpoint:/var/lib/containers/storage/overlay/b8cea52ad68ccd0422569e674efa2e64710b6812ca99edf3b990c0da6e35a732/merged major:0 minor:1562 fsType:overlay blockSize:0} overlay_0-1565:{mountpoint:/var/lib/containers/storage/overlay/26d6d4e1c977ec4248427996da29af77383bb41b07141724a16bb157b7b2ba52/merged major:0 minor:1565 fsType:overlay blockSize:0} overlay_0-1566:{mountpoint:/var/lib/containers/storage/overlay/2c04f3440b7ad0a63003f99ebafca90a1144cdc37c95b89b3c7bad417dc269d7/merged major:0 minor:1566 fsType:overlay blockSize:0} overlay_0-1576:{mountpoint:/var/lib/containers/storage/overlay/b8edad652af3a909bd6de5367ea3892c3b5e142d66557966e6fb2649a20d5338/merged major:0 minor:1576 fsType:overlay blockSize:0} 
overlay_0-1577:{mountpoint:/var/lib/containers/storage/overlay/03ed6dec60f7ae32f85ef6e77cabde2325bc9fab7c4d1b147e568a23d088c9dd/merged major:0 minor:1577 fsType:overlay blockSize:0} overlay_0-1589:{mountpoint:/var/lib/containers/storage/overlay/8dfe930dc026de14413cf50d9efd31bc15aba9ae2d7c54b7108fabb328301d82/merged major:0 minor:1589 fsType:overlay blockSize:0} overlay_0-1595:{mountpoint:/var/lib/containers/storage/overlay/5d7b7292642d520a98620ddb4df0f7d16c3435e594d49d52a0b9bd59e89bd68e/merged major:0 minor:1595 fsType:overlay blockSize:0} overlay_0-1615:{mountpoint:/var/lib/containers/storage/overlay/6051f57cd5d66bb1d95e31980da761e8bf743c71992c62e1459820457b675b06/merged major:0 minor:1615 fsType:overlay blockSize:0} overlay_0-1619:{mountpoint:/var/lib/containers/storage/overlay/5406f10958b1fffbbde0ecba91892c4853c8a90e0d626ef0a2cc0b7da4966f4c/merged major:0 minor:1619 fsType:overlay blockSize:0} overlay_0-162:{mountpoint:/var/lib/containers/storage/overlay/5cd3bf058edf49ff3ffbbd533611e3e6f91d8142b72269ad7b85352f5944ef79/merged major:0 minor:162 fsType:overlay blockSize:0} overlay_0-1640:{mountpoint:/var/lib/containers/storage/overlay/3d42473bc4f945a84c0affb3713f98df88b3088c3457a191aa0761250dde8c17/merged major:0 minor:1640 fsType:overlay blockSize:0} overlay_0-1646:{mountpoint:/var/lib/containers/storage/overlay/f8f6bf17095fb670aff71d6c4c19b360a476b19fb3ba34e88921c18238e17873/merged major:0 minor:1646 fsType:overlay blockSize:0} overlay_0-166:{mountpoint:/var/lib/containers/storage/overlay/bca31ded97068415413387593851c0977265e7e7169339d6e791b5ff6aa32b7f/merged major:0 minor:166 fsType:overlay blockSize:0} overlay_0-1665:{mountpoint:/var/lib/containers/storage/overlay/44006f8932475663489524c9bbbf087b32a99332a9c71630082b119c3b5803a4/merged major:0 minor:1665 fsType:overlay blockSize:0} overlay_0-1674:{mountpoint:/var/lib/containers/storage/overlay/0be77c4b7a097e96bf005f4f488edb88c327ffa826b1580e2316f9562b4f2a06/merged major:0 minor:1674 fsType:overlay blockSize:0} overlay_0-1676:{mountpoint:/var/lib/containers/storage/overlay/994db31d10fa39a0efd3f6e1c39701676f8c749cac081b73b7902fa74e862510/merged major:0 minor:1676 fsType:overlay blockSize:0} overlay_0-168:{mountpoint:/var/lib/containers/storage/overlay/27fa107044f8e2b4bf52ef0af9b888461d16231f2467d98248fd7170e24645fc/merged major:0 minor:168 fsType:overlay blockSize:0} overlay_0-1685:{mountpoint:/var/lib/containers/storage/overlay/17d94842fa193f2af2bdf5e54b300f9370a8c0ac7b6b79e680064b5e42b481ef/merged major:0 minor:1685 fsType:overlay blockSize:0} overlay_0-170:{mountpoint:/var/lib/containers/storage/overlay/f396a5163e890309cd50cbe125aa426b841dfdb0dcc6080f06f52b23f5be551b/merged major:0 minor:170 fsType:overlay blockSize:0} overlay_0-172:{mountpoint:/var/lib/containers/storage/overlay/5c65662a322b4ae50a6c0cdb0aaf7519ed441eab69704037b551581159fd5f64/merged major:0 minor:172 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/ac62cd12baf8c7a5d2ed3c5aa1f2735c50dcc3076bf74dcfdfbad53a68ac998e/merged major:0 minor:174 fsType:overlay blockSize:0} overlay_0-183:{mountpoint:/var/lib/containers/storage/overlay/939c3753384f1514c3b853d68e4b24a4a2a080bb9f674fdc6e890c86e8c37f2b/merged major:0 minor:183 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/269c002f4c930e7db768dc98adf0f0a333ee354a44fa1614a670ce6df48d0ad0/merged major:0 minor:189 fsType:overlay blockSize:0} 
overlay_0-191:{mountpoint:/var/lib/containers/storage/overlay/409b9a9c36edf4083b8c0c2dc75f654720588680463712b46fcec148cb1a912d/merged major:0 minor:191 fsType:overlay blockSize:0} overlay_0-193:{mountpoint:/var/lib/containers/storage/overlay/f712703810076067635ff08101d4fb0010dd2cebceeb9318a17e00faee44a766/merged major:0 minor:193 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/00358ee9d1b6d66485bd6258619d87322518f8598c3f12edc50c54b802edcb15/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-197:{mountpoint:/var/lib/containers/storage/overlay/f65afd34ddd49773a2c4ff9b7affb041aff8c6067349ccac6ff871b1b710454d/merged major:0 minor:197 fsType:overlay blockSize:0} overlay_0-199:{mountpoint:/var/lib/containers/storage/overlay/9e3cadddfe6201f0c6f29135f52d4a2ca63642665652cd9a587cf54141709b6f/merged major:0 minor:199 fsType:overlay blockSize:0} overlay_0-202:{mountpoint:/var/lib/containers/storage/overlay/6e0ce96fb587ac7d21f986f22002cf9abdaeedbf12b51228f1f0d5fa49cc167d/merged major:0 minor:202 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/182ac96263feee5bca0f0e2143f00e777790c4e5735ff3ba8ddbe112d125ddff/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-206:{mountpoint:/var/lib/containers/storage/overlay/8659c156470e53fe5c973827ee699001769b2242553bea709e756b4e67b3d9d5/merged major:0 minor:206 fsType:overlay blockSize:0} overlay_0-207:{mountpoint:/var/lib/containers/storage/overlay/c3a498f5ba15a0ed7729cb26f6f3c228ba4026d3a7f0497488f0b54116bdfe2a/merged major:0 minor:207 fsType:overlay blockSize:0} overlay_0-209:{mountpoint:/var/lib/containers/storage/overlay/79604ab4c62deaecd3102942c71c528ed97b5f96aac5ac5736d9466f4777edd1/merged major:0 minor:209 fsType:overlay blockSize:0} overlay_0-219:{mountpoint:/var/lib/containers/storage/overlay/1094d5ec8cf7311c19801ff2621601a52045a2ad41158146fe9d2eee53042e5c/merged major:0 minor:219 fsType:overlay blockSize:0} overlay_0-228:{mountpoint:/var/lib/containers/storage/overlay/297ace21ddd432d81d7b585c1f45b6cfd386564184684f1e7ee048b441e78a40/merged major:0 minor:228 fsType:overlay blockSize:0} overlay_0-236:{mountpoint:/var/lib/containers/storage/overlay/76738f80f3bd0288fddd5a2c1ddfac2cb8aa863f26e8cff9a0e8faca265722ab/merged major:0 minor:236 fsType:overlay blockSize:0} overlay_0-244:{mountpoint:/var/lib/containers/storage/overlay/6deec249017972248057f2fad13f5c9fc72fb5ba9980da2625467e86de5b8aa0/merged major:0 minor:244 fsType:overlay blockSize:0} overlay_0-252:{mountpoint:/var/lib/containers/storage/overlay/e3b3474e3c26528abebded8429a3051db3fe95979c71ae59152deb3ae0668d40/merged major:0 minor:252 fsType:overlay blockSize:0} overlay_0-260:{mountpoint:/var/lib/containers/storage/overlay/4ac9ba4f6247f04e187d7fe8a1d8377f91bf4b8c55f8b24bd6ee401f089bdb01/merged major:0 minor:260 fsType:overlay blockSize:0} overlay_0-268:{mountpoint:/var/lib/containers/storage/overlay/3e822920d6ed6f2b76af28ff2be7525444850654efc38e57b420fbca87d9864f/merged major:0 minor:268 fsType:overlay blockSize:0} overlay_0-273:{mountpoint:/var/lib/containers/storage/overlay/3224cce41c1fa6ac4bc26df67118122dc4d84a179332d3e14c569bc9639f3262/merged major:0 minor:273 fsType:overlay blockSize:0} overlay_0-283:{mountpoint:/var/lib/containers/storage/overlay/2ec937be02c006e8ea30f0c26d738878ef91fcb8b10f853675598f257c084c85/merged major:0 minor:283 fsType:overlay blockSize:0} overlay_0-298:{mountpoint:/var/lib/containers/storage/overlay/4ff000b7b286a4e2c514a4f1ff8275d11baa0b22f8d717b8ab0c391543404341/merged 
major:0 minor:298 fsType:overlay blockSize:0} overlay_0-338:{mountpoint:/var/lib/containers/storage/overlay/4a664c51113d273b7d013c6c40831a5c27a717b27b9e9258f466a9e8218e1f42/merged major:0 minor:338 fsType:overlay blockSize:0} overlay_0-346:{mountpoint:/var/lib/containers/storage/overlay/95f70ee53f3561b38690bac06c2e6778e48d100ce469a85b8db0f5aa99e21b1d/merged major:0 minor:346 fsType:overlay blockSize:0} overlay_0-348:{mountpoint:/var/lib/containers/storage/overlay/6d67683af6fd1ed1094663b55adecef20a4d8f64a540ba84705b4b98152e82f1/merged major:0 minor:348 fsType:overlay blockSize:0} overlay_0-351:{mountpoint:/var/lib/containers/storage/overlay/8749c62c3a7fb854198a5e327d0b7160f22eeadd093d63723505af98920f563e/merged major:0 minor:351 fsType:overlay blockSize:0} overlay_0-356:{mountpoint:/var/lib/containers/storage/overlay/180f64707d36d3a3a449455105de1db58d613f89875b7dd3055e3eaa2e4cd837/merged major:0 minor:356 fsType:overlay blockSize:0} overlay_0-358:{mountpoint:/var/lib/containers/storage/overlay/8fd0db57a1b6addcec7e68a35f0aa1c280da87f0c23e452949f20eddb658417c/merged major:0 minor:358 fsType:overlay blockSize:0} overlay_0-362:{mountpoint:/var/lib/containers/storage/overlay/be87fd787b02484e30a261a48a388c79c4042d4175481763c6feb4044fd6db8b/merged major:0 minor:362 fsType:overlay blockSize:0} overlay_0-364:{mountpoint:/var/lib/containers/storage/overlay/9546388ec8993231c68c32ca39851d2ed8dcc814833d9fa999526d11ff3a9cde/merged major:0 minor:364 fsType:overlay blockSize:0} overlay_0-366:{mountpoint:/var/lib/containers/storage/overlay/4e55472085cd5241a0d7ac0f8b0557312fd2ed995a41d253dacacd625f35afe4/merged major:0 minor:366 fsType:overlay blockSize:0} overlay_0-368:{mountpoint:/var/lib/containers/storage/overlay/fae14a26a50255f69a0e16428a56449ac4183e03e41b9a663b5409bb10e95950/merged major:0 minor:368 fsType:overlay blockSize:0} overlay_0-370:{mountpoint:/var/lib/containers/storage/overlay/a3bfc79070e9d4b9ec3df7498044adc650bb6467643cf465e47146211d906477/merged major:0 minor:370 fsType:overlay blockSize:0} overlay_0-372:{mountpoint:/var/lib/containers/storage/overlay/68a7f71e3b40e9782a03553f85ef7bdc14c259064c6296ddd229c55a2612e3ca/merged major:0 minor:372 fsType:overlay blockSize:0} overlay_0-374:{mountpoint:/var/lib/containers/storage/overlay/2a0150cc2c7e3c6d8d319702844184974ac344e27d05464cdbee4621a5f950dc/merged major:0 minor:374 fsType:overlay blockSize:0} overlay_0-377:{mountpoint:/var/lib/containers/storage/overlay/b88bc2281eb7390a96b0c0e7b2a5812b227be330cecc8a17392e1dd02555e65d/merged major:0 minor:377 fsType:overlay blockSize:0} overlay_0-380:{mountpoint:/var/lib/containers/storage/overlay/d889cadc0a7f3170e4b6bc5aafcf11670968e2610d43faf51bd580dbbdaff4c3/merged major:0 minor:380 fsType:overlay blockSize:0} overlay_0-390:{mountpoint:/var/lib/containers/storage/overlay/db1d3852c9bb1aa6e01fcbf5f1bd27200b172b147e54dc0ecf076c4a9440dc0a/merged major:0 minor:390 fsType:overlay blockSize:0} overlay_0-392:{mountpoint:/var/lib/containers/storage/overlay/7d54de20a5e4499e9262038fffd1e23faca7330d9887e8a137d96760b9180e85/merged major:0 minor:392 fsType:overlay blockSize:0} overlay_0-395:{mountpoint:/var/lib/containers/storage/overlay/ced8e36d2e672d1d9e1f659917478c242c85d829af37d39b95dcaf8643eaeedc/merged major:0 minor:395 fsType:overlay blockSize:0} overlay_0-397:{mountpoint:/var/lib/containers/storage/overlay/5af4e532a35d8ec107ac2e13adc59fbd8e0a1f2b1fe16d5d6842b569cba77b30/merged major:0 minor:397 fsType:overlay blockSize:0} 
overlay_0-398:{mountpoint:/var/lib/containers/storage/overlay/b25340a09b06217b943d6ee900b1b4a638a87b74a3c5d01a41e1698531a4deeb/merged major:0 minor:398 fsType:overlay blockSize:0} overlay_0-401:{mountpoint:/var/lib/containers/storage/overlay/82c43d7528efb823375d36118ae6ae4c2bbbe577690c33feb97be5beea0e96f7/merged major:0 minor:401 fsType:overlay blockSize:0} overlay_0-406:{mountpoint:/var/lib/containers/storage/overlay/62a8d20da7a3b43fb4c36cdf98d88620c8075b0c728c46a9dfe6421e3e5c1649/merged major:0 minor:406 fsType:overlay blockSize:0} overlay_0-41:{mountpoint:/var/lib/containers/storage/overlay/aec51b73f5901fd2fcd279913b60e62a7f1317958abbc2a5bbf9dedb63c9c857/merged major:0 minor:41 fsType:overlay blockSize:0} overlay_0-412:{mountpoint:/var/lib/containers/storage/overlay/75a5157dd25b3821aaab71bb43e2bd158e828378a027302feeab07a999edcebd/merged major:0 minor:412 fsType:overlay blockSize:0} overlay_0-415:{mountpoint:/var/lib/containers/storage/overlay/5bb061c4880d859d9be2ea6ef3c79eb016502ab3e2ebeb72445efccf82b320a6/merged major:0 minor:415 fsType:overlay blockSize:0} overlay_0-420:{mountpoint:/var/lib/containers/storage/overlay/0e3f37dd39a08f8568a628d60fec58c48bd7fb132671cfa7f2cbd9248802441d/merged major:0 minor:420 fsType:overlay blockSize:0} overlay_0-426:{mountpoint:/var/lib/containers/storage/overlay/e2302fafca233e1c324f72faa3d2b379202a5821b4444c66779d4fe020be0df5/merged major:0 minor:426 fsType:overlay blockSize:0} overlay_0-433:{mountpoint:/var/lib/containers/storage/overlay/d8e0b85be31e21fa0444d9e5bed2834567dfa2e815506a0f3902ecd639578cf5/merged major:0 minor:433 fsType:overlay blockSize:0} overlay_0-435:{mountpoint:/var/lib/containers/storage/overlay/dcde0ab5e97d6c3bc1b6ede934adb8068711e6d9e5f3844b4e0f07ef42374b28/merged major:0 minor:435 fsType:overlay blockSize:0} overlay_0-437:{mountpoint:/var/lib/containers/storage/overlay/c37c73efdbcac03dee86cb74a1671adfe4dd5ad8178a1db9579b9767a5891a11/merged major:0 minor:437 fsType:overlay blockSize:0} overlay_0-440:{mountpoint:/var/lib/containers/storage/overlay/1e189c3193086226f37da9b49e1244844a79e0753ffc52800c42b8a5bc01749d/merged major:0 minor:440 fsType:overlay blockSize:0} overlay_0-441:{mountpoint:/var/lib/containers/storage/overlay/e533c8ac6f50eadd4483f9e55ed74dd81d02757e3d974486a4323a8db0613889/merged major:0 minor:441 fsType:overlay blockSize:0} overlay_0-444:{mountpoint:/var/lib/containers/storage/overlay/4fd94b7ead1f1ca24eb33c14fbe87cc4b0749be0135ce49bc61c75d75014ed5c/merged major:0 minor:444 fsType:overlay blockSize:0} overlay_0-446:{mountpoint:/var/lib/containers/storage/overlay/a754a6c991d175015552b09f8d85329fad98d3ff40b10f6603e668a90ab1dd9b/merged major:0 minor:446 fsType:overlay blockSize:0} overlay_0-456:{mountpoint:/var/lib/containers/storage/overlay/e2582b1b1f1cdde6d2a5ee63969b8e0eb8c19ac44a45fe8c91b977206cd2f0c6/merged major:0 minor:456 fsType:overlay blockSize:0} overlay_0-460:{mountpoint:/var/lib/containers/storage/overlay/c04761481ea979e0a3627cbd6e4333ba228e5687adb1344aabdcfd469c4fb6e9/merged major:0 minor:460 fsType:overlay blockSize:0} overlay_0-463:{mountpoint:/var/lib/containers/storage/overlay/56a9041209ed8221f99f47232e4d558ed5c8feed3acb75751c09cc3ad9037bc7/merged major:0 minor:463 fsType:overlay blockSize:0} overlay_0-466:{mountpoint:/var/lib/containers/storage/overlay/d4603addb56e2d1de31e87bfee6b3349167cc7084b2b5502f6fb4b2dbd36f0c8/merged major:0 minor:466 fsType:overlay blockSize:0}
overlay_0-470:{mountpoint:/var/lib/containers/storage/overlay/5cf581c1f06d4a1893ff8213d4a7b8e473c20a03b4ec15c41aef4b2225a4819d/merged major:0 minor:470 fsType:overlay blockSize:0} overlay_0-477:{mountpoint:/var/lib/containers/storage/overlay/6bf978adbef86c15709051fc4c802390668a896d0e893de1ea1d4f4a2a5660cd/merged major:0 minor:477 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/5d3ac4b0f711c5592ad91f20c9287574882a9e4c3c8056f4bb312af6b71d20b9/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-481:{mountpoint:/var/lib/containers/storage/overlay/ac2e36c77a52dd344b20249bd5fb9ae75adbb5787da9ba7ad4e2f6f063a1ff94/merged major:0 minor:481 fsType:overlay blockSize:0} overlay_0-485:{mountpoint:/var/lib/containers/storage/overlay/abd32147566bfff5d8ff0924da310a7eabc4e36c08d7a891d07dcbca71859025/merged major:0 minor:485 fsType:overlay blockSize:0} overlay_0-501:{mountpoint:/var/lib/containers/storage/overlay/dcd46977a9e53502150370cd11b16d8956baae4dbde2af6b435a95e3781d9373/merged major:0 minor:501 fsType:overlay blockSize:0} overlay_0-503:{mountpoint:/var/lib/containers/storage/overlay/4d2f9035c7e62e14ae2942c5d1028f4b392da5b1f353cc0cb544730981deaa92/merged major:0 minor:503 fsType:overlay blockSize:0} overlay_0-505:{mountpoint:/var/lib/containers/storage/overlay/e726eb01d2eaeaaeceae48326d5ace876dee2c0cf7830cd7fab7c398d3a32514/merged major:0 minor:505 fsType:overlay blockSize:0} overlay_0-507:{mountpoint:/var/lib/containers/storage/overlay/3f2dd8800472cc547c5d379fa298744a69eca76ecc0d618834dd12a328ed2fbb/merged major:0 minor:507 fsType:overlay blockSize:0} overlay_0-509:{mountpoint:/var/lib/containers/storage/overlay/a0cc592c2761ac56fd402e9e213347d92bc747b27b24045d172bf0ca3a8507d6/merged major:0 minor:509 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/32ff0bba3746c8b83682d66b4c205c44506e45f73f614f40bd59b1a18184b9a2/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-521:{mountpoint:/var/lib/containers/storage/overlay/3cfb7ca9d171280a6f7f76f0b492d7093a32ce8c386a19bce72c55a7d9367222/merged major:0 minor:521 fsType:overlay blockSize:0} overlay_0-523:{mountpoint:/var/lib/containers/storage/overlay/a931c45614063da068c6e77c2bfaf55f094220859f4853c0aa8549827f75fc2c/merged major:0 minor:523 fsType:overlay blockSize:0} overlay_0-538:{mountpoint:/var/lib/containers/storage/overlay/7fdd3f462d2f531865b64074e9c3f9d7ec0f586b2257f6986cd3b18551cb339a/merged major:0 minor:538 fsType:overlay blockSize:0} overlay_0-550:{mountpoint:/var/lib/containers/storage/overlay/821cdedf2c7032541ef6c7e91b18f2938ec20c2dec7045a512523f66bdc538ed/merged major:0 minor:550 fsType:overlay blockSize:0} overlay_0-552:{mountpoint:/var/lib/containers/storage/overlay/f49aa8f1520f27a1b4b28d946cc3f4fa7d21b2d1c5c16524339826a45498565d/merged major:0 minor:552 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/0f208c6e81feb4f53230a27a0d62529133cdcbd6c08905b510683d1d5507146f/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-566:{mountpoint:/var/lib/containers/storage/overlay/b14deb1ebe15afb706b9d33ed82857ae0280e5d79d245362e441c6a3f34625bf/merged major:0 minor:566 fsType:overlay blockSize:0} overlay_0-568:{mountpoint:/var/lib/containers/storage/overlay/36633c78cd32a68e26f109a69bde61f4e0170091fb61bac9c9f2a81521182971/merged major:0 minor:568 fsType:overlay blockSize:0} overlay_0-570:{mountpoint:/var/lib/containers/storage/overlay/8596cc5fa0f4364650b768d731deb07351fc4e4024917ab4c0577483ec2fecae/merged 
major:0 minor:570 fsType:overlay blockSize:0} overlay_0-578:{mountpoint:/var/lib/containers/storage/overlay/bc12d5952a309d5cec69eee538aaffa937df12ab9f60faf4b08f55a981128a66/merged major:0 minor:578 fsType:overlay blockSize:0} overlay_0-58:{mountpoint:/var/lib/containers/storage/overlay/33a159e28438638ac2d2d46c318403ff392230e55a698e418a6fa61d0053ed7f/merged major:0 minor:58 fsType:overlay blockSize:0} overlay_0-586:{mountpoint:/var/lib/containers/storage/overlay/1636b5ba65c66941e484fec6829ce7d9f553f7c8b193fed740325ee69cd6da1c/merged major:0 minor:586 fsType:overlay blockSize:0} overlay_0-600:{mountpoint:/var/lib/containers/storage/overlay/98f53818000530312ee877870cf3f333c13fe4c90f736970291c3a1e7b20ea83/merged major:0 minor:600 fsType:overlay blockSize:0} overlay_0-602:{mountpoint:/var/lib/containers/storage/overlay/c7e993ce1ad0eae1303f1f0e897d8deab7249ec42ad953acd50488dfd2cf76e5/merged major:0 minor:602 fsType:overlay blockSize:0} overlay_0-604:{mountpoint:/var/lib/containers/storage/overlay/b3d175bab8cad9ef7b6b1dbbdc443a7208bfc2a1c310fbb6b6dd3548544f8dc3/merged major:0 minor:604 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/e3923e8cce784e95881e391276cd0c1646d15a40caeabff733b9e204e0c03e21/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-628:{mountpoint:/var/lib/containers/storage/overlay/d1065c79c21806d2ae644bd931167e5157d001f7df123e345bfa23363b56c82d/merged major:0 minor:628 fsType:overlay blockSize:0} overlay_0-630:{mountpoint:/var/lib/containers/storage/overlay/a5935f11fb0739b5c2bcf7e418f3d3f2ed4d099e60d3c1a60f197a0410225ff9/merged major:0 minor:630 fsType:overlay blockSize:0} overlay_0-634:{mountpoint:/var/lib/containers/storage/overlay/fa99e68ba1c4bcbab30f789d8d9160d66ffb2208aa28fad42546ca717e1d8e22/merged major:0 minor:634 fsType:overlay blockSize:0} overlay_0-640:{mountpoint:/var/lib/containers/storage/overlay/47760bd56bc29ceac1f6fab0bab47bf6398dbc3935695fff188a13c2c9d2f042/merged major:0 minor:640 fsType:overlay blockSize:0} overlay_0-649:{mountpoint:/var/lib/containers/storage/overlay/0937b078453590486644337f909bc3ccd0afb0d23cbd1af7b0c91c61c284ebe8/merged major:0 minor:649 fsType:overlay blockSize:0} overlay_0-652:{mountpoint:/var/lib/containers/storage/overlay/5e6e27d940b25017750d95f7a5a284bf300fdf839d1fd4f8654a5ee5599c570f/merged major:0 minor:652 fsType:overlay blockSize:0} overlay_0-654:{mountpoint:/var/lib/containers/storage/overlay/bfff27e242048a93df35fe44d4b6289269f58758e1923545124588919ea31a78/merged major:0 minor:654 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/9f1ff526e181b56abc7611673b866736ce126cb83ac65cb1463e251a1e5d653b/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-665:{mountpoint:/var/lib/containers/storage/overlay/5c296c629728e1d0ae6a877fb1c81a0daf233875ee3e820e430cd310f88d834f/merged major:0 minor:665 fsType:overlay blockSize:0} overlay_0-671:{mountpoint:/var/lib/containers/storage/overlay/00318ad7f89a7f316d4563178d7e73f5f60f322ef08ef2a7ca423a5b4d54e138/merged major:0 minor:671 fsType:overlay blockSize:0} overlay_0-674:{mountpoint:/var/lib/containers/storage/overlay/8876b4888064815887bb8fe5376928008112e3b9301bff61860e49fd53fe823b/merged major:0 minor:674 fsType:overlay blockSize:0} overlay_0-676:{mountpoint:/var/lib/containers/storage/overlay/adfda51304475c93ebd14962981ad30fa530c540baab8244911f50ed46d127dd/merged major:0 minor:676 fsType:overlay blockSize:0} 
overlay_0-68:{mountpoint:/var/lib/containers/storage/overlay/81ca16bd973c4852101ec7e4938e3f1714d78564d3f7cc48706a91a5451f2909/merged major:0 minor:68 fsType:overlay blockSize:0} overlay_0-682:{mountpoint:/var/lib/containers/storage/overlay/7a36e54696633d919d40b8d2fa39b1fdbe5991cfbbdfd715f3f18124ce395dae/merged major:0 minor:682 fsType:overlay blockSize:0} overlay_0-71:{mountpoint:/var/lib/containers/storage/overlay/a32b9a72a8a898916ce499a850b7a907c1dc1c949da1df95a0636f4e2173026f/merged major:0 minor:71 fsType:overlay blockSize:0} overlay_0-729:{mountpoint:/var/lib/containers/storage/overlay/bd421de40ac55fcb055164771f8e92831fdc74b52562730ea990711f56c722f3/merged major:0 minor:729 fsType:overlay blockSize:0} overlay_0-730:{mountpoint:/var/lib/containers/storage/overlay/ab87b9ac4533d41270a563e867f6a265a346b7d03b780008d747577d05884c4a/merged major:0 minor:730 fsType:overlay blockSize:0} overlay_0-736:{mountpoint:/var/lib/containers/storage/overlay/26f6eaf9a893c350d5b5aaee8fdd27c77147712c114e3c02dcd0d2368f3c36d9/merged major:0 minor:736 fsType:overlay blockSize:0} overlay_0-738:{mountpoint:/var/lib/containers/storage/overlay/ec8cf8538b05fe90453749e8cb58e7e2f5b5c9f2545ea8db3238713dbef75e65/merged major:0 minor:738 fsType:overlay blockSize:0} overlay_0-740:{mountpoint:/var/lib/containers/storage/overlay/6a767ccaba5e358a8822fa79772d05f13564b10dd2e78d32504f5b169fc3da69/merged major:0 minor:740 fsType:overlay blockSize:0} overlay_0-744:{mountpoint:/var/lib/containers/storage/overlay/f5b275a2573811a4544583eb3691e09bde22ed7301526f1efafaa74a0cab2aed/merged major:0 minor:744 fsType:overlay blockSize:0} overlay_0-746:{mountpoint:/var/lib/containers/storage/overlay/f1754c5638320a460047542edee7e3f15ea915bb7071e37696ba61f2bce06a7c/merged major:0 minor:746 fsType:overlay blockSize:0} overlay_0-748:{mountpoint:/var/lib/containers/storage/overlay/83ca32fee339709b7edc9b2b493e9fe1f48600e501415881473894fa2b704e7b/merged major:0 minor:748 fsType:overlay blockSize:0} overlay_0-750:{mountpoint:/var/lib/containers/storage/overlay/ca005acbee919785eaa8220a63d9a454e7f36b16b381098fbfed7f1a7758ca68/merged major:0 minor:750 fsType:overlay blockSize:0} overlay_0-752:{mountpoint:/var/lib/containers/storage/overlay/cd648d12a0ff4a98be6878639cd4e0ae752631061c0355650624fbf7cca6d107/merged major:0 minor:752 fsType:overlay blockSize:0} overlay_0-755:{mountpoint:/var/lib/containers/storage/overlay/b6d119b022b1cba081339022ea471dcb59995f9f36c515861e3261a95d8cdd53/merged major:0 minor:755 fsType:overlay blockSize:0} overlay_0-766:{mountpoint:/var/lib/containers/storage/overlay/dfb66a5fd3942e9a8cd49884dc618438d53f6f7885c240fd7f7a9eccf0a6e9a1/merged major:0 minor:766 fsType:overlay blockSize:0} overlay_0-768:{mountpoint:/var/lib/containers/storage/overlay/7c9d51d154c30dd7d15fa9abdd1777c5eb7ba6b86d7533bce8b99c98943cf8dd/merged major:0 minor:768 fsType:overlay blockSize:0} overlay_0-770:{mountpoint:/var/lib/containers/storage/overlay/83530b750f7bdc9a634723dbc32f28f50e9132345df0f6f625cd753eb006563e/merged major:0 minor:770 fsType:overlay blockSize:0} overlay_0-772:{mountpoint:/var/lib/containers/storage/overlay/a484148fe56f632b833523f3684f198ef9c5c74225f9f2a0acd15de709eed6b1/merged major:0 minor:772 fsType:overlay blockSize:0} overlay_0-774:{mountpoint:/var/lib/containers/storage/overlay/c780b4eb1869d2d701993300b0fb45d9181089a41a8b6bfa09008a044b52aa92/merged major:0 minor:774 fsType:overlay blockSize:0} overlay_0-776:{mountpoint:/var/lib/containers/storage/overlay/395a05a9d1a836fb7c0f229b67515a3946473f7ba6f477ed19551a43e3206627/merged 
major:0 minor:776 fsType:overlay blockSize:0} overlay_0-778:{mountpoint:/var/lib/containers/storage/overlay/b55e8d6eeb8516686dacd19534a7764158e026c0f7a845fcfa1d2b67126a9d59/merged major:0 minor:778 fsType:overlay blockSize:0} overlay_0-78:{mountpoint:/var/lib/containers/storage/overlay/397e7d766ea960fcccb342ac0fc2449130bfc614c99d604094a1cf80e8395659/merged major:0 minor:78 fsType:overlay blockSize:0} overlay_0-780:{mountpoint:/var/lib/containers/storage/overlay/3ef427b0165f4a666ab2259da0ee22b34759d7eec104ddc046d86ad556fce1fb/merged major:0 minor:780 fsType:overlay blockSize:0} overlay_0-790:{mountpoint:/var/lib/containers/storage/overlay/ea1e552485240c992192c7e58eae3208c044d17503496c81032dc685290868fa/merged major:0 minor:790 fsType:overlay blockSize:0} overlay_0-792:{mountpoint:/var/lib/containers/storage/overlay/51326b31a62465f54be25985bd4af9b9dd469e6c28cd483b113317452701a417/merged major:0 minor:792 fsType:overlay blockSize:0} overlay_0-800:{mountpoint:/var/lib/containers/storage/overlay/79dc549f9cf59aa430c10f820161f45c1b51efda46eb5cdd89996a83a35ba907/merged major:0 minor:800 fsType:overlay blockSize:0} overlay_0-802:{mountpoint:/var/lib/containers/storage/overlay/46f020de05d1713145f5ff115a60db405af3d73d9ed784835ae7203775825373/merged major:0 minor:802 fsType:overlay blockSize:0} overlay_0-81:{mountpoint:/var/lib/containers/storage/overlay/6487700845a74c7fd612e2638ca2f57a8c0c63a3c5899457212ac8a8776e1505/merged major:0 minor:81 fsType:overlay blockSize:0} overlay_0-810:{mountpoint:/var/lib/containers/storage/overlay/2b531e3c81c7951f1099c4dcccb3e8cdac0dcbd5c254fce9e1d53660fc3906f2/merged major:0 minor:810 fsType:overlay blockSize:0} overlay_0-814:{mountpoint:/var/lib/containers/storage/overlay/1b314784b3d55aea1152386349216997c8217a4f5723548aa165131de070dc10/merged major:0 minor:814 fsType:overlay blockSize:0} overlay_0-817:{mountpoint:/var/lib/containers/storage/overlay/aea2cb62bb53bbdb41a4ef70a5ca97fce7c52e128b87607e81ffe1d6267843a6/merged major:0 minor:817 fsType:overlay blockSize:0} overlay_0-837:{mountpoint:/var/lib/containers/storage/overlay/83f1bb2910a75e99f982dd73a82450890a8712e1d368d2b185f5ce20e00a6bc1/merged major:0 minor:837 fsType:overlay blockSize:0} overlay_0-839:{mountpoint:/var/lib/containers/storage/overlay/5928e4b6e2b1a5f931d3c2be20e1004050f2db8127978e87e2d6df11c8dc000c/merged major:0 minor:839 fsType:overlay blockSize:0} overlay_0-845:{mountpoint:/var/lib/containers/storage/overlay/e2b4bce79b71ee928ac1f5216a7ea0587652d854181e35fb46e8ebb168359957/merged major:0 minor:845 fsType:overlay blockSize:0} overlay_0-86:{mountpoint:/var/lib/containers/storage/overlay/4240ad2d14bbed89e5f7a3fc1f6d30cf6318c32ce372afc5c3454076664ad0c8/merged major:0 minor:86 fsType:overlay blockSize:0} overlay_0-860:{mountpoint:/var/lib/containers/storage/overlay/b5aac927a59e57f172b929eef40ad3802938ba745b14969ac848c9742deba4c8/merged major:0 minor:860 fsType:overlay blockSize:0} overlay_0-88:{mountpoint:/var/lib/containers/storage/overlay/1050057ba311176e2decca520748e9b605e2aa5fd676a584d10b78a35774f4b1/merged major:0 minor:88 fsType:overlay blockSize:0} overlay_0-893:{mountpoint:/var/lib/containers/storage/overlay/bda8c52a29906cea81b601eef0c8a0e959f1d1244ef740606d6ece4f0a04c600/merged major:0 minor:893 fsType:overlay blockSize:0} overlay_0-908:{mountpoint:/var/lib/containers/storage/overlay/8a36bd751a70368b723d5fdcbfdb30bbf760187a97e9e4c39cf1d0ef6fb654bf/merged major:0 minor:908 fsType:overlay blockSize:0} 
overlay_0-918:{mountpoint:/var/lib/containers/storage/overlay/24d72726e4703b190ac60759fe755074d769c00584196f2c428fb69837f9be6c/merged major:0 minor:918 fsType:overlay blockSize:0} overlay_0-938:{mountpoint:/var/lib/containers/storage/overlay/909d5d27e3f9bb54378cdaed2299cdd709673ad897d59864da44dcc9602e1f7c/merged major:0 minor:938 fsType:overlay blockSize:0} overlay_0-948:{mountpoint:/var/lib/containers/storage/overlay/f0b778662d16b69d2019edcf46cea37b9da424ec86b6e662a7d3944b6f81f8d0/merged major:0 minor:948 fsType:overlay blockSize:0} overlay_0-950:{mountpoint:/var/lib/containers/storage/overlay/b97f430e7c16a30319090e10b612229f29061efa60556169b08d395c908a34c2/merged major:0 minor:950 fsType:overlay blockSize:0} overlay_0-96:{mountpoint:/var/lib/containers/storage/overlay/8a5f5c286f229bf0c0f7beda245cd3e9586b91e8709bf836447f3e9af9a57418/merged major:0 minor:96 fsType:overlay blockSize:0} overlay_0-960:{mountpoint:/var/lib/containers/storage/overlay/bf2a645898b16344baf7370edf8e4ee724284a94823e178701e84dcb27af784e/merged major:0 minor:960 fsType:overlay blockSize:0} overlay_0-965:{mountpoint:/var/lib/containers/storage/overlay/abf6a57c371670a63343a1b71c4895fac7959c972bfd0594c77d9e33177bcdcd/merged major:0 minor:965 fsType:overlay blockSize:0} overlay_0-967:{mountpoint:/var/lib/containers/storage/overlay/a92957f0160bb27fa92217bafc53eecb228e2e0902436a14338c3e94b7245bcb/merged major:0 minor:967 fsType:overlay blockSize:0} overlay_0-977:{mountpoint:/var/lib/containers/storage/overlay/cac41a76347353f02e73865bdeee51598173dc4f25fae3ffa362617791b979c0/merged major:0 minor:977 fsType:overlay blockSize:0} overlay_0-983:{mountpoint:/var/lib/containers/storage/overlay/e63d8ec60191150d6cb6f8d9d78e7b90df0427799815ccf76e94437c9daff8e8/merged major:0 minor:983 fsType:overlay blockSize:0} overlay_0-989:{mountpoint:/var/lib/containers/storage/overlay/0d4459e1a389aa48e533f0e6aac18571afdaad404c1963fcf242c4cc303c4fa4/merged major:0 minor:989 fsType:overlay blockSize:0} overlay_0-991:{mountpoint:/var/lib/containers/storage/overlay/b477286dc9cc21ee80c8318d43b42c17caa193b28ed68701a94f945b9c803a8f/merged major:0 minor:991 fsType:overlay blockSize:0}] Dec 05 12:50:03.094987 master-0 kubenswrapper[29936]: I1205 12:50:03.093518 29936 manager.go:217] Machine: {Timestamp:2025-12-05 12:50:03.092546025 +0000 UTC m=+0.224625726 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2799998 MemoryCapacity:50514145280 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:7ed1cb80ed224980aa762c96e2471f55 SystemUUID:7ed1cb80-ed22-4980-aa76-2c96e2471f55 BootID:195a1d65-51c2-44ad-9194-26630da59f9f Filesystems:[{Device:overlay_0-1123 DeviceMajor:0 DeviceMinor:1123 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-146 DeviceMajor:0 DeviceMinor:146 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-244 DeviceMajor:0 DeviceMinor:244 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1456 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1495 DeviceMajor:0 DeviceMinor:1495 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-790 DeviceMajor:0 DeviceMinor:790 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:overlay_0-570 DeviceMajor:0 DeviceMinor:570 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1595 DeviceMajor:0 DeviceMinor:1595 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ee402b16b01951f980b833d7daf2d0304b91018363304b2cfe0e79874029cf9d/userdata/shm DeviceMajor:0 DeviceMinor:187 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/20a72c8b-0f12-446b-8a42-53d98864c8f8/volumes/kubernetes.io~secret/stats-auth DeviceMajor:0 DeviceMinor:1219 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/02429253d825f84b6e8d3688d755956761f60c5751e2535a15ebb536be6b1f94/userdata/shm DeviceMajor:0 DeviceMinor:112 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-481 DeviceMajor:0 DeviceMinor:481 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/db2e54b6-4879-40f4-9359-a8b0c31e76c2/volumes/kubernetes.io~secret/profile-collector-cert DeviceMajor:0 DeviceMinor:936 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1118 DeviceMajor:0 DeviceMinor:1118 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dc5db54b-094f-4c36-a0ad-042e9fc2b61d/volumes/kubernetes.io~projected/kube-api-access-tjkjz DeviceMajor:0 DeviceMinor:1392 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-792 DeviceMajor:0 DeviceMinor:792 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-800 DeviceMajor:0 DeviceMinor:800 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-604 DeviceMajor:0 DeviceMinor:604 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-463 DeviceMajor:0 DeviceMinor:463 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c44264ca51ad61ed3b05ffa4c975691fd7debf64dbafd9a640308d225a077e0b/userdata/shm DeviceMajor:0 DeviceMinor:713 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/e5dfcb1e-1231-4f07-8c21-748965718729/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:77 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-68 DeviceMajor:0 DeviceMinor:68 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257070592 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/volumes/kubernetes.io~projected/kube-api-access-dtvzs DeviceMajor:0 DeviceMinor:295 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-748 DeviceMajor:0 DeviceMinor:748 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-485 DeviceMajor:0 DeviceMinor:485 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1376 DeviceMajor:0 DeviceMinor:1376 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b852dfb0ed7374453aa61f11c0df40cc142ce70b6943ce06b264cc249753a13b/userdata/shm DeviceMajor:0 DeviceMinor:149 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} 
{Device:overlay_0-965 DeviceMajor:0 DeviceMinor:965 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5f0c6889-0739-48a3-99cd-6db9d1f83242/volumes/kubernetes.io~projected/kube-api-access-p5p5d DeviceMajor:0 DeviceMinor:329 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/f2635f9f-219b-4d03-b5b3-496c0c836fae/volumes/kubernetes.io~empty-dir/tmp DeviceMajor:0 DeviceMinor:904 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a280c582-685e-47ac-bf6b-248aa0c129a9/volumes/kubernetes.io~projected/kube-api-access-xkqq7 DeviceMajor:0 DeviceMinor:951 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1e7e859b537def1a21239198a62664ddf26773c1c6f156f411606722ed8cb4e6/userdata/shm DeviceMajor:0 DeviceMinor:986 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/49f2f301b501743d7a4254bc3eeb040151fb199e2a4d9ec64ddce3a74ce66f5b/userdata/shm DeviceMajor:0 DeviceMinor:325 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-977 DeviceMajor:0 DeviceMinor:977 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1202 DeviceMajor:0 DeviceMinor:1202 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/60327040-f782-4cda-a32d-52a4f183073c/volumes/kubernetes.io~projected/kube-api-access-zp957 DeviceMajor:0 DeviceMinor:1460 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-738 DeviceMajor:0 DeviceMinor:738 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-948 DeviceMajor:0 DeviceMinor:948 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1065 DeviceMajor:0 DeviceMinor:1065 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-729 DeviceMajor:0 DeviceMinor:729 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b8233dad-bd19-4842-a4d5-cfa84f1feb83/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:186 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-356 DeviceMajor:0 DeviceMinor:356 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-772 DeviceMajor:0 DeviceMinor:772 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6beaecf0540643cd8682361578d468ced3e3fd0c3495c281547ab1933154b6de/userdata/shm DeviceMajor:0 DeviceMinor:1104 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/e943438b-1de8-435c-8a19-accd6a6292a4/volumes/kubernetes.io~projected/kube-api-access-lfknz DeviceMajor:0 DeviceMinor:101 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/0dda6d9b-cb3a-413a-85af-ef08f15ea42e/volumes/kubernetes.io~projected/kube-api-access-62nqj DeviceMajor:0 DeviceMinor:300 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9fd6db41eb8dc90e6efffc25bb3c93739722e6824dad0dcb9a786720bc6514c4/userdata/shm DeviceMajor:0 DeviceMinor:337 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/743ece8bb6e404056a2fb9957949cb0a30330d99bb6dbc633553c08d0fb45759/userdata/shm DeviceMajor:0 DeviceMinor:704 Capacity:67108864 
Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-671 DeviceMajor:0 DeviceMinor:671 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-802 DeviceMajor:0 DeviceMinor:802 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/065b5ff0754f03af8b21df75fad6ff50fe29b9c92ca5f839b6b57c232043c975/userdata/shm DeviceMajor:0 DeviceMinor:716 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-776 DeviceMajor:0 DeviceMinor:776 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1322 DeviceMajor:0 DeviceMinor:1322 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/60327040-f782-4cda-a32d-52a4f183073c/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1457 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76/volumes/kubernetes.io~secret/openshift-state-metrics-tls DeviceMajor:0 DeviceMinor:1461 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bba3aa271baddd92ed5881d6af79fb82b3a45fce07083a5cd051cbeeb1a01428/userdata/shm DeviceMajor:0 DeviceMinor:321 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-338 DeviceMajor:0 DeviceMinor:338 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-374 DeviceMajor:0 DeviceMinor:374 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1198 DeviceMajor:0 DeviceMinor:1198 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1314 DeviceMajor:0 DeviceMinor:1314 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1497 DeviceMajor:0 DeviceMinor:1497 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/volumes/kubernetes.io~projected/kube-api-access-fxxw7 DeviceMajor:0 DeviceMinor:294 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/b74e0607-6ed0-4119-8870-895b7d336830/volumes/kubernetes.io~projected/kube-api-access-72wst DeviceMajor:0 DeviceMinor:1089 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-71 DeviceMajor:0 DeviceMinor:71 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/07bd9adb3dd2a54b1348564cac3ab912144772686d957ab49d9bf60d68718f5e/userdata/shm DeviceMajor:0 DeviceMinor:562 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/04a1540e033fc0d53be3a8dfa10cb49b28b11738b911cb185f8d919660d6db47/userdata/shm DeviceMajor:0 DeviceMinor:913 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1589 DeviceMajor:0 DeviceMinor:1589 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:296 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/38941513-e968-45f1-9cb2-b63d40338f36/volumes/kubernetes.io~projected/kube-api-access-t5hdg DeviceMajor:0 DeviceMinor:309 Capacity:49335545856 
Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1209 DeviceMajor:0 DeviceMinor:1209 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dbe144b5-3b78-4946-bbf9-b825b0e47b07/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls DeviceMajor:0 DeviceMinor:1316 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-401 DeviceMajor:0 DeviceMinor:401 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-435 DeviceMajor:0 DeviceMinor:435 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-151 DeviceMajor:0 DeviceMinor:151 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bd884dd8fbf0cb13a01d3369dc09dbcaf952157e210620f5c83187eab601232c/userdata/shm DeviceMajor:0 DeviceMinor:961 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b36190e4cf6d5a6244899784eca2665872c2f9d60ae3d454ea48fd9aa2aa3bab/userdata/shm DeviceMajor:0 DeviceMinor:327 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/92eddccae7e06f02f48401d5d5f367dae7b9b78b2bbc84b00b68ec03e90321c1/userdata/shm DeviceMajor:0 DeviceMinor:717 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-750 DeviceMajor:0 DeviceMinor:750 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-207 DeviceMajor:0 DeviceMinor:207 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-505 DeviceMajor:0 DeviceMinor:505 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1507 DeviceMajor:0 DeviceMinor:1507 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-348 DeviceMajor:0 DeviceMinor:348 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1194 DeviceMajor:0 DeviceMinor:1194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b13885ef-d2b5-4591-825d-446cf8729bc1/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:1093 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a34af96221abd2b9bf387305f2624222004ffa4b53496a2a4e5584e580bd9733/userdata/shm DeviceMajor:0 DeviceMinor:160 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3a9d8373a41ae93e2045d1c0300d43339b0c915de4cad9048741918269853b51/userdata/shm DeviceMajor:0 DeviceMinor:344 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0b836f01dcb43b6af667ba219b4059e3935a66980e122a92a279a33e963cb964/userdata/shm DeviceMajor:0 DeviceMinor:910 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb/volumes/kubernetes.io~projected/kube-api-access-ht5kr DeviceMajor:0 DeviceMinor:941 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a280c582-685e-47ac-bf6b-248aa0c129a9/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:944 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/9ca3179bcac9021f22c3e7255b372820926d29356fd67cac276625618bd240a6/userdata/shm DeviceMajor:0 DeviceMinor:963 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/containers/storage/overlay-containers/164d69c4a697b3689889d3ab2e5a66ca6c9ed1089292b441ab9282cdde612925/userdata/shm DeviceMajor:0 DeviceMinor:1399 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1395 DeviceMajor:0 DeviceMinor:1395 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-860 DeviceMajor:0 DeviceMinor:860 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-586 DeviceMajor:0 DeviceMinor:586 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-837 DeviceMajor:0 DeviceMinor:837 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3d96c85a-fc88-46af-83d5-6c71ec6e2c23/volumes/kubernetes.io~projected/kube-api-access-ss5kh DeviceMajor:0 DeviceMinor:907 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-960 DeviceMajor:0 DeviceMinor:960 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1116 DeviceMajor:0 DeviceMinor:1116 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ce3d73c1-f4bd-4c91-936a-086dfa5e3460/volumes/kubernetes.io~projected/kube-api-access-ph9w6 DeviceMajor:0 DeviceMinor:302 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a14df948-1ec4-4785-ad33-28d1e7063959/volumes/kubernetes.io~projected/kube-api-access-2g7n7 DeviceMajor:0 DeviceMinor:915 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1002 DeviceMajor:0 DeviceMinor:1002 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1154 DeviceMajor:0 DeviceMinor:1154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1485 DeviceMajor:0 DeviceMinor:1485 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cf8247a1-703a-46b3-9a33-25a73b27ab99/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:530 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1041 DeviceMajor:0 DeviceMinor:1041 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1030 DeviceMajor:0 DeviceMinor:1030 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9f1e76d4f58fcd22a9b3bb1871e5fda992687c0e5181ed09e4aadbd1b7953465/userdata/shm DeviceMajor:0 DeviceMinor:546 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/153fec1f-a10b-4c6c-a997-60fa80c13a86/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:561 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-652 DeviceMajor:0 DeviceMinor:652 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-730 DeviceMajor:0 DeviceMinor:730 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1122 DeviceMajor:0 DeviceMinor:1122 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5efad170-c154-42ec-a7c0-b36a98d2bfcc/volumes/kubernetes.io~projected/kube-api-access-996h9 
DeviceMajor:0 DeviceMinor:76 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1536 DeviceMajor:0 DeviceMinor:1536 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1871a9d6-6369-4d08-816f-9c6310b61ddf/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:305 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/ce9e2a6b-8ce7-477c-8bc7-24033243eabe/volumes/kubernetes.io~projected/kube-api-access-422c9 DeviceMajor:0 DeviceMinor:903 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/ce3d73c1-f4bd-4c91-936a-086dfa5e3460/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:292 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/915ebf41ba29fa9d0d989a762295214f873dc379b870a92fba77c1d033c014f1/userdata/shm DeviceMajor:0 DeviceMinor:113 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/47731386c0cb9aab3894731b6143775966f36286ae6b54927bb926129b389c33/userdata/shm DeviceMajor:0 DeviceMinor:316 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1565 DeviceMajor:0 DeviceMinor:1565 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-989 DeviceMajor:0 DeviceMinor:989 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1008 DeviceMajor:0 DeviceMinor:1008 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1474 DeviceMajor:0 DeviceMinor:1474 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-509 DeviceMajor:0 DeviceMinor:509 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1274 DeviceMajor:0 DeviceMinor:1274 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-578 DeviceMajor:0 DeviceMinor:578 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/99996137-2621-458b-980d-584b3640d4ad/volumes/kubernetes.io~projected/kube-api-access-c69rc DeviceMajor:0 DeviceMinor:1433 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1146 DeviceMajor:0 DeviceMinor:1146 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:10102829056 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/ba095394-1873-4793-969d-3be979fa0771/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:303 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a2acba71-b9dc-4b85-be35-c995b8be2f19/volumes/kubernetes.io~secret/node-tuning-operator-tls DeviceMajor:0 DeviceMinor:702 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/95d8fb27-8b2b-4749-add3-9e9b16edb693/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:1095 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1148 DeviceMajor:0 DeviceMinor:1148 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/77da36c6bf5d09d68dbf2de017a655a5a15b25fda32cba3288a3d8b2cc4b44c0/userdata/shm DeviceMajor:0 DeviceMinor:376 Capacity:67108864 Type:vfs 
Inodes:6166277 HasInodes:true} {Device:overlay_0-118 DeviceMajor:0 DeviceMinor:118 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-168 DeviceMajor:0 DeviceMinor:168 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9cfdae6ccb167d4a6f250b34ce3b8d4ec56326be1aca0a0b497bcb1caa6ac3cf/userdata/shm DeviceMajor:0 DeviceMinor:451 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb/volumes/kubernetes.io~secret/profile-collector-cert DeviceMajor:0 DeviceMinor:933 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1039 DeviceMajor:0 DeviceMinor:1039 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fb7003a6-4341-49eb-bec3-76ba8610fa12/volumes/kubernetes.io~projected/kube-api-access-69n5s DeviceMajor:0 DeviceMinor:153 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-58 DeviceMajor:0 DeviceMinor:58 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-368 DeviceMajor:0 DeviceMinor:368 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-755 DeviceMajor:0 DeviceMinor:755 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:935 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/130205999d123cc10c914ecc3cb22cde267becfbe33db09ccb0559c952bdc40f/userdata/shm DeviceMajor:0 DeviceMinor:1273 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:307 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-406 DeviceMajor:0 DeviceMinor:406 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d72b2b71-27b2-4aff-bf69-7054a9556318/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:621 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-770 DeviceMajor:0 DeviceMinor:770 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5404e1e33c358f139ce43aadf9014fd74254490d058389642b99e6aa71216243/userdata/shm DeviceMajor:0 DeviceMinor:970 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/909ed395-8ad3-4350-95e3-b4b19c682f92/volumes/kubernetes.io~secret/tls-certificates DeviceMajor:0 DeviceMinor:1212 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a2acba71-b9dc-4b85-be35-c995b8be2f19/volumes/kubernetes.io~projected/kube-api-access-nml2g DeviceMajor:0 DeviceMinor:299 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/d72b2b71-27b2-4aff-bf69-7054a9556318/volumes/kubernetes.io~projected/kube-api-access-wjp62 DeviceMajor:0 DeviceMinor:626 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-501 DeviceMajor:0 DeviceMinor:501 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/1ee7a76b-cf1d-4513-b314-5aa314da818d/volumes/kubernetes.io~secret/machine-api-operator-tls DeviceMajor:0 DeviceMinor:934 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1401 DeviceMajor:0 DeviceMinor:1401 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d3e283fe-a474-4f83-ad66-62971945060a/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:1588 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-124 DeviceMajor:0 DeviceMinor:124 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ed538f41551e0e7b372ee4dcc843f84e56fe8d6677fe847816efda02bfd61218/userdata/shm DeviceMajor:0 DeviceMinor:945 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1243 DeviceMajor:0 DeviceMinor:1243 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c5997a9e57f36847e6cb187afed936a398d9d89f0a3c5fbdaa0cdcf0b16bbffd/userdata/shm DeviceMajor:0 DeviceMinor:1262 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-199 DeviceMajor:0 DeviceMinor:199 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-426 DeviceMajor:0 DeviceMinor:426 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a14df948-1ec4-4785-ad33-28d1e7063959/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:912 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-273 DeviceMajor:0 DeviceMinor:273 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-106 DeviceMajor:0 DeviceMinor:106 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1382 DeviceMajor:0 DeviceMinor:1382 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-744 DeviceMajor:0 DeviceMinor:744 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-552 DeviceMajor:0 DeviceMinor:552 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f2635f9f-219b-4d03-b5b3-496c0c836fae/volumes/kubernetes.io~projected/kube-api-access-fwrwm DeviceMajor:0 DeviceMinor:703 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2a325da0f7b2c285fc4bf3a467e693950dfc8948d49a5740a004f6101e748cc4/userdata/shm DeviceMajor:0 DeviceMinor:328 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-774 DeviceMajor:0 DeviceMinor:774 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1013 DeviceMajor:0 DeviceMinor:1013 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-228 DeviceMajor:0 DeviceMinor:228 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e943438b-1de8-435c-8a19-accd6a6292a4/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:74 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0b9e8ef8efad8c6e16cd6e6a39269d9f5b02a38a45cb5b422afaa90713381fcb/userdata/shm DeviceMajor:0 DeviceMinor:706 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/c13f876ed14f7005d250ab3203aedc5ac3d9bddbbff7570300b321a40f59bd5c/userdata/shm DeviceMajor:0 DeviceMinor:1222 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0f8a1e4d8de6a06f67857b43e08d70d6ce0e19926ff25b49cbea007cf56e4e61/userdata/shm DeviceMajor:0 DeviceMinor:1103 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6d7e84b5ce96cc743bb3392588c9efdf14f4afe467d9a7be36705ddbb090197e/userdata/shm DeviceMajor:0 DeviceMinor:360 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-390 DeviceMajor:0 DeviceMinor:390 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/58187662-b502-4d90-95ce-2aa91a81d256/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:700 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-477 DeviceMajor:0 DeviceMinor:477 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-983 DeviceMajor:0 DeviceMinor:983 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1081 DeviceMajor:0 DeviceMinor:1081 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76/volumes/kubernetes.io~projected/kube-api-access-nqvfm DeviceMajor:0 DeviceMinor:1459 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-139 DeviceMajor:0 DeviceMinor:139 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-219 DeviceMajor:0 DeviceMinor:219 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-412 DeviceMajor:0 DeviceMinor:412 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d53a4886-db25-43a1-825a-66a9a9a58590/volumes/kubernetes.io~projected/kube-api-access-2tngh DeviceMajor:0 DeviceMinor:293 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/0dda6d9b-cb3a-413a-85af-ef08f15ea42e/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:693 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/dac2262b7105102ce37a8db95766fbd5753d50bed12fb86441b8247f4653fc04/userdata/shm DeviceMajor:0 DeviceMinor:956 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1493 DeviceMajor:0 DeviceMinor:1493 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-126 DeviceMajor:0 DeviceMinor:126 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/365bf663-fd5b-44df-a327-0438995c015d/volumes/kubernetes.io~projected/kube-api-access-lqjgb DeviceMajor:0 DeviceMinor:957 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1016 DeviceMajor:0 DeviceMinor:1016 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/60327040-f782-4cda-a32d-52a4f183073c/volumes/kubernetes.io~secret/node-exporter-tls DeviceMajor:0 DeviceMinor:1463 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1685 DeviceMajor:0 DeviceMinor:1685 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-166 DeviceMajor:0 DeviceMinor:166 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/cf8247a1-703a-46b3-9a33-25a73b27ab99/volumes/kubernetes.io~projected/kube-api-access-fqdxl DeviceMajor:0 DeviceMinor:531 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/38941513-e968-45f1-9cb2-b63d40338f36/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:701 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-810 DeviceMajor:0 DeviceMinor:810 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7306701b7f1e349175a899928ef136fbd77aaa68bd4675a9b0f16eeeda9ca379/userdata/shm DeviceMajor:0 DeviceMinor:1629 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-236 DeviceMajor:0 DeviceMinor:236 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a2acba71-b9dc-4b85-be35-c995b8be2f19/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:697 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1006 DeviceMajor:0 DeviceMinor:1006 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d53a4886-db25-43a1-825a-66a9a9a58590/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:291 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/d72b2b71-27b2-4aff-bf69-7054a9556318/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:622 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1472 DeviceMajor:0 DeviceMinor:1472 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1577 DeviceMajor:0 DeviceMinor:1577 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f3792522-fec6-4022-90ac-0b8467fcd625/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:290 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-415 DeviceMajor:0 DeviceMinor:415 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-433 DeviceMajor:0 DeviceMinor:433 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1258 DeviceMajor:0 DeviceMinor:1258 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2/volumes/kubernetes.io~secret/kube-state-metrics-tls DeviceMajor:0 DeviceMinor:1462 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-358 DeviceMajor:0 DeviceMinor:358 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-550 DeviceMajor:0 DeviceMinor:550 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-752 DeviceMajor:0 DeviceMinor:752 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/665c4362-e2e5-4f96-92c0-1746c63c7422/volumes/kubernetes.io~projected/kube-api-access-lckv7 DeviceMajor:0 DeviceMinor:796 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/531b8927-92db-4e9d-9a0a-12ff948cdaad/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls DeviceMajor:0 DeviceMinor:942 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1381 DeviceMajor:0 DeviceMinor:1381 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:289 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~projected/kube-api-access-wwcr9 DeviceMajor:0 DeviceMinor:319 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-746 DeviceMajor:0 DeviceMinor:746 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-437 DeviceMajor:0 DeviceMinor:437 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f2635f9f-219b-4d03-b5b3-496c0c836fae/volumes/kubernetes.io~empty-dir/etc-tuned DeviceMajor:0 DeviceMinor:905 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-162 DeviceMajor:0 DeviceMinor:162 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-209 DeviceMajor:0 DeviceMinor:209 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1ee7a76b-cf1d-4513-b314-5aa314da818d/volumes/kubernetes.io~projected/kube-api-access-lkdtr DeviceMajor:0 DeviceMinor:943 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1476 DeviceMajor:0 DeviceMinor:1476 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3d66257a9a5cc16c308a04623948fb3eceefd2f34694e08267e4f17ec43d3782/userdata/shm DeviceMajor:0 DeviceMinor:320 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-600 DeviceMajor:0 DeviceMinor:600 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5f0c6889-0739-48a3-99cd-6db9d1f83242/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:695 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-398 DeviceMajor:0 DeviceMinor:398 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/db2e54b6-4879-40f4-9359-a8b0c31e76c2/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:940 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-183 DeviceMajor:0 DeviceMinor:183 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c39d2089-d3bf-4556-b6ef-c362a08c21a2/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:70 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/20a72c8b-0f12-446b-8a42-53d98864c8f8/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:1213 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1257 DeviceMajor:0 DeviceMinor:1257 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/99996137-2621-458b-980d-584b3640d4ad/volumes/kubernetes.io~secret/prometheus-operator-tls DeviceMajor:0 DeviceMinor:1432 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1520 DeviceMajor:0 DeviceMinor:1520 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/010fcb3fd705e5d750eedd1adb06872aa524c08fbc85d2a921261129ee9bc96b/userdata/shm DeviceMajor:0 DeviceMinor:1100 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} 
{Device:overlay_0-1028 DeviceMajor:0 DeviceMinor:1028 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-120 DeviceMajor:0 DeviceMinor:120 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-252 DeviceMajor:0 DeviceMinor:252 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-967 DeviceMajor:0 DeviceMinor:967 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-991 DeviceMajor:0 DeviceMinor:991 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-674 DeviceMajor:0 DeviceMinor:674 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1336 DeviceMajor:0 DeviceMinor:1336 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1436 DeviceMajor:0 DeviceMinor:1436 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-172 DeviceMajor:0 DeviceMinor:172 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6a7ef281a34ccfa6602eba73eaa73316754fbb0bb6c1935a7c44c597fce5d077/userdata/shm DeviceMajor:0 DeviceMinor:1301 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1168 DeviceMajor:0 DeviceMinor:1168 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-366 DeviceMajor:0 DeviceMinor:366 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1032 DeviceMajor:0 DeviceMinor:1032 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1060 DeviceMajor:0 DeviceMinor:1060 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a5338041-f213-46ef-9d81-248567ba958d/volumes/kubernetes.io~secret/secret-metrics-client-certs DeviceMajor:0 DeviceMinor:1545 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1369 DeviceMajor:0 DeviceMinor:1369 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c9238078b14a694c40b63db5c3f18b28faafcb8ecbd14ef862a7acac34f2ffa6/userdata/shm DeviceMajor:0 DeviceMinor:350 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9abf289d98169b2aa959495298e72df522e02a710723a8c85b99355af8b7eae3/userdata/shm DeviceMajor:0 DeviceMinor:632 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/19edfec7b5dad95038c7d84a7af049f95270320317e900ea90d94c12477f0556/userdata/shm DeviceMajor:0 DeviceMinor:709 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:308 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-950 DeviceMajor:0 DeviceMinor:950 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3fc55735af7e7e6d6e15c1fa34cd05fc0468a74822467cb4ea7df9c2efc6cd2f/userdata/shm DeviceMajor:0 DeviceMinor:1267 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1276 DeviceMajor:0 DeviceMinor:1276 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1619 DeviceMajor:0 DeviceMinor:1619 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-351 DeviceMajor:0 DeviceMinor:351 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3b741029-0eb5-409b-b7f1-95e8385dc400/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 DeviceMinor:595 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1374 DeviceMajor:0 DeviceMinor:1374 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/594aaded-5615-4bed-87ee-6173059a73be/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:315 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-380 DeviceMajor:0 DeviceMinor:380 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ce9e2a6b-8ce7-477c-8bc7-24033243eabe/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:902 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a280c582-685e-47ac-bf6b-248aa0c129a9/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls DeviceMajor:0 DeviceMinor:937 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9a083a2de33da77d47cd60a3708aaf6bb8591ce81eba8d8e42788e2c8c58ecd3/userdata/shm DeviceMajor:0 DeviceMinor:707 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/ba095394-1873-4793-969d-3be979fa0771/volumes/kubernetes.io~projected/kube-api-access-55qpg DeviceMajor:0 DeviceMinor:314 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/49760d62-02e5-4882-b47f-663102b04946/volumes/kubernetes.io~projected/kube-api-access-26x2z DeviceMajor:0 DeviceMinor:335 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-138 DeviceMajor:0 DeviceMinor:138 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1438 DeviceMajor:0 DeviceMinor:1438 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-817 DeviceMajor:0 DeviceMinor:817 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/db2e54b6-4879-40f4-9359-a8b0c31e76c2/volumes/kubernetes.io~projected/kube-api-access-nwz29 DeviceMajor:0 DeviceMinor:947 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1250 DeviceMajor:0 DeviceMinor:1250 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1188 DeviceMajor:0 DeviceMinor:1188 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/708bf629-9949-4b79-a88a-c73ba033475b/volumes/kubernetes.io~projected/kube-api-access-6vx2z DeviceMajor:0 DeviceMinor:148 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/38941513-e968-45f1-9cb2-b63d40338f36/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:341 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-102 DeviceMajor:0 DeviceMinor:102 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b13885ef-d2b5-4591-825d-446cf8729bc1/volumes/kubernetes.io~projected/kube-api-access-xmjkp DeviceMajor:0 DeviceMinor:1091 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1566 DeviceMajor:0 DeviceMinor:1566 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:overlay_0-503 DeviceMajor:0 DeviceMinor:503 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-144 DeviceMajor:0 DeviceMinor:144 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-634 DeviceMajor:0 DeviceMinor:634 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fb7003a6-4341-49eb-bec3-76ba8610fa12/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:699 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1136 DeviceMajor:0 DeviceMinor:1136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-768 DeviceMajor:0 DeviceMinor:768 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/df3031001bb8ce6924d98db7ed12f84815ddd5de33ab7d2a19bcefd503d510dd/userdata/shm DeviceMajor:0 DeviceMinor:311 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-640 DeviceMajor:0 DeviceMinor:640 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1e6babfe-724a-4eab-bb3b-bc318bf57b70/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:694 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-420 DeviceMajor:0 DeviceMinor:420 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f/volumes/kubernetes.io~secret/samples-operator-tls DeviceMajor:0 DeviceMinor:795 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a45f340c-0eca-4460-8961-4ca360467eeb/volumes/kubernetes.io~projected/kube-api-access-r7ftf DeviceMajor:0 DeviceMinor:929 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1447 DeviceMajor:0 DeviceMinor:1447 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d3e283fe-a474-4f83-ad66-62971945060a/volumes/kubernetes.io~projected/kube-api-access-pjbwh DeviceMajor:0 DeviceMinor:1613 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/44e741be030df14b7e9e415d32f4095c562d693609b8dc4bd8ec51c21503bbca/userdata/shm DeviceMajor:0 DeviceMinor:334 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/e5dfcb1e-1231-4f07-8c21-748965718729/volumes/kubernetes.io~projected/kube-api-access-pb46q DeviceMajor:0 DeviceMinor:922 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1204 DeviceMajor:0 DeviceMinor:1204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1393 DeviceMajor:0 DeviceMinor:1393 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1674 DeviceMajor:0 DeviceMinor:1674 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-116 DeviceMajor:0 DeviceMinor:116 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-630 DeviceMajor:0 DeviceMinor:630 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-649 DeviceMajor:0 DeviceMinor:649 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dbe144b5-3b78-4946-bbf9-b825b0e47b07/volumes/kubernetes.io~projected/kube-api-access-mbg7w DeviceMajor:0 DeviceMinor:1321
Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-260 DeviceMajor:0 DeviceMinor:260 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1186 DeviceMajor:0 DeviceMinor:1186 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-86 DeviceMajor:0 DeviceMinor:86 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4ed24c6b6f900a1eeba45b567c2d9336f6c8e081eea3b175ce81e0e583f37f25/userdata/shm DeviceMajor:0 DeviceMinor:981 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/62b006cd51c7d10f8e6f8e36ec2fbd7c2b472a5db5854f2056fdbe13f97f07e2/userdata/shm DeviceMajor:0 DeviceMinor:1004 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/db27bee9-3d33-4c4a-b38b-72f7cec77c7a/volumes/kubernetes.io~secret/machine-approver-tls DeviceMajor:0 DeviceMinor:1263 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:576 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/8defe125-1529-4091-adff-e9d17a2b298f/volumes/kubernetes.io~projected/kube-api-access-jpxqg DeviceMajor:0 DeviceMinor:1090 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fafe50d6690c2fbac658b4db9e7e7d0a871a9941f8ee2fd5f2fce340df7fd5f6/userdata/shm DeviceMajor:0 DeviceMinor:1470 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1528 DeviceMajor:0 DeviceMinor:1528 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a5338041-f213-46ef-9d81-248567ba958d/volumes/kubernetes.io~projected/kube-api-access-bnwdh DeviceMajor:0 DeviceMinor:1547 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1234ab8fb98aae2372aaa8236a21f36a20e417c28feeae32f634a7022c473171/userdata/shm DeviceMajor:0 DeviceMinor:1548 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/4d215811-6210-4ec2-8356-f1533dc43f65/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:1634 Capacity:200003584 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/12c707b6a686095bb6b918fa64b447ec88e080a7e32878fed57fd6822470f9a2/userdata/shm DeviceMajor:0 DeviceMinor:955 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/3b741029-0eb5-409b-b7f1-95e8385dc400/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:492 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e67f95f822c645d6f2dd2098e7e055983609569dd0acfdc0e0bea037bf8d6c03/userdata/shm DeviceMajor:0 DeviceMinor:719 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/365bf663-fd5b-44df-a327-0438995c015d/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:954 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/e2e2d968-9946-4711-aaf0-3e3a03bff415/volumes/kubernetes.io~projected/kube-api-access-pxwwh DeviceMajor:0 DeviceMinor:1221 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} 
{Device:/var/lib/kubelet/pods/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/volumes/kubernetes.io~projected/kube-api-access-xtjln DeviceMajor:0 DeviceMinor:310 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-470 DeviceMajor:0 DeviceMinor:470 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:594 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-736 DeviceMajor:0 DeviceMinor:736 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-893 DeviceMajor:0 DeviceMinor:893 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1552 DeviceMajor:0 DeviceMinor:1552 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-780 DeviceMajor:0 DeviceMinor:780 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-268 DeviceMajor:0 DeviceMinor:268 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-740 DeviceMajor:0 DeviceMinor:740 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1196 DeviceMajor:0 DeviceMinor:1196 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1478a21e-b6ac-46fb-ad01-805ac71f0a79/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:1291 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-766 DeviceMajor:0 DeviceMinor:766 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-81 DeviceMajor:0 DeviceMinor:81 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1871a9d6-6369-4d08-816f-9c6310b61ddf/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:324 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volumes/kubernetes.io~projected/kube-api-access-dmq98 DeviceMajor:0 DeviceMinor:159 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/1e6babfe-724a-4eab-bb3b-bc318bf57b70/volumes/kubernetes.io~projected/kube-api-access-c2gd8 DeviceMajor:0 DeviceMinor:318 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/7d0792bf-e2da-4ee7-91fe-032299cea42f/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:993 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/20a72c8b-0f12-446b-8a42-53d98864c8f8/volumes/kubernetes.io~secret/default-certificate DeviceMajor:0 DeviceMinor:1211 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1162 DeviceMajor:0 DeviceMinor:1162 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/38012800baf13255ee676c8bd3688f9cc8eb6dcd0e296ee14ea80782e75670a8/userdata/shm DeviceMajor:0 DeviceMinor:122 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/153fec1f-a10b-4c6c-a997-60fa80c13a86/volumes/kubernetes.io~projected/kube-api-access-dr2r9 DeviceMajor:0 DeviceMinor:544 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/95d8fb27-8b2b-4749-add3-9e9b16edb693/volumes/kubernetes.io~projected/kube-api-access-fb42t DeviceMajor:0 DeviceMinor:1096 Capacity:49335545856 Type:vfs 
Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/95fb5697edafbf4a316d98f995b9941dad32b61de9fdb2705dcb30f672d4ab5b/userdata/shm DeviceMajor:0 DeviceMinor:1334 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/99996137-2621-458b-980d-584b3640d4ad/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1431 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-523 DeviceMajor:0 DeviceMinor:523 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6/volumes/kubernetes.io~projected/kube-api-access-mlnqb DeviceMajor:0 DeviceMinor:343 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/4492c55f-701b-4ec8-ada1-0a5dc126d405/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:158 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ecdffd0c2fc8d747077d4ca5dcb541da82682f6d035455ac42566e8514bfadc3/userdata/shm DeviceMajor:0 DeviceMinor:332 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/ebfbe878-1796-4a20-b3f0-76165038252e/volumes/kubernetes.io~projected/kube-api-access-tncxt DeviceMajor:0 DeviceMinor:1092 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-191 DeviceMajor:0 DeviceMinor:191 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1576 DeviceMajor:0 DeviceMinor:1576 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1424 DeviceMajor:0 DeviceMinor:1424 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1200 DeviceMajor:0 DeviceMinor:1200 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1282 DeviceMajor:0 DeviceMinor:1282 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/570d4cae37b4f398ab8be13ab3899c325813f0073ace4d7fbe1d38d0fbd654b9/userdata/shm DeviceMajor:0 DeviceMinor:1434 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a5338041-f213-46ef-9d81-248567ba958d/volumes/kubernetes.io~secret/client-ca-bundle DeviceMajor:0 DeviceMinor:1544 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a757f807-e1bf-4f1e-9787-6b4acc8d09cf/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:156 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:692 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/3d96c85a-fc88-46af-83d5-6c71ec6e2c23/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:794 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/58187662-b502-4d90-95ce-2aa91a81d256/volumes/kubernetes.io~projected/kube-api-access-ps4ws DeviceMajor:0 DeviceMinor:312 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/9c31f89c-b01b-4853-a901-bccc25441a46/volumes/kubernetes.io~projected/kube-api-access-czcmr DeviceMajor:0 DeviceMinor:1097 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-96 
DeviceMajor:0 DeviceMinor:96 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/807d9093-aa67-4840-b5be-7f3abcc1beed/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:288 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/volumes/kubernetes.io~projected/kube-api-access-hfl8f DeviceMajor:0 DeviceMinor:450 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-654 DeviceMajor:0 DeviceMinor:654 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1114 DeviceMajor:0 DeviceMinor:1114 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-676 DeviceMajor:0 DeviceMinor:676 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-298 DeviceMajor:0 DeviceMinor:298 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-372 DeviceMajor:0 DeviceMinor:372 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1174 DeviceMajor:0 DeviceMinor:1174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a8ddc41afaf0c618d55e894f2ce13b792424c9105a66a883a048089812798f25/userdata/shm DeviceMajor:0 DeviceMinor:1635 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-110 DeviceMajor:0 DeviceMinor:110 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/480c1f6e-0e13-49f9-bc4e-07350842f16c/volumes/kubernetes.io~projected/kube-api-access-48ns8 DeviceMajor:0 DeviceMinor:453 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/665c4362-e2e5-4f96-92c0-1746c63c7422/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert DeviceMajor:0 DeviceMinor:705 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1133 DeviceMajor:0 DeviceMinor:1133 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-370 DeviceMajor:0 DeviceMinor:370 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1192 DeviceMajor:0 DeviceMinor:1192 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1215 DeviceMajor:0 DeviceMinor:1215 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fa27a4561538d102c835ff1b231e3510011f63fe691f54410ca3547822dc8742/userdata/shm DeviceMajor:0 DeviceMinor:1466 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/78eb0a378ee87ec426723278f27c3f8944db139eff4ee08e81e705d48c517d58/userdata/shm DeviceMajor:0 DeviceMinor:46 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/abe43915cc1089507c40de3eaceadf732ca7d07e2f0e1b5a070959328db4199f/userdata/shm DeviceMajor:0 DeviceMinor:980 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/dc5db54b-094f-4c36-a0ad-042e9fc2b61d/volumes/kubernetes.io~secret/certs DeviceMajor:0 DeviceMinor:1391 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102829056 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:577 Capacity:49335545856 
Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/04f451fea9668a794e9e554df0005ce70f405943bf1c6d084959d7f333152fc6/userdata/shm DeviceMajor:0 DeviceMinor:598 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-918 DeviceMajor:0 DeviceMinor:918 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-193 DeviceMajor:0 DeviceMinor:193 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b13885ef-d2b5-4591-825d-446cf8729bc1/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:1094 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1324 DeviceMajor:0 DeviceMinor:1324 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-466 DeviceMajor:0 DeviceMinor:466 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/807d9093-aa67-4840-b5be-7f3abcc1beed/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:297 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/3b741029-0eb5-409b-b7f1-95e8385dc400/volumes/kubernetes.io~projected/kube-api-access-5g7mj DeviceMajor:0 DeviceMinor:464 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-538 DeviceMajor:0 DeviceMinor:538 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1000 DeviceMajor:0 DeviceMinor:1000 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1037 DeviceMajor:0 DeviceMinor:1037 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dc5db54b-094f-4c36-a0ad-042e9fc2b61d/volumes/kubernetes.io~secret/node-bootstrap-token DeviceMajor:0 DeviceMinor:1390 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/5efad170-c154-42ec-a7c0-b36a98d2bfcc/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/bc18a83a-998e-458e-87f0-d5368da52e1b/volumes/kubernetes.io~projected/kube-api-access-bmjn7 DeviceMajor:0 DeviceMinor:798 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1562 DeviceMajor:0 DeviceMinor:1562 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-814 DeviceMajor:0 DeviceMinor:814 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/12b2377bacbd62ee93e11591af977d559716347304347ca9deca90451df150b7/userdata/shm DeviceMajor:0 DeviceMinor:976 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/db27bee9-3d33-4c4a-b38b-72f7cec77c7a/volumes/kubernetes.io~projected/kube-api-access-2nbxt DeviceMajor:0 DeviceMinor:1265 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1312 DeviceMajor:0 DeviceMinor:1312 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ed1704f4a6522faa5c439c3ffd85686d7bb1d3595d2d60cd653dbed071367134/userdata/shm DeviceMajor:0 DeviceMinor:164 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-665 DeviceMajor:0 DeviceMinor:665 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/1478a21e-b6ac-46fb-ad01-805ac71f0a79/volumes/kubernetes.io~projected/kube-api-access-fz4q6 DeviceMajor:0 DeviceMinor:1295 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/20a72c8b-0f12-446b-8a42-53d98864c8f8/volumes/kubernetes.io~projected/kube-api-access-6dwm5 DeviceMajor:0 DeviceMinor:1220 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1646 DeviceMajor:0 DeviceMinor:1646 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257074688 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-392 DeviceMajor:0 DeviceMinor:392 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1310 DeviceMajor:0 DeviceMinor:1310 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-521 DeviceMajor:0 DeviceMinor:521 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-206 DeviceMajor:0 DeviceMinor:206 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9cdc542e09a2b9f60d00a132f1101f8d7a3bb737b3bcc4086c2409ef17b05c7e/userdata/shm DeviceMajor:0 DeviceMinor:715 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-440 DeviceMajor:0 DeviceMinor:440 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/54d1c55b3ab43714c6f9d30fae64742364176327ec7be4503594ab7c679b2007/userdata/shm DeviceMajor:0 DeviceMinor:454 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-938 DeviceMajor:0 DeviceMinor:938 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-88 DeviceMajor:0 DeviceMinor:88 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5a89fdcb31a57b509eb73373840f305ff5d3039dc4adac822b9b40350179af76/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-397 DeviceMajor:0 DeviceMinor:397 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fb396b2885c697fc62cb75681d56dacee81e32f235fe9f427b2f065f721f39f2/userdata/shm DeviceMajor:0 DeviceMinor:1102 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-395 DeviceMajor:0 DeviceMinor:395 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1/volumes/kubernetes.io~projected/kube-api-access-x59kd DeviceMajor:0 DeviceMinor:141 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1110 DeviceMajor:0 DeviceMinor:1110 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-78 DeviceMajor:0 DeviceMinor:78 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/531b8927-92db-4e9d-9a0a-12ff948cdaad/volumes/kubernetes.io~projected/kube-api-access-xqblj DeviceMajor:0 DeviceMinor:953 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-362 DeviceMajor:0 DeviceMinor:362 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/var/lib/kubelet/pods/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb/volumes/kubernetes.io~projected/kube-api-access-dh58c DeviceMajor:0 DeviceMinor:585 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1403 DeviceMajor:0 DeviceMinor:1403 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1550 DeviceMajor:0 DeviceMinor:1550 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1615 DeviceMajor:0 DeviceMinor:1615 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-197 DeviceMajor:0 DeviceMinor:197 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-568 DeviceMajor:0 DeviceMinor:568 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b0475df4d5336da05f2cdbc3f74e49ad376be174c9b01bb8c74b713bd60e7ac6/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-170 DeviceMajor:0 DeviceMinor:170 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1468 DeviceMajor:0 DeviceMinor:1468 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/86aa525c2c153f5cbd8c5b3603c3c0fdcde107672a7bd7aeacc117267683bb33/userdata/shm DeviceMajor:0 DeviceMinor:1464 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a484ee5e7b41d00e01ba54d4ad8789422ba018cb058ac26feb10517be87018de/userdata/shm DeviceMajor:0 DeviceMinor:428 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-41 DeviceMajor:0 DeviceMinor:41 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/46252d0271f63a839c2cf8d137d190a08ca8c85ab8a7cd49fe478dd080504839/userdata/shm DeviceMajor:0 DeviceMinor:1098 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a45f340c-0eca-4460-8961-4ca360467eeb/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:925 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1248 DeviceMajor:0 DeviceMinor:1248 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1455 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-346 DeviceMajor:0 DeviceMinor:346 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-456 DeviceMajor:0 DeviceMinor:456 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/029b733e2c6ad9f0e336ec7c4af189bd8388fe0d1d5f30c3280c2f24f4c1e475/userdata/shm DeviceMajor:0 DeviceMinor:596 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/24b6227b14f227965d3702a28c5ff0f7f572415f72495d988769ab39d10c0d8f/userdata/shm DeviceMajor:0 DeviceMinor:916 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a5338041-f213-46ef-9d81-248567ba958d/volumes/kubernetes.io~secret/secret-metrics-server-tls DeviceMajor:0 DeviceMinor:1546 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/ae3644549c6caccb0e5b76cf093dd16f97c66829b7bc2c724be0d4328e24c56e/userdata/shm DeviceMajor:0 DeviceMinor:131 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-845 DeviceMajor:0 DeviceMinor:845 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/594aaded-5615-4bed-87ee-6173059a73be/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:306 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-377 DeviceMajor:0 DeviceMinor:377 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ce6d6f50d1ea16153d0bcd0e4641d90ef903c01636f33ef60f26b9dcbbaecad8/userdata/shm DeviceMajor:0 DeviceMinor:1101 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1240 DeviceMajor:0 DeviceMinor:1240 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1128 DeviceMajor:0 DeviceMinor:1128 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-566 DeviceMajor:0 DeviceMinor:566 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-628 DeviceMajor:0 DeviceMinor:628 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-364 DeviceMajor:0 DeviceMinor:364 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f119ffe4-16bd-49eb-916d-b18ba0d79b54/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:304 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-460 DeviceMajor:0 DeviceMinor:460 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1338 DeviceMajor:0 DeviceMinor:1338 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-444 DeviceMajor:0 DeviceMinor:444 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f3792522-fec6-4022-90ac-0b8467fcd625/volumes/kubernetes.io~projected/kube-api-access-flxbg DeviceMajor:0 DeviceMinor:301 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/7d0792bf-e2da-4ee7-91fe-032299cea42f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:988 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1640 DeviceMajor:0 DeviceMinor:1640 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b8233dad-bd19-4842-a4d5-cfa84f1feb83/volumes/kubernetes.io~projected/kube-api-access-mvbfq DeviceMajor:0 DeviceMinor:185 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1190 DeviceMajor:0 DeviceMinor:1190 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1299 DeviceMajor:0 DeviceMinor:1299 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7fe4976a702070d88ebc0b91a8c147521b2f0d81e1e2131e752211b96529d448/userdata/shm DeviceMajor:0 DeviceMinor:142 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/f4a70855-80b5-4d6a-bed1-b42364940de0/volumes/kubernetes.io~projected/kube-api-access-69z2l DeviceMajor:0 DeviceMinor:342 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-682 
DeviceMajor:0 DeviceMinor:682 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-507 DeviceMajor:0 DeviceMinor:507 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1676 DeviceMajor:0 DeviceMinor:1676 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-283 DeviceMajor:0 DeviceMinor:283 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-778 DeviceMajor:0 DeviceMinor:778 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-202 DeviceMajor:0 DeviceMinor:202 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-908 DeviceMajor:0 DeviceMinor:908 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-446 DeviceMajor:0 DeviceMinor:446 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1665 DeviceMajor:0 DeviceMinor:1665 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d72b2b71-27b2-4aff-bf69-7054a9556318/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:623 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/c39d2089-d3bf-4556-b6ef-c362a08c21a2/volumes/kubernetes.io~projected/kube-api-access-mr9jd DeviceMajor:0 DeviceMinor:98 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2/volumes/kubernetes.io~projected/kube-api-access-4bjs8 DeviceMajor:0 DeviceMinor:1458 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1330 DeviceMajor:0 DeviceMinor:1330 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-839 DeviceMajor:0 DeviceMinor:839 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-441 DeviceMajor:0 DeviceMinor:441 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f/volumes/kubernetes.io~projected/kube-api-access-b6wsq DeviceMajor:0 DeviceMinor:906 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-602 DeviceMajor:0 DeviceMinor:602 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a757f807-e1bf-4f1e-9787-6b4acc8d09cf/volumes/kubernetes.io~projected/kube-api-access-9z8h9 DeviceMajor:0 DeviceMinor:157 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:010fcb3fd705e5d MacAddress:f6:2a:d7:28:b4:9f Speed:10000 Mtu:8900} {Name:029b733e2c6ad9f MacAddress:9a:88:b3:87:38:19 Speed:10000 Mtu:8900} {Name:04a1540e033fc0d MacAddress:c2:bf:d1:59:98:00 Speed:10000 Mtu:8900} {Name:04f451fea9668a7 MacAddress:0e:0d:7c:53:78:99 Speed:10000 Mtu:8900} {Name:065b5ff0754f03a MacAddress:e2:24:61:65:81:09 Speed:10000 Mtu:8900} {Name:07bd9adb3dd2a54 MacAddress:56:e7:f3:c4:10:84 Speed:10000 Mtu:8900} {Name:0b836f01dcb43b6 MacAddress:f6:eb:49:58:84:98 Speed:10000 Mtu:8900} {Name:0b9e8ef8efad8c6 MacAddress:d2:1a:0f:27:82:16 Speed:10000 Mtu:8900} {Name:0f8a1e4d8de6a06 MacAddress:1e:5b:1f:a4:c3:4b 
Speed:10000 Mtu:8900} {Name:1234ab8fb98aae2 MacAddress:0e:e8:a1:19:45:47 Speed:10000 Mtu:8900} {Name:12b2377bacbd62e MacAddress:3e:c2:f3:16:50:8f Speed:10000 Mtu:8900} {Name:12c707b6a686095 MacAddress:1a:2e:0b:3a:2b:cf Speed:10000 Mtu:8900} {Name:130205999d123cc MacAddress:76:6d:d6:16:da:be Speed:10000 Mtu:8900} {Name:19edfec7b5dad95 MacAddress:f6:05:b4:ed:05:3b Speed:10000 Mtu:8900} {Name:1e7e859b537def1 MacAddress:0e:18:9a:95:8d:e1 Speed:10000 Mtu:8900} {Name:24b6227b14f2279 MacAddress:a2:16:0c:8b:e1:fb Speed:10000 Mtu:8900} {Name:2a325da0f7b2c28 MacAddress:2e:8f:82:c2:9f:83 Speed:10000 Mtu:8900} {Name:3d66257a9a5cc16 MacAddress:ae:64:c3:cd:c0:34 Speed:10000 Mtu:8900} {Name:44e741be030df14 MacAddress:72:9d:94:0f:f7:9d Speed:10000 Mtu:8900} {Name:46252d0271f63a8 MacAddress:e2:1d:23:55:a2:a4 Speed:10000 Mtu:8900} {Name:47731386c0cb9aa MacAddress:16:17:6f:15:8e:f5 Speed:10000 Mtu:8900} {Name:49f2f301b501743 MacAddress:b2:55:6c:0e:58:85 Speed:10000 Mtu:8900} {Name:4ed24c6b6f900a1 MacAddress:2e:fe:b7:62:47:b6 Speed:10000 Mtu:8900} {Name:5404e1e33c358f1 MacAddress:92:2a:2d:2f:68:be Speed:10000 Mtu:8900} {Name:54d1c55b3ab4371 MacAddress:0e:d9:58:d3:77:2b Speed:10000 Mtu:8900} {Name:570d4cae37b4f39 MacAddress:66:a9:ae:41:69:31 Speed:10000 Mtu:8900} {Name:6a7ef281a34ccfa MacAddress:8e:c4:2f:da:35:96 Speed:10000 Mtu:8900} {Name:6d7e84b5ce96cc7 MacAddress:76:ac:27:f2:08:72 Speed:10000 Mtu:8900} {Name:7306701b7f1e349 MacAddress:ba:5b:71:c1:99:6d Speed:10000 Mtu:8900} {Name:743ece8bb6e4040 MacAddress:b2:00:c0:bc:4c:4d Speed:10000 Mtu:8900} {Name:77da36c6bf5d09d MacAddress:7a:4e:48:e4:ad:b0 Speed:10000 Mtu:8900} {Name:86aa525c2c153f5 MacAddress:6a:59:8d:ff:9e:e8 Speed:10000 Mtu:8900} {Name:92eddccae7e06f0 MacAddress:e6:ce:46:67:e2:b4 Speed:10000 Mtu:8900} {Name:9a083a2de33da77 MacAddress:26:7f:02:e6:8d:89 Speed:10000 Mtu:8900} {Name:9abf289d98169b2 MacAddress:5e:e5:54:9e:9d:f9 Speed:10000 Mtu:8900} {Name:9ca3179bcac9021 MacAddress:2a:e4:53:e5:4d:a5 Speed:10000 Mtu:8900} {Name:9cdc542e09a2b9f MacAddress:7e:82:f3:96:72:b2 Speed:10000 Mtu:8900} {Name:9cfdae6ccb167d4 MacAddress:f6:7d:b6:c4:4b:38 Speed:10000 Mtu:8900} {Name:9f1e76d4f58fcd2 MacAddress:36:91:c8:0f:10:e4 Speed:10000 Mtu:8900} {Name:9fd6db41eb8dc90 MacAddress:92:e8:6e:43:d8:9f Speed:10000 Mtu:8900} {Name:a8ddc41afaf0c61 MacAddress:6e:37:24:f2:12:a1 Speed:10000 Mtu:8900} {Name:abe43915cc10895 MacAddress:fa:5f:a0:14:10:05 Speed:10000 Mtu:8900} {Name:ae3644549c6cacc MacAddress:2e:e7:ba:66:48:ce Speed:10000 Mtu:8900} {Name:b36190e4cf6d5a6 MacAddress:ce:9c:3a:9d:42:61 Speed:10000 Mtu:8900} {Name:bba3aa271baddd9 MacAddress:3a:66:f2:3c:04:f8 Speed:10000 Mtu:8900} {Name:bd884dd8fbf0cb1 MacAddress:9e:e5:96:6e:be:b4 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:b2:49:0d:80:cf:b8 Speed:0 Mtu:8900} {Name:c44264ca51ad61e MacAddress:a2:0b:25:6c:67:a7 Speed:10000 Mtu:8900} {Name:c5997a9e57f3684 MacAddress:c2:93:91:89:48:8c Speed:10000 Mtu:8900} {Name:c9238078b14a694 MacAddress:7e:5d:7f:f4:3d:54 Speed:10000 Mtu:8900} {Name:ce6d6f50d1ea161 MacAddress:8e:a3:38:39:22:03 Speed:10000 Mtu:8900} {Name:dac2262b7105102 MacAddress:0a:8c:a8:e0:e8:4c Speed:10000 Mtu:8900} {Name:df3031001bb8ce6 MacAddress:9e:ed:54:fe:dd:08 Speed:10000 Mtu:8900} {Name:e67f95f822c645d MacAddress:fa:2b:2b:b1:30:fa Speed:10000 Mtu:8900} {Name:ecdffd0c2fc8d74 MacAddress:22:c6:19:c7:5f:68 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:27:b3:a6 Speed:-1 Mtu:9000} {Name:eth2 
MacAddress:fa:16:3e:d3:8e:e6 Speed:-1 Mtu:9000} {Name:fafe50d6690c2fb MacAddress:a2:3b:d4:b0:83:14 Speed:10000 Mtu:8900} {Name:fb396b2885c697f MacAddress:96:15:22:5b:3a:17 Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:8a:28:cb:43:ed:9b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514145280 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data 
Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 05 12:50:03.095605 master-0 kubenswrapper[29936]: I1205 12:50:03.094783 29936 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 05 12:50:03.095605 master-0 kubenswrapper[29936]: I1205 12:50:03.094850 29936 manager.go:233] Version: {KernelVersion:5.14.0-427.100.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202511170715-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 05 12:50:03.095605 master-0 kubenswrapper[29936]: I1205 12:50:03.095080 29936 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 05 12:50:03.095605 master-0 kubenswrapper[29936]: I1205 12:50:03.095224 29936 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 05 12:50:03.095605 master-0 kubenswrapper[29936]: I1205 12:50:03.095255 29936 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 05 12:50:03.095605 master-0 kubenswrapper[29936]: I1205 12:50:03.095432 29936 topology_manager.go:138] "Creating topology manager with none policy" Dec 05 
12:50:03.095605 master-0 kubenswrapper[29936]: I1205 12:50:03.095441 29936 container_manager_linux.go:303] "Creating device plugin manager" Dec 05 12:50:03.095605 master-0 kubenswrapper[29936]: I1205 12:50:03.095449 29936 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 05 12:50:03.095605 master-0 kubenswrapper[29936]: I1205 12:50:03.095470 29936 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 05 12:50:03.096346 master-0 kubenswrapper[29936]: I1205 12:50:03.095706 29936 state_mem.go:36] "Initialized new in-memory state store" Dec 05 12:50:03.096346 master-0 kubenswrapper[29936]: I1205 12:50:03.095794 29936 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 05 12:50:03.096346 master-0 kubenswrapper[29936]: I1205 12:50:03.095859 29936 kubelet.go:418] "Attempting to sync node with API server" Dec 05 12:50:03.096346 master-0 kubenswrapper[29936]: I1205 12:50:03.095870 29936 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 05 12:50:03.096346 master-0 kubenswrapper[29936]: I1205 12:50:03.095888 29936 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 05 12:50:03.096346 master-0 kubenswrapper[29936]: I1205 12:50:03.095898 29936 kubelet.go:324] "Adding apiserver pod source" Dec 05 12:50:03.096346 master-0 kubenswrapper[29936]: I1205 12:50:03.095915 29936 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 05 12:50:03.098293 master-0 kubenswrapper[29936]: I1205 12:50:03.097951 29936 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-2.rhaos4.18.git15789b8.el9" apiVersion="v1" Dec 05 12:50:03.098293 master-0 kubenswrapper[29936]: I1205 12:50:03.098132 29936 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
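[Editor's note] The container_manager_linux.go:272 entry above logs the full node configuration as a JSON payload (nodeConfig={...}), including SystemReserved and the HardEvictionThresholds that drive node-pressure eviction. Below is a minimal sketch for pulling that payload out of a journal capture; the script name is hypothetical, and it assumes the kubelet unit is named "kubelet" and that the payload is valid JSON with no braces inside quoted string values (which holds for the payload shown here).

```python
#!/usr/bin/env python3
"""Hypothetical helper (not part of the kubelet): extract the nodeConfig JSON from the
"Creating Container Manager object based on Node Config" startup line and print the
reserved resources and hard-eviction thresholds it carries."""
import json
import sys

MARKER = "nodeConfig="


def extract_node_config(line):
    """Return the decoded nodeConfig={...} payload of one journal line, or None."""
    start = line.find(MARKER)
    if start < 0:
        return None
    start += len(MARKER)
    depth = 0
    for i, ch in enumerate(line[start:], start):
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
            if depth == 0:
                # Assumes no braces inside quoted strings, which holds for this payload.
                return json.loads(line[start:i + 1])
    return None  # payload truncated or wrapped mid-record


if __name__ == "__main__":
    # Example usage (unit name assumed):
    #   journalctl -u kubelet -o cat | python3 node_config_dump.py
    for raw in sys.stdin:
        cfg = extract_node_config(raw)
        if cfg is None:
            continue
        print("SystemReserved:", cfg.get("SystemReserved"))
        for t in cfg.get("HardEvictionThresholds") or []:
            print("evict when", t.get("Signal"), t.get("Operator"), t.get("Value"))
```

For the line captured above this would report SystemReserved of cpu 500m, memory 1Gi and ephemeral-storage 1Gi, plus the five hard-eviction signals (memory.available, nodefs.available, nodefs.inodesFree, imagefs.available, imagefs.inodesFree).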
Dec 05 12:50:03.106381 master-0 kubenswrapper[29936]: I1205 12:50:03.106275 29936 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 05 12:50:03.106819 master-0 kubenswrapper[29936]: I1205 12:50:03.106762 29936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 05 12:50:03.106868 master-0 kubenswrapper[29936]: I1205 12:50:03.106837 29936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 05 12:50:03.106868 master-0 kubenswrapper[29936]: I1205 12:50:03.106848 29936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 05 12:50:03.106868 master-0 kubenswrapper[29936]: I1205 12:50:03.106855 29936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 05 12:50:03.106868 master-0 kubenswrapper[29936]: I1205 12:50:03.106862 29936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 05 12:50:03.106868 master-0 kubenswrapper[29936]: I1205 12:50:03.106870 29936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 05 12:50:03.107065 master-0 kubenswrapper[29936]: I1205 12:50:03.106878 29936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 05 12:50:03.107065 master-0 kubenswrapper[29936]: I1205 12:50:03.106885 29936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 05 12:50:03.107065 master-0 kubenswrapper[29936]: I1205 12:50:03.106895 29936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 05 12:50:03.107065 master-0 kubenswrapper[29936]: I1205 12:50:03.106909 29936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 05 12:50:03.107065 master-0 kubenswrapper[29936]: I1205 12:50:03.106931 29936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 05 12:50:03.107065 master-0 kubenswrapper[29936]: I1205 12:50:03.106945 29936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 05 12:50:03.107065 master-0 kubenswrapper[29936]: I1205 12:50:03.106980 29936 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 05 12:50:03.108590 master-0 kubenswrapper[29936]: I1205 12:50:03.108547 29936 server.go:1280] "Started kubelet" Dec 05 12:50:03.109766 master-0 systemd[1]: Started Kubernetes Kubelet. 
Dec 05 12:50:03.110028 master-0 kubenswrapper[29936]: I1205 12:50:03.109823 29936 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 05 12:50:03.110028 master-0 kubenswrapper[29936]: I1205 12:50:03.109861 29936 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 05 12:50:03.110028 master-0 kubenswrapper[29936]: I1205 12:50:03.109928 29936 server_v1.go:47] "podresources" method="list" useActivePods=true Dec 05 12:50:03.110673 master-0 kubenswrapper[29936]: I1205 12:50:03.110610 29936 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 05 12:50:03.113619 master-0 kubenswrapper[29936]: I1205 12:50:03.111838 29936 server.go:449] "Adding debug handlers to kubelet server" Dec 05 12:50:03.113619 master-0 kubenswrapper[29936]: I1205 12:50:03.113208 29936 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 12:50:03.114856 master-0 kubenswrapper[29936]: I1205 12:50:03.114763 29936 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 12:50:03.139998 master-0 kubenswrapper[29936]: I1205 12:50:03.139909 29936 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 05 12:50:03.140243 master-0 kubenswrapper[29936]: I1205 12:50:03.140043 29936 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 05 12:50:03.141655 master-0 kubenswrapper[29936]: I1205 12:50:03.141462 29936 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-12-06 12:20:41 +0000 UTC, rotation deadline is 2025-12-06 09:49:46.342641603 +0000 UTC Dec 05 12:50:03.141655 master-0 kubenswrapper[29936]: I1205 12:50:03.141641 29936 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 20h59m43.201004949s for next certificate rotation Dec 05 12:50:03.147959 master-0 kubenswrapper[29936]: I1205 12:50:03.147905 29936 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 05 12:50:03.147959 master-0 kubenswrapper[29936]: I1205 12:50:03.147935 29936 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 05 12:50:03.148643 master-0 kubenswrapper[29936]: I1205 12:50:03.148609 29936 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Dec 05 12:50:03.156114 master-0 kubenswrapper[29936]: I1205 12:50:03.155648 29936 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157321 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1ee7a76b-cf1d-4513-b314-5aa314da818d" volumeName="kubernetes.io/secret/1ee7a76b-cf1d-4513-b314-5aa314da818d-machine-api-operator-tls" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157377 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="38941513-e968-45f1-9cb2-b63d40338f36" volumeName="kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-bound-sa-token" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157390 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d72b2b71-27b2-4aff-bf69-7054a9556318" 
volumeName="kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-audit-policies" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157400 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d72b2b71-27b2-4aff-bf69-7054a9556318" volumeName="kubernetes.io/projected/d72b2b71-27b2-4aff-bf69-7054a9556318-kube-api-access-wjp62" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157409 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4b7f0d8d-a2bf-4550-b6e6-1c56adae827e" volumeName="kubernetes.io/projected/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-kube-api-access-xtjln" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157418 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="60327040-f782-4cda-a32d-52a4f183073c" volumeName="kubernetes.io/configmap/60327040-f782-4cda-a32d-52a4f183073c-metrics-client-ca" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157429 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dbe144b5-3b78-4946-bbf9-b825b0e47b07" volumeName="kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-auth-proxy-config" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157438 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="153fec1f-a10b-4c6c-a997-60fa80c13a86" volumeName="kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-kube-api-access-dr2r9" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157448 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="60327040-f782-4cda-a32d-52a4f183073c" volumeName="kubernetes.io/empty-dir/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-textfile" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157457 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a5338041-f213-46ef-9d81-248567ba958d" volumeName="kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-metrics-server-audit-profiles" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157465 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a5338041-f213-46ef-9d81-248567ba958d" volumeName="kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-server-tls" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157479 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a757f807-e1bf-4f1e-9787-6b4acc8d09cf" volumeName="kubernetes.io/secret/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157490 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6" volumeName="kubernetes.io/configmap/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-iptables-alerter-script" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 
kubenswrapper[29936]: I1205 12:50:03.157527 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="38941513-e968-45f1-9cb2-b63d40338f36" volumeName="kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157542 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="95d8fb27-8b2b-4749-add3-9e9b16edb693" volumeName="kubernetes.io/secret/95d8fb27-8b2b-4749-add3-9e9b16edb693-proxy-tls" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157551 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db27bee9-3d33-4c4a-b38b-72f7cec77c7a" volumeName="kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-auth-proxy-config" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157559 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4e9ba71a-d1b5-4986-babe-2c15c19f9cc2" volumeName="kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-kube-rbac-proxy-config" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157576 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf8247a1-703a-46b3-9a33-25a73b27ab99" volumeName="kubernetes.io/configmap/cf8247a1-703a-46b3-9a33-25a73b27ab99-signing-cabundle" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157585 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb" volumeName="kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157595 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20a72c8b-0f12-446b-8a42-53d98864c8f8" volumeName="kubernetes.io/projected/20a72c8b-0f12-446b-8a42-53d98864c8f8-kube-api-access-6dwm5" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157605 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a45f340c-0eca-4460-8961-4ca360467eeb" volumeName="kubernetes.io/empty-dir/a45f340c-0eca-4460-8961-4ca360467eeb-available-featuregates" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157615 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc18a83a-998e-458e-87f0-d5368da52e1b" volumeName="kubernetes.io/projected/bc18a83a-998e-458e-87f0-d5368da52e1b-kube-api-access-bmjn7" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157624 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dbe144b5-3b78-4946-bbf9-b825b0e47b07" volumeName="kubernetes.io/projected/dbe144b5-3b78-4946-bbf9-b825b0e47b07-kube-api-access-mbg7w" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157634 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ebfbe878-1796-4a20-b3f0-76165038252e" volumeName="kubernetes.io/empty-dir/ebfbe878-1796-4a20-b3f0-76165038252e-catalog-content" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157643 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="153fec1f-a10b-4c6c-a997-60fa80c13a86" volumeName="kubernetes.io/empty-dir/153fec1f-a10b-4c6c-a997-60fa80c13a86-cache" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157657 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b741029-0eb5-409b-b7f1-95e8385dc400" volumeName="kubernetes.io/projected/3b741029-0eb5-409b-b7f1-95e8385dc400-kube-api-access-5g7mj" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157669 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4e9ba71a-d1b5-4986-babe-2c15c19f9cc2" volumeName="kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-tls" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157680 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a14df948-1ec4-4785-ad33-28d1e7063959" volumeName="kubernetes.io/configmap/a14df948-1ec4-4785-ad33-28d1e7063959-trusted-ca-bundle" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157690 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f3792522-fec6-4022-90ac-0b8467fcd625" volumeName="kubernetes.io/configmap/f3792522-fec6-4022-90ac-0b8467fcd625-config" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157699 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1478a21e-b6ac-46fb-ad01-805ac71f0a79" volumeName="kubernetes.io/secret/1478a21e-b6ac-46fb-ad01-805ac71f0a79-proxy-tls" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157709 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="60327040-f782-4cda-a32d-52a4f183073c" volumeName="kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-tls" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157718 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="708bf629-9949-4b79-a88a-c73ba033475b" volumeName="kubernetes.io/projected/708bf629-9949-4b79-a88a-c73ba033475b-kube-api-access-6vx2z" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157728 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb" volumeName="kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-image-import-ca" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157740 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1478a21e-b6ac-46fb-ad01-805ac71f0a79" volumeName="kubernetes.io/configmap/1478a21e-b6ac-46fb-ad01-805ac71f0a79-mcc-auth-proxy-config" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 
12:50:03.157750 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="665c4362-e2e5-4f96-92c0-1746c63c7422" volumeName="kubernetes.io/configmap/665c4362-e2e5-4f96-92c0-1746c63c7422-cco-trusted-ca" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157761 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="95d8fb27-8b2b-4749-add3-9e9b16edb693" volumeName="kubernetes.io/projected/95d8fb27-8b2b-4749-add3-9e9b16edb693-kube-api-access-fb42t" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157771 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dc5db54b-094f-4c36-a0ad-042e9fc2b61d" volumeName="kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-node-bootstrap-token" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157781 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a14df948-1ec4-4785-ad33-28d1e7063959" volumeName="kubernetes.io/secret/a14df948-1ec4-4785-ad33-28d1e7063959-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157791 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba095394-1873-4793-969d-3be979fa0771" volumeName="kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-trusted-ca-bundle" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157800 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb7003a6-4341-49eb-bec3-76ba8610fa12" volumeName="kubernetes.io/projected/fb7003a6-4341-49eb-bec3-76ba8610fa12-kube-api-access-69n5s" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157812 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e943438b-1de8-435c-8a19-accd6a6292a4" volumeName="kubernetes.io/secret/e943438b-1de8-435c-8a19-accd6a6292a4-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157824 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb" volumeName="kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-encryption-config" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157834 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d0792bf-e2da-4ee7-91fe-032299cea42f" volumeName="kubernetes.io/secret/7d0792bf-e2da-4ee7-91fe-032299cea42f-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157843 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a5338041-f213-46ef-9d81-248567ba958d" volumeName="kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157855 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a757f807-e1bf-4f1e-9787-6b4acc8d09cf" 
volumeName="kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovnkube-config" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157864 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0dda6d9b-cb3a-413a-85af-ef08f15ea42e" volumeName="kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157874 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b741029-0eb5-409b-b7f1-95e8385dc400" volumeName="kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157883 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="60327040-f782-4cda-a32d-52a4f183073c" volumeName="kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-kube-rbac-proxy-config" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157893 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="807d9093-aa67-4840-b5be-7f3abcc1beed" volumeName="kubernetes.io/projected/807d9093-aa67-4840-b5be-7f3abcc1beed-kube-api-access" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157902 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb" volumeName="kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-trusted-ca-bundle" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157911 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb7003a6-4341-49eb-bec3-76ba8610fa12" volumeName="kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157920 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="365bf663-fd5b-44df-a327-0438995c015d" volumeName="kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-images" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157935 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76" volumeName="kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-tls" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157946 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b13885ef-d2b5-4591-825d-446cf8729bc1" volumeName="kubernetes.io/empty-dir/b13885ef-d2b5-4591-825d-446cf8729bc1-tmpfs" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157955 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4492c55f-701b-4ec8-ada1-0a5dc126d405" volumeName="kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-script-lib" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157966 29936 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9c31f89c-b01b-4853-a901-bccc25441a46" volumeName="kubernetes.io/empty-dir/9c31f89c-b01b-4853-a901-bccc25441a46-utilities" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157975 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d53a4886-db25-43a1-825a-66a9a9a58590" volumeName="kubernetes.io/projected/d53a4886-db25-43a1-825a-66a9a9a58590-kube-api-access-2tngh" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157984 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db27bee9-3d33-4c4a-b38b-72f7cec77c7a" volumeName="kubernetes.io/secret/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-machine-approver-tls" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.157994 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49760d62-02e5-4882-b47f-663102b04946" volumeName="kubernetes.io/projected/49760d62-02e5-4882-b47f-663102b04946-kube-api-access-26x2z" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.158003 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba095394-1873-4793-969d-3be979fa0771" volumeName="kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-config" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.158012 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf8247a1-703a-46b3-9a33-25a73b27ab99" volumeName="kubernetes.io/projected/cf8247a1-703a-46b3-9a33-25a73b27ab99-kube-api-access-fqdxl" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.158021 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb" volumeName="kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-etcd-client" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.158029 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2e2d968-9946-4711-aaf0-3e3a03bff415" volumeName="kubernetes.io/projected/e2e2d968-9946-4711-aaf0-3e3a03bff415-kube-api-access-pxwwh" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.158037 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb" volumeName="kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-config" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.158045 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1e6babfe-724a-4eab-bb3b-bc318bf57b70" volumeName="kubernetes.io/projected/1e6babfe-724a-4eab-bb3b-bc318bf57b70-kube-api-access-c2gd8" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.158054 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20a72c8b-0f12-446b-8a42-53d98864c8f8" 
volumeName="kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-default-certificate" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.158062 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="365bf663-fd5b-44df-a327-0438995c015d" volumeName="kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-auth-proxy-config" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.158072 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f" volumeName="kubernetes.io/secret/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f-samples-operator-tls" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.158081 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba095394-1873-4793-969d-3be979fa0771" volumeName="kubernetes.io/secret/ba095394-1873-4793-969d-3be979fa0771-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.158141 master-0 kubenswrapper[29936]: I1205 12:50:03.158090 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce9e2a6b-8ce7-477c-8bc7-24033243eabe" volumeName="kubernetes.io/configmap/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-config-volume" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158102 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a280c582-685e-47ac-bf6b-248aa0c129a9" volumeName="kubernetes.io/projected/a280c582-685e-47ac-bf6b-248aa0c129a9-kube-api-access-xkqq7" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158323 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a2acba71-b9dc-4b85-be35-c995b8be2f19" volumeName="kubernetes.io/configmap/a2acba71-b9dc-4b85-be35-c995b8be2f19-trusted-ca" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158333 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8233dad-bd19-4842-a4d5-cfa84f1feb83" volumeName="kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-ovnkube-identity-cm" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158342 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c39d2089-d3bf-4556-b6ef-c362a08c21a2" volumeName="kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-client-ca" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158353 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dc5db54b-094f-4c36-a0ad-042e9fc2b61d" volumeName="kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-certs" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158365 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76" volumeName="kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-kube-rbac-proxy-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158374 29936 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5efad170-c154-42ec-a7c0-b36a98d2bfcc" volumeName="kubernetes.io/secret/5efad170-c154-42ec-a7c0-b36a98d2bfcc-metrics-tls" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158385 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d0792bf-e2da-4ee7-91fe-032299cea42f" volumeName="kubernetes.io/configmap/7d0792bf-e2da-4ee7-91fe-032299cea42f-service-ca" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158396 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce3d73c1-f4bd-4c91-936a-086dfa5e3460" volumeName="kubernetes.io/projected/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-kube-api-access-ph9w6" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158406 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="708bf629-9949-4b79-a88a-c73ba033475b" volumeName="kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-binary-copy" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158419 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8defe125-1529-4091-adff-e9d17a2b298f" volumeName="kubernetes.io/empty-dir/8defe125-1529-4091-adff-e9d17a2b298f-utilities" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158427 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a2acba71-b9dc-4b85-be35-c995b8be2f19" volumeName="kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158436 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb" volumeName="kubernetes.io/projected/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-kube-api-access-dh58c" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158447 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3d96c85a-fc88-46af-83d5-6c71ec6e2c23" volumeName="kubernetes.io/projected/3d96c85a-fc88-46af-83d5-6c71ec6e2c23-kube-api-access-ss5kh" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158457 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4492c55f-701b-4ec8-ada1-0a5dc126d405" volumeName="kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-env-overrides" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158476 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76" volumeName="kubernetes.io/projected/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-kube-api-access-nqvfm" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158485 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a280c582-685e-47ac-bf6b-248aa0c129a9" volumeName="kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cert" 
seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158496 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d72b2b71-27b2-4aff-bf69-7054a9556318" volumeName="kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-etcd-client" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158505 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58187662-b502-4d90-95ce-2aa91a81d256" volumeName="kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158514 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9" volumeName="kubernetes.io/secret/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158525 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8233dad-bd19-4842-a4d5-cfa84f1feb83" volumeName="kubernetes.io/secret/b8233dad-bd19-4842-a4d5-cfa84f1feb83-webhook-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158534 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1ee7a76b-cf1d-4513-b314-5aa314da818d" volumeName="kubernetes.io/projected/1ee7a76b-cf1d-4513-b314-5aa314da818d-kube-api-access-lkdtr" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158544 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76" volumeName="kubernetes.io/configmap/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-metrics-client-ca" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158554 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f0c6889-0739-48a3-99cd-6db9d1f83242" volumeName="kubernetes.io/projected/5f0c6889-0739-48a3-99cd-6db9d1f83242-kube-api-access-p5p5d" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158564 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="807d9093-aa67-4840-b5be-7f3abcc1beed" volumeName="kubernetes.io/configmap/807d9093-aa67-4840-b5be-7f3abcc1beed-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158574 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d72b2b71-27b2-4aff-bf69-7054a9556318" volumeName="kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-encryption-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158583 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1871a9d6-6369-4d08-816f-9c6310b61ddf" volumeName="kubernetes.io/projected/1871a9d6-6369-4d08-816f-9c6310b61ddf-kube-api-access" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158593 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="594aaded-5615-4bed-87ee-6173059a73be" volumeName="kubernetes.io/projected/594aaded-5615-4bed-87ee-6173059a73be-kube-api-access" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158603 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" volumeName="kubernetes.io/configmap/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-trusted-ca" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158612 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dc5db54b-094f-4c36-a0ad-042e9fc2b61d" volumeName="kubernetes.io/projected/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-kube-api-access-tjkjz" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158630 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f119ffe4-16bd-49eb-916d-b18ba0d79b54" volumeName="kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158639 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4d215811-6210-4ec2-8356-f1533dc43f65" volumeName="kubernetes.io/projected/4d215811-6210-4ec2-8356-f1533dc43f65-kube-api-access" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158648 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="594aaded-5615-4bed-87ee-6173059a73be" volumeName="kubernetes.io/configmap/594aaded-5615-4bed-87ee-6173059a73be-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158658 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9" volumeName="kubernetes.io/projected/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-kube-api-access-dtvzs" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158673 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" volumeName="kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-bound-sa-token" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158683 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8233dad-bd19-4842-a4d5-cfa84f1feb83" volumeName="kubernetes.io/projected/b8233dad-bd19-4842-a4d5-cfa84f1feb83-kube-api-access-mvbfq" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158693 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dbe144b5-3b78-4946-bbf9-b825b0e47b07" volumeName="kubernetes.io/secret/dbe144b5-3b78-4946-bbf9-b825b0e47b07-cloud-controller-manager-operator-tls" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158704 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f2635f9f-219b-4d03-b5b3-496c0c836fae" volumeName="kubernetes.io/projected/f2635f9f-219b-4d03-b5b3-496c0c836fae-kube-api-access-fwrwm" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 
kubenswrapper[29936]: I1205 12:50:03.158713 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1e6babfe-724a-4eab-bb3b-bc318bf57b70" volumeName="kubernetes.io/configmap/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-trusted-ca" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158723 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9" volumeName="kubernetes.io/configmap/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158732 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c39d2089-d3bf-4556-b6ef-c362a08c21a2" volumeName="kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158741 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6" volumeName="kubernetes.io/projected/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-kube-api-access-mlnqb" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158749 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b741029-0eb5-409b-b7f1-95e8385dc400" volumeName="kubernetes.io/projected/3b741029-0eb5-409b-b7f1-95e8385dc400-ca-certs" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158759 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4e9ba71a-d1b5-4986-babe-2c15c19f9cc2" volumeName="kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-custom-resource-state-configmap" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158768 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a5338041-f213-46ef-9d81-248567ba958d" volumeName="kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-configmap-kubelet-serving-ca-bundle" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158776 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f4a70855-80b5-4d6a-bed1-b42364940de0" volumeName="kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158784 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f" volumeName="kubernetes.io/projected/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f-kube-api-access-b6wsq" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158794 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a14df948-1ec4-4785-ad33-28d1e7063959" volumeName="kubernetes.io/projected/a14df948-1ec4-4785-ad33-28d1e7063959-kube-api-access-2g7n7" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158804 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a280c582-685e-47ac-bf6b-248aa0c129a9" volumeName="kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158812 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b13885ef-d2b5-4591-825d-446cf8729bc1" volumeName="kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-webhook-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158821 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce9e2a6b-8ce7-477c-8bc7-24033243eabe" volumeName="kubernetes.io/projected/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-kube-api-access-422c9" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158830 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e943438b-1de8-435c-8a19-accd6a6292a4" volumeName="kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-client-ca" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158840 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20a72c8b-0f12-446b-8a42-53d98864c8f8" volumeName="kubernetes.io/configmap/20a72c8b-0f12-446b-8a42-53d98864c8f8-service-ca-bundle" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158849 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="38941513-e968-45f1-9cb2-b63d40338f36" volumeName="kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-kube-api-access-t5hdg" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158857 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="665c4362-e2e5-4f96-92c0-1746c63c7422" volumeName="kubernetes.io/projected/665c4362-e2e5-4f96-92c0-1746c63c7422-kube-api-access-lckv7" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158865 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="708bf629-9949-4b79-a88a-c73ba033475b" volumeName="kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-whereabouts-configmap" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158875 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a757f807-e1bf-4f1e-9787-6b4acc8d09cf" volumeName="kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-env-overrides" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158883 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b74e0607-6ed0-4119-8870-895b7d336830" volumeName="kubernetes.io/projected/b74e0607-6ed0-4119-8870-895b7d336830-kube-api-access-72wst" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158892 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d3e283fe-a474-4f83-ad66-62971945060a" volumeName="kubernetes.io/secret/d3e283fe-a474-4f83-ad66-62971945060a-webhook-certs" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 
12:50:03.158901 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb" volumeName="kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-audit" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158910 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f119ffe4-16bd-49eb-916d-b18ba0d79b54" volumeName="kubernetes.io/projected/f119ffe4-16bd-49eb-916d-b18ba0d79b54-kube-api-access-wwcr9" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158918 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5efad170-c154-42ec-a7c0-b36a98d2bfcc" volumeName="kubernetes.io/projected/5efad170-c154-42ec-a7c0-b36a98d2bfcc-kube-api-access-996h9" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158928 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f0c6889-0739-48a3-99cd-6db9d1f83242" volumeName="kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158936 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="665c4362-e2e5-4f96-92c0-1746c63c7422" volumeName="kubernetes.io/secret/665c4362-e2e5-4f96-92c0-1746c63c7422-cloud-credential-operator-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158945 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="807d9093-aa67-4840-b5be-7f3abcc1beed" volumeName="kubernetes.io/secret/807d9093-aa67-4840-b5be-7f3abcc1beed-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158955 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="95d8fb27-8b2b-4749-add3-9e9b16edb693" volumeName="kubernetes.io/configmap/95d8fb27-8b2b-4749-add3-9e9b16edb693-mcd-auth-proxy-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158964 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a14df948-1ec4-4785-ad33-28d1e7063959" volumeName="kubernetes.io/empty-dir/a14df948-1ec4-4785-ad33-28d1e7063959-snapshots" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158973 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db2e54b6-4879-40f4-9359-a8b0c31e76c2" volumeName="kubernetes.io/secret/db2e54b6-4879-40f4-9359-a8b0c31e76c2-srv-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158983 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ebfbe878-1796-4a20-b3f0-76165038252e" volumeName="kubernetes.io/empty-dir/ebfbe878-1796-4a20-b3f0-76165038252e-utilities" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.158992 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3d96c85a-fc88-46af-83d5-6c71ec6e2c23" 
volumeName="kubernetes.io/secret/3d96c85a-fc88-46af-83d5-6c71ec6e2c23-cluster-storage-operator-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159001 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a280c582-685e-47ac-bf6b-248aa0c129a9" volumeName="kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-images" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159010 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db2e54b6-4879-40f4-9359-a8b0c31e76c2" volumeName="kubernetes.io/projected/db2e54b6-4879-40f4-9359-a8b0c31e76c2-kube-api-access-nwz29" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159019 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb" volumeName="kubernetes.io/secret/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-profile-collector-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159028 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" volumeName="kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-kube-api-access-fxxw7" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159037 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="365bf663-fd5b-44df-a327-0438995c015d" volumeName="kubernetes.io/secret/365bf663-fd5b-44df-a327-0438995c015d-proxy-tls" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159046 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d53a4886-db25-43a1-825a-66a9a9a58590" volumeName="kubernetes.io/configmap/d53a4886-db25-43a1-825a-66a9a9a58590-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159055 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20a72c8b-0f12-446b-8a42-53d98864c8f8" volumeName="kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-stats-auth" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159064 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db27bee9-3d33-4c4a-b38b-72f7cec77c7a" volumeName="kubernetes.io/projected/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-kube-api-access-2nbxt" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159074 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f119ffe4-16bd-49eb-916d-b18ba0d79b54" volumeName="kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159084 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4b7f0d8d-a2bf-4550-b6e6-1c56adae827e" volumeName="kubernetes.io/secret/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159094 29936 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="531b8927-92db-4e9d-9a0a-12ff948cdaad" volumeName="kubernetes.io/projected/531b8927-92db-4e9d-9a0a-12ff948cdaad-kube-api-access-xqblj" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159104 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a14df948-1ec4-4785-ad33-28d1e7063959" volumeName="kubernetes.io/configmap/a14df948-1ec4-4785-ad33-28d1e7063959-service-ca-bundle" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159114 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b13885ef-d2b5-4591-825d-446cf8729bc1" volumeName="kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-apiservice-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159125 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba095394-1873-4793-969d-3be979fa0771" volumeName="kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-service-ca-bundle" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159136 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="594aaded-5615-4bed-87ee-6173059a73be" volumeName="kubernetes.io/secret/594aaded-5615-4bed-87ee-6173059a73be-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159146 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c39d2089-d3bf-4556-b6ef-c362a08c21a2" volumeName="kubernetes.io/projected/c39d2089-d3bf-4556-b6ef-c362a08c21a2-kube-api-access-mr9jd" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159156 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf8247a1-703a-46b3-9a33-25a73b27ab99" volumeName="kubernetes.io/secret/cf8247a1-703a-46b3-9a33-25a73b27ab99-signing-key" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159166 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e943438b-1de8-435c-8a19-accd6a6292a4" volumeName="kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159192 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d0792bf-e2da-4ee7-91fe-032299cea42f" volumeName="kubernetes.io/projected/7d0792bf-e2da-4ee7-91fe-032299cea42f-kube-api-access" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159202 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" volumeName="kubernetes.io/projected/b9623eb8-55d2-4c5c-aa8d-74b6a27274d8-kube-api-access-hfl8f" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159211 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4492c55f-701b-4ec8-ada1-0a5dc126d405" volumeName="kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-config" seLinuxMountContext="" Dec 05 
12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159221 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b13885ef-d2b5-4591-825d-446cf8729bc1" volumeName="kubernetes.io/projected/b13885ef-d2b5-4591-825d-446cf8729bc1-kube-api-access-xmjkp" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159230 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e943438b-1de8-435c-8a19-accd6a6292a4" volumeName="kubernetes.io/projected/e943438b-1de8-435c-8a19-accd6a6292a4-kube-api-access-lfknz" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159240 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f119ffe4-16bd-49eb-916d-b18ba0d79b54" volumeName="kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-client" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159253 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce3d73c1-f4bd-4c91-936a-086dfa5e3460" volumeName="kubernetes.io/secret/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-cluster-olm-operator-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159266 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="365bf663-fd5b-44df-a327-0438995c015d" volumeName="kubernetes.io/projected/365bf663-fd5b-44df-a327-0438995c015d-kube-api-access-lqjgb" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159276 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c39d2089-d3bf-4556-b6ef-c362a08c21a2" volumeName="kubernetes.io/secret/c39d2089-d3bf-4556-b6ef-c362a08c21a2-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159366 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1871a9d6-6369-4d08-816f-9c6310b61ddf" volumeName="kubernetes.io/configmap/1871a9d6-6369-4d08-816f-9c6310b61ddf-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159377 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="38941513-e968-45f1-9cb2-b63d40338f36" volumeName="kubernetes.io/configmap/38941513-e968-45f1-9cb2-b63d40338f36-trusted-ca" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159386 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb" volumeName="kubernetes.io/projected/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-kube-api-access-ht5kr" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159396 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="531b8927-92db-4e9d-9a0a-12ff948cdaad" volumeName="kubernetes.io/secret/531b8927-92db-4e9d-9a0a-12ff948cdaad-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159419 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9c31f89c-b01b-4853-a901-bccc25441a46" volumeName="kubernetes.io/projected/9c31f89c-b01b-4853-a901-bccc25441a46-kube-api-access-czcmr" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159429 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d53a4886-db25-43a1-825a-66a9a9a58590" volumeName="kubernetes.io/secret/d53a4886-db25-43a1-825a-66a9a9a58590-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.159998 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="153fec1f-a10b-4c6c-a997-60fa80c13a86" volumeName="kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-ca-certs" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160012 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="99996137-2621-458b-980d-584b3640d4ad" volumeName="kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-kube-rbac-proxy-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160028 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d72b2b71-27b2-4aff-bf69-7054a9556318" volumeName="kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-etcd-serving-ca" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160037 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1ee7a76b-cf1d-4513-b314-5aa314da818d" volumeName="kubernetes.io/configmap/1ee7a76b-cf1d-4513-b314-5aa314da818d-images" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160088 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4e9ba71a-d1b5-4986-babe-2c15c19f9cc2" volumeName="kubernetes.io/empty-dir/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-volume-directive-shadow" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160099 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="708bf629-9949-4b79-a88a-c73ba033475b" volumeName="kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-sysctl-allowlist" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160107 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c39d2089-d3bf-4556-b6ef-c362a08c21a2" volumeName="kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-proxy-ca-bundles" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160118 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e5dfcb1e-1231-4f07-8c21-748965718729" volumeName="kubernetes.io/secret/e5dfcb1e-1231-4f07-8c21-748965718729-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160499 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4492c55f-701b-4ec8-ada1-0a5dc126d405" volumeName="kubernetes.io/projected/4492c55f-701b-4ec8-ada1-0a5dc126d405-kube-api-access-dmq98" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 
kubenswrapper[29936]: I1205 12:50:03.160532 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4e9ba71a-d1b5-4986-babe-2c15c19f9cc2" volumeName="kubernetes.io/projected/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-api-access-4bjs8" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160543 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58187662-b502-4d90-95ce-2aa91a81d256" volumeName="kubernetes.io/projected/58187662-b502-4d90-95ce-2aa91a81d256-kube-api-access-ps4ws" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160572 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8defe125-1529-4091-adff-e9d17a2b298f" volumeName="kubernetes.io/empty-dir/8defe125-1529-4091-adff-e9d17a2b298f-catalog-content" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160584 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a5338041-f213-46ef-9d81-248567ba958d" volumeName="kubernetes.io/projected/a5338041-f213-46ef-9d81-248567ba958d-kube-api-access-bnwdh" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160597 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e5dfcb1e-1231-4f07-8c21-748965718729" volumeName="kubernetes.io/projected/e5dfcb1e-1231-4f07-8c21-748965718729-kube-api-access-pb46q" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160611 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f2635f9f-219b-4d03-b5b3-496c0c836fae" volumeName="kubernetes.io/empty-dir/f2635f9f-219b-4d03-b5b3-496c0c836fae-tmp" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160622 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a280c582-685e-47ac-bf6b-248aa0c129a9" volumeName="kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cluster-baremetal-operator-tls" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160635 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f2635f9f-219b-4d03-b5b3-496c0c836fae" volumeName="kubernetes.io/empty-dir/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-tuned" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160647 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="99996137-2621-458b-980d-584b3640d4ad" volumeName="kubernetes.io/projected/99996137-2621-458b-980d-584b3640d4ad-kube-api-access-c69rc" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160659 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a757f807-e1bf-4f1e-9787-6b4acc8d09cf" volumeName="kubernetes.io/projected/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-kube-api-access-9z8h9" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160672 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb" 
volumeName="kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-etcd-serving-ca" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160683 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1e6babfe-724a-4eab-bb3b-bc318bf57b70" volumeName="kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160694 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9c31f89c-b01b-4853-a901-bccc25441a46" volumeName="kubernetes.io/empty-dir/9c31f89c-b01b-4853-a901-bccc25441a46-catalog-content" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160706 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f119ffe4-16bd-49eb-916d-b18ba0d79b54" volumeName="kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-service-ca" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160717 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1871a9d6-6369-4d08-816f-9c6310b61ddf" volumeName="kubernetes.io/secret/1871a9d6-6369-4d08-816f-9c6310b61ddf-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160728 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce3d73c1-f4bd-4c91-936a-086dfa5e3460" volumeName="kubernetes.io/empty-dir/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-operand-assets" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160863 29936 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160898 29936 factory.go:55] Registering systemd factory Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.160910 29936 factory.go:221] Registration of the systemd container factory successfully Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161018 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f119ffe4-16bd-49eb-916d-b18ba0d79b54" volumeName="kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-ca" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161041 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0dda6d9b-cb3a-413a-85af-ef08f15ea42e" volumeName="kubernetes.io/projected/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-kube-api-access-62nqj" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161053 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1478a21e-b6ac-46fb-ad01-805ac71f0a79" volumeName="kubernetes.io/projected/1478a21e-b6ac-46fb-ad01-805ac71f0a79-kube-api-access-fz4q6" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161069 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="3b741029-0eb5-409b-b7f1-95e8385dc400" volumeName="kubernetes.io/empty-dir/3b741029-0eb5-409b-b7f1-95e8385dc400-cache" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161095 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dbe144b5-3b78-4946-bbf9-b825b0e47b07" volumeName="kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-images" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161107 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="60327040-f782-4cda-a32d-52a4f183073c" volumeName="kubernetes.io/projected/60327040-f782-4cda-a32d-52a4f183073c-kube-api-access-zp957" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161118 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="99996137-2621-458b-980d-584b3640d4ad" volumeName="kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-tls" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161131 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a2acba71-b9dc-4b85-be35-c995b8be2f19" volumeName="kubernetes.io/projected/a2acba71-b9dc-4b85-be35-c995b8be2f19-kube-api-access-nml2g" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161142 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d72b2b71-27b2-4aff-bf69-7054a9556318" volumeName="kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-trusted-ca-bundle" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161154 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d3e283fe-a474-4f83-ad66-62971945060a" volumeName="kubernetes.io/projected/d3e283fe-a474-4f83-ad66-62971945060a-kube-api-access-pjbwh" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161195 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4492c55f-701b-4ec8-ada1-0a5dc126d405" volumeName="kubernetes.io/secret/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovn-node-metrics-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161210 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58187662-b502-4d90-95ce-2aa91a81d256" volumeName="kubernetes.io/configmap/58187662-b502-4d90-95ce-2aa91a81d256-telemetry-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161222 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="909ed395-8ad3-4350-95e3-b4b19c682f92" volumeName="kubernetes.io/secret/909ed395-8ad3-4350-95e3-b4b19c682f92-tls-certificates" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161232 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba095394-1873-4793-969d-3be979fa0771" volumeName="kubernetes.io/projected/ba095394-1873-4793-969d-3be979fa0771-kube-api-access-55qpg" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 
kubenswrapper[29936]: I1205 12:50:03.161243 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20a72c8b-0f12-446b-8a42-53d98864c8f8" volumeName="kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-metrics-certs" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161252 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a45f340c-0eca-4460-8961-4ca360467eeb" volumeName="kubernetes.io/projected/a45f340c-0eca-4460-8961-4ca360467eeb-kube-api-access-r7ftf" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161262 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a5338041-f213-46ef-9d81-248567ba958d" volumeName="kubernetes.io/empty-dir/a5338041-f213-46ef-9d81-248567ba958d-audit-log" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161266 29936 factory.go:153] Registering CRI-O factory Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161283 29936 factory.go:221] Registration of the crio container factory successfully Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161308 29936 factory.go:103] Registering Raw factory Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161323 29936 manager.go:1196] Started watching for new ooms in manager Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161271 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ebfbe878-1796-4a20-b3f0-76165038252e" volumeName="kubernetes.io/projected/ebfbe878-1796-4a20-b3f0-76165038252e-kube-api-access-tncxt" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161362 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f3792522-fec6-4022-90ac-0b8467fcd625" volumeName="kubernetes.io/projected/f3792522-fec6-4022-90ac-0b8467fcd625-kube-api-access-flxbg" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161373 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb" volumeName="kubernetes.io/secret/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-srv-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161387 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1ee7a76b-cf1d-4513-b314-5aa314da818d" volumeName="kubernetes.io/configmap/1ee7a76b-cf1d-4513-b314-5aa314da818d-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161398 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8defe125-1529-4091-adff-e9d17a2b298f" volumeName="kubernetes.io/projected/8defe125-1529-4091-adff-e9d17a2b298f-kube-api-access-jpxqg" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161409 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a2acba71-b9dc-4b85-be35-c995b8be2f19" volumeName="kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: 
I1205 12:50:03.161419 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a5338041-f213-46ef-9d81-248567ba958d" volumeName="kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-client-certs" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161428 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8233dad-bd19-4842-a4d5-cfa84f1feb83" volumeName="kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-env-overrides" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161437 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f3792522-fec6-4022-90ac-0b8467fcd625" volumeName="kubernetes.io/secret/f3792522-fec6-4022-90ac-0b8467fcd625-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161446 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f725fa37-ef11-479a-8cf9-f4b90fe5e7a1" volumeName="kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cni-binary-copy" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161455 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4b7f0d8d-a2bf-4550-b6e6-1c56adae827e" volumeName="kubernetes.io/configmap/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161463 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" volumeName="kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161490 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b74e0607-6ed0-4119-8870-895b7d336830" volumeName="kubernetes.io/empty-dir/b74e0607-6ed0-4119-8870-895b7d336830-catalog-content" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161527 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db27bee9-3d33-4c4a-b38b-72f7cec77c7a" volumeName="kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161538 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db2e54b6-4879-40f4-9359-a8b0c31e76c2" volumeName="kubernetes.io/secret/db2e54b6-4879-40f4-9359-a8b0c31e76c2-profile-collector-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161547 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e5dfcb1e-1231-4f07-8c21-748965718729" volumeName="kubernetes.io/configmap/e5dfcb1e-1231-4f07-8c21-748965718729-auth-proxy-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161556 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f725fa37-ef11-479a-8cf9-f4b90fe5e7a1" 
volumeName="kubernetes.io/projected/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-kube-api-access-x59kd" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161565 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="99996137-2621-458b-980d-584b3640d4ad" volumeName="kubernetes.io/configmap/99996137-2621-458b-980d-584b3640d4ad-metrics-client-ca" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161574 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce9e2a6b-8ce7-477c-8bc7-24033243eabe" volumeName="kubernetes.io/secret/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-metrics-tls" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161585 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d72b2b71-27b2-4aff-bf69-7054a9556318" volumeName="kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161595 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f725fa37-ef11-479a-8cf9-f4b90fe5e7a1" volumeName="kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-daemon-config" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161605 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="480c1f6e-0e13-49f9-bc4e-07350842f16c" volumeName="kubernetes.io/projected/480c1f6e-0e13-49f9-bc4e-07350842f16c-kube-api-access-48ns8" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161644 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4e9ba71a-d1b5-4986-babe-2c15c19f9cc2" volumeName="kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-metrics-client-ca" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161656 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a45f340c-0eca-4460-8961-4ca360467eeb" volumeName="kubernetes.io/secret/a45f340c-0eca-4460-8961-4ca360467eeb-serving-cert" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161665 29936 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b74e0607-6ed0-4119-8870-895b7d336830" volumeName="kubernetes.io/empty-dir/b74e0607-6ed0-4119-8870-895b7d336830-utilities" seLinuxMountContext="" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161676 29936 reconstruct.go:97] "Volume reconstruction finished" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.161687 29936 reconciler.go:26] "Reconciler: start to sync state" Dec 05 12:50:03.162699 master-0 kubenswrapper[29936]: I1205 12:50:03.162031 29936 manager.go:319] Starting recovery of all containers Dec 05 12:50:03.176536 master-0 kubenswrapper[29936]: E1205 12:50:03.163907 29936 kubelet.go:1495] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Dec 05 12:50:03.176536 master-0 kubenswrapper[29936]: I1205 12:50:03.171621 29936 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 05 12:50:03.182912 master-0 kubenswrapper[29936]: I1205 12:50:03.182721 29936 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 05 12:50:03.185669 master-0 kubenswrapper[29936]: I1205 12:50:03.184960 29936 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 05 12:50:03.185669 master-0 kubenswrapper[29936]: I1205 12:50:03.185051 29936 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 05 12:50:03.185669 master-0 kubenswrapper[29936]: I1205 12:50:03.185100 29936 kubelet.go:2335] "Starting kubelet main sync loop" Dec 05 12:50:03.185669 master-0 kubenswrapper[29936]: E1205 12:50:03.185201 29936 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 05 12:50:03.193250 master-0 kubenswrapper[29936]: I1205 12:50:03.193044 29936 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 12:50:03.219928 master-0 kubenswrapper[29936]: I1205 12:50:03.219858 29936 generic.go:334] "Generic (PLEG): container finished" podID="38941513-e968-45f1-9cb2-b63d40338f36" containerID="418f3d79b0988ff7f7ba36537b8459867264703e9c8f702bbd93e4dee2835882" exitCode=0 Dec 05 12:50:03.222272 master-0 kubenswrapper[29936]: I1205 12:50:03.221560 29936 generic.go:334] "Generic (PLEG): container finished" podID="ebfbe878-1796-4a20-b3f0-76165038252e" containerID="6ac9a49c2a57485ce32b61b5b230ca835325f9ead86b65416a1ed194a651372b" exitCode=0 Dec 05 12:50:03.222272 master-0 kubenswrapper[29936]: I1205 12:50:03.221582 29936 generic.go:334] "Generic (PLEG): container finished" podID="ebfbe878-1796-4a20-b3f0-76165038252e" containerID="1edf9b703ee13d520466151dc6a14b9861ec98cd381dcaaddc281b34b9755005" exitCode=0 Dec 05 12:50:03.223115 master-0 kubenswrapper[29936]: I1205 12:50:03.223091 29936 generic.go:334] "Generic (PLEG): container finished" podID="8defe125-1529-4091-adff-e9d17a2b298f" containerID="5fa7eec5b7c19299d0ce6c87bba94066e186df6e6ba2162f840498cde3a19934" exitCode=0 Dec 05 12:50:03.223115 master-0 kubenswrapper[29936]: I1205 12:50:03.223108 29936 generic.go:334] "Generic (PLEG): container finished" podID="8defe125-1529-4091-adff-e9d17a2b298f" containerID="539fe177f9c1deb8d425356a84818b5c05d811c1ab77b966156d70120d25eef1" exitCode=0 Dec 05 12:50:03.225740 master-0 kubenswrapper[29936]: I1205 12:50:03.225716 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm_dbe144b5-3b78-4946-bbf9-b825b0e47b07/config-sync-controllers/0.log" Dec 05 12:50:03.229252 master-0 kubenswrapper[29936]: I1205 12:50:03.227676 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm_dbe144b5-3b78-4946-bbf9-b825b0e47b07/cluster-cloud-controller-manager/0.log" Dec 05 12:50:03.229252 master-0 kubenswrapper[29936]: I1205 12:50:03.227769 29936 generic.go:334] "Generic (PLEG): container finished" podID="dbe144b5-3b78-4946-bbf9-b825b0e47b07" containerID="88bb2ee05e17ca0ccc95842f8e427991824283668dc77c62b2a389be9423149d" exitCode=1 Dec 
05 12:50:03.229252 master-0 kubenswrapper[29936]: I1205 12:50:03.227793 29936 generic.go:334] "Generic (PLEG): container finished" podID="dbe144b5-3b78-4946-bbf9-b825b0e47b07" containerID="b36175e01241a922ef57ef9968701e5af5fa8f55a7287b6d3fe1828d9e78254f" exitCode=1 Dec 05 12:50:03.235739 master-0 kubenswrapper[29936]: I1205 12:50:03.235507 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_af196a48-6fcc-47d1-95ac-7c0acd63dd21/installer/0.log" Dec 05 12:50:03.235739 master-0 kubenswrapper[29936]: I1205 12:50:03.235572 29936 generic.go:334] "Generic (PLEG): container finished" podID="af196a48-6fcc-47d1-95ac-7c0acd63dd21" containerID="7ece5c46267ba23e02ffd4ff25da31145731b6190d3fd68ce186c1aedbb31e5d" exitCode=1 Dec 05 12:50:03.237734 master-0 kubenswrapper[29936]: I1205 12:50:03.237687 29936 generic.go:334] "Generic (PLEG): container finished" podID="96fa3513-5467-4b0f-a03d-9279d36317bd" containerID="0a7d145dbed8d32146e90821257e92134c8804dafe8896f59ec88530e6ad0c4e" exitCode=0 Dec 05 12:50:03.248783 master-0 kubenswrapper[29936]: I1205 12:50:03.248680 29936 generic.go:334] "Generic (PLEG): container finished" podID="c2415969-33ad-418b-9df0-4a6c7bb279db" containerID="dcccd3d0ecb79fd6fed3cade29b8b0d3ae9e791a686c5e6c4f661b34fd1efb10" exitCode=0 Dec 05 12:50:03.255607 master-0 kubenswrapper[29936]: I1205 12:50:03.254951 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-79767b7ff9-h8qkj_5efad170-c154-42ec-a7c0-b36a98d2bfcc/network-operator/2.log" Dec 05 12:50:03.255607 master-0 kubenswrapper[29936]: I1205 12:50:03.255020 29936 generic.go:334] "Generic (PLEG): container finished" podID="5efad170-c154-42ec-a7c0-b36a98d2bfcc" containerID="3507046fb798191fc9af19cde11e6d29feed57ad8ee65fd82dade1b688773700" exitCode=255 Dec 05 12:50:03.258000 master-0 kubenswrapper[29936]: I1205 12:50:03.257612 29936 generic.go:334] "Generic (PLEG): container finished" podID="e7807b90-1059-4c0d-9224-a0d57a572bfc" containerID="ded126662555b11ef5f6022975feef5471a12cb6870d5933adf38dcb51422cc7" exitCode=0 Dec 05 12:50:03.267805 master-0 kubenswrapper[29936]: I1205 12:50:03.267198 29936 generic.go:334] "Generic (PLEG): container finished" podID="21de9318-06b4-42ba-8791-6d22055a04f2" containerID="a6eeacf32c540b469027d242ad82a84ffbe4f8b8381d45f48601d0197961c30d" exitCode=0 Dec 05 12:50:03.283346 master-0 kubenswrapper[29936]: I1205 12:50:03.283297 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5bf4d88c6f-dxd24_f119ffe4-16bd-49eb-916d-b18ba0d79b54/etcd-operator/2.log" Dec 05 12:50:03.283555 master-0 kubenswrapper[29936]: I1205 12:50:03.283350 29936 generic.go:334] "Generic (PLEG): container finished" podID="f119ffe4-16bd-49eb-916d-b18ba0d79b54" containerID="1313a091281e52fd967237a2bce92f6479da56820a8f33315dc25623f650fa7b" exitCode=255 Dec 05 12:50:03.285438 master-0 kubenswrapper[29936]: E1205 12:50:03.285359 29936 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 05 12:50:03.293589 master-0 kubenswrapper[29936]: I1205 12:50:03.292539 29936 generic.go:334] "Generic (PLEG): container finished" podID="1e6babfe-724a-4eab-bb3b-bc318bf57b70" containerID="9059626ad4510705fe438e1803257849f89596beec2662512048f0044416af28" exitCode=0 Dec 05 12:50:03.314473 master-0 kubenswrapper[29936]: I1205 12:50:03.314418 29936 generic.go:334] "Generic (PLEG): container finished" 
podID="4492c55f-701b-4ec8-ada1-0a5dc126d405" containerID="1e1ba9d3a2cd6fc3c76c6b40cc81f5a9fa8707214a43505b547185529870eae9" exitCode=0 Dec 05 12:50:03.328039 master-0 kubenswrapper[29936]: I1205 12:50:03.327767 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_3169f44496ed8a28c6d6a15511ab0eec/kube-rbac-proxy-crio/2.log" Dec 05 12:50:03.328651 master-0 kubenswrapper[29936]: I1205 12:50:03.328621 29936 generic.go:334] "Generic (PLEG): container finished" podID="3169f44496ed8a28c6d6a15511ab0eec" containerID="1071666e1f17d7cf23c890a67d65947ba5ea19368b6f26e80669ec0f695e375b" exitCode=1 Dec 05 12:50:03.328900 master-0 kubenswrapper[29936]: I1205 12:50:03.328759 29936 generic.go:334] "Generic (PLEG): container finished" podID="3169f44496ed8a28c6d6a15511ab0eec" containerID="419ef08e96de1310d58b89d9dc91be12123ef06b7cf2f7b293589e349077e04c" exitCode=0 Dec 05 12:50:03.332871 master-0 kubenswrapper[29936]: I1205 12:50:03.332825 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7cc89f4c4c-n28z2_3b741029-0eb5-409b-b7f1-95e8385dc400/manager/1.log" Dec 05 12:50:03.334222 master-0 kubenswrapper[29936]: I1205 12:50:03.334165 29936 generic.go:334] "Generic (PLEG): container finished" podID="3b741029-0eb5-409b-b7f1-95e8385dc400" containerID="712a042e7fad33cd815c939ca364362f7220e1f1ce6096f34de7bd5630509fb8" exitCode=1 Dec 05 12:50:03.339523 master-0 kubenswrapper[29936]: I1205 12:50:03.339485 29936 generic.go:334] "Generic (PLEG): container finished" podID="2cb8c983acca0c27a191b3f720d4b1e0" containerID="ce5bd605cc76993bca2c497ff38423a9bcba04863edec782efc7ee32483a630a" exitCode=0 Dec 05 12:50:03.341647 master-0 kubenswrapper[29936]: I1205 12:50:03.341622 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/4.log" Dec 05 12:50:03.342082 master-0 kubenswrapper[29936]: I1205 12:50:03.342053 29936 generic.go:334] "Generic (PLEG): container finished" podID="a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7" containerID="a62572546062b2df435bc85f27bda94544b75d65580e59f21beaef134a43b821" exitCode=1 Dec 05 12:50:03.347500 master-0 kubenswrapper[29936]: I1205 12:50:03.347409 29936 generic.go:334] "Generic (PLEG): container finished" podID="9c31f89c-b01b-4853-a901-bccc25441a46" containerID="1f82b253446479fa5b79026be8aaeda5c0a2e403ecedf5e8aa0aa49e59d88903" exitCode=0 Dec 05 12:50:03.347608 master-0 kubenswrapper[29936]: I1205 12:50:03.347506 29936 generic.go:334] "Generic (PLEG): container finished" podID="9c31f89c-b01b-4853-a901-bccc25441a46" containerID="ad4e3fece245cbae80f7af3bfbb0484b4d1681aae90d16b1b170f5d8af892edc" exitCode=0 Dec 05 12:50:03.357114 master-0 kubenswrapper[29936]: I1205 12:50:03.357073 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-765d9ff747-rw57t_807d9093-aa67-4840-b5be-7f3abcc1beed/kube-apiserver-operator/2.log" Dec 05 12:50:03.357404 master-0 kubenswrapper[29936]: I1205 12:50:03.357134 29936 generic.go:334] "Generic (PLEG): container finished" podID="807d9093-aa67-4840-b5be-7f3abcc1beed" containerID="57b070ec273f7f37c3345984984d73c010928b9bb1f5746c1d4e18feee8dc2dc" exitCode=255 Dec 05 12:50:03.359464 master-0 kubenswrapper[29936]: I1205 12:50:03.359445 29936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-67477646d4-9vfxw_0dda6d9b-cb3a-413a-85af-ef08f15ea42e/package-server-manager/0.log" Dec 05 12:50:03.359866 master-0 kubenswrapper[29936]: I1205 12:50:03.359840 29936 generic.go:334] "Generic (PLEG): container finished" podID="0dda6d9b-cb3a-413a-85af-ef08f15ea42e" containerID="494ce5c3826b0b94b974fd41d16b7ba6517fb0d007e27462a2d7b33e01aa4443" exitCode=1 Dec 05 12:50:03.376086 master-0 kubenswrapper[29936]: I1205 12:50:03.376032 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_076dafdf-a5d2-4e2d-9c38-6932910f7327/installer/0.log" Dec 05 12:50:03.376333 master-0 kubenswrapper[29936]: I1205 12:50:03.376094 29936 generic.go:334] "Generic (PLEG): container finished" podID="076dafdf-a5d2-4e2d-9c38-6932910f7327" containerID="f8dc47e77bee6411ef3a450c0123b8279b91a4729700211ae01112ac79fa1d1e" exitCode=1 Dec 05 12:50:03.383994 master-0 kubenswrapper[29936]: I1205 12:50:03.383968 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-7df95c79b5-ldg5j_531b8927-92db-4e9d-9a0a-12ff948cdaad/control-plane-machine-set-operator/0.log" Dec 05 12:50:03.384204 master-0 kubenswrapper[29936]: I1205 12:50:03.384161 29936 generic.go:334] "Generic (PLEG): container finished" podID="531b8927-92db-4e9d-9a0a-12ff948cdaad" containerID="5227d615ebfc1e16e53996d356380f47ad9e8fd55349d0658112ccb54f8ef1bb" exitCode=1 Dec 05 12:50:03.401722 master-0 kubenswrapper[29936]: I1205 12:50:03.401671 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-77758bc754-hfqsp_f3792522-fec6-4022-90ac-0b8467fcd625/service-ca-operator/1.log" Dec 05 12:50:03.402023 master-0 kubenswrapper[29936]: I1205 12:50:03.401741 29936 generic.go:334] "Generic (PLEG): container finished" podID="f3792522-fec6-4022-90ac-0b8467fcd625" containerID="e23a95f64400e0dbc7fc95b5f04dddc2d0290c35a1bcdf186ddbd3cd8314a14f" exitCode=255 Dec 05 12:50:03.412534 master-0 kubenswrapper[29936]: I1205 12:50:03.412288 29936 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="48cc412fc0495a9b989b3163afe32a67e585bd82e370a59d4690f30fe1abc9dc" exitCode=0 Dec 05 12:50:03.412534 master-0 kubenswrapper[29936]: I1205 12:50:03.412322 29936 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="d98d05970b7b2ac04c6af16edb9c07e4ea790e687fa82b42828f83752f9655a5" exitCode=0 Dec 05 12:50:03.412534 master-0 kubenswrapper[29936]: I1205 12:50:03.412330 29936 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="678a3e3b29045fc802f2f4ea9939ca067adfe6ff12b24bb2dd5f895390e55a41" exitCode=0 Dec 05 12:50:03.412534 master-0 kubenswrapper[29936]: I1205 12:50:03.412337 29936 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="46d777da61d52678086a53c15e814977a05f1e509e1945fa53a5e65cac047f51" exitCode=0 Dec 05 12:50:03.412534 master-0 kubenswrapper[29936]: I1205 12:50:03.412344 29936 generic.go:334] "Generic (PLEG): container finished" podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="d81a6813a03e38c556e737371d737471f12aa2c77281926715e2cfe7ffc056aa" exitCode=0 Dec 05 12:50:03.412534 master-0 kubenswrapper[29936]: I1205 12:50:03.412350 29936 generic.go:334] "Generic (PLEG): container finished" 
podID="708bf629-9949-4b79-a88a-c73ba033475b" containerID="503a0b99be77d72f51d7afcf8403bc7d040b77fef62f126cd910c2ff4b520892" exitCode=0 Dec 05 12:50:03.416005 master-0 kubenswrapper[29936]: I1205 12:50:03.415972 29936 generic.go:334] "Generic (PLEG): container finished" podID="cf8247a1-703a-46b3-9a33-25a73b27ab99" containerID="48561b92390271bf5bcb9ad8430184be011980a59e84d647901af23e4a1dea25" exitCode=0 Dec 05 12:50:03.421535 master-0 kubenswrapper[29936]: I1205 12:50:03.421466 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-b9c5dfc78-2n8gt_7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9/kube-storage-version-migrator-operator/2.log" Dec 05 12:50:03.422464 master-0 kubenswrapper[29936]: I1205 12:50:03.421532 29936 generic.go:334] "Generic (PLEG): container finished" podID="7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9" containerID="49b30bfe4873642e053b957d836ec7eddcac24e11ada8325b8fc8b72bfefafe8" exitCode=255 Dec 05 12:50:03.478724 master-0 kubenswrapper[29936]: I1205 12:50:03.478681 29936 generic.go:334] "Generic (PLEG): container finished" podID="58d12e893528ad53a994f10901a644ea" containerID="2a8e7f38b128627544fcfe08f2d0eef9ae364770a9037f3dac3761d553a8ed98" exitCode=0 Dec 05 12:50:03.479276 master-0 kubenswrapper[29936]: I1205 12:50:03.479258 29936 generic.go:334] "Generic (PLEG): container finished" podID="58d12e893528ad53a994f10901a644ea" containerID="073375b200bc70b30fb0cad0a5ecce97a68446c026019d9c52074056ad94e0a7" exitCode=0 Dec 05 12:50:03.479418 master-0 kubenswrapper[29936]: I1205 12:50:03.479396 29936 generic.go:334] "Generic (PLEG): container finished" podID="58d12e893528ad53a994f10901a644ea" containerID="7fe2ad5243db75a4d0831218b0b4d047af3794e202e2009112af905d4919bd2b" exitCode=0 Dec 05 12:50:03.482785 master-0 kubenswrapper[29936]: I1205 12:50:03.482344 29936 generic.go:334] "Generic (PLEG): container finished" podID="f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb" containerID="c9cbf8e5df58cf6c6aff3967b76368b2b683cdb47115f76abdee2db7c46ae76b" exitCode=0 Dec 05 12:50:03.485603 master-0 kubenswrapper[29936]: E1205 12:50:03.485570 29936 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 05 12:50:03.488376 master-0 kubenswrapper[29936]: I1205 12:50:03.487965 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_d627fcf3-2a80-4739-add9-e21ad4efc6eb/installer/0.log" Dec 05 12:50:03.488376 master-0 kubenswrapper[29936]: I1205 12:50:03.488017 29936 generic.go:334] "Generic (PLEG): container finished" podID="d627fcf3-2a80-4739-add9-e21ad4efc6eb" containerID="8654b600b7307ea1bcd3fe84275fb56084c5722cbe5ccf524025cea2bfa3d8cd" exitCode=1 Dec 05 12:50:03.492803 master-0 kubenswrapper[29936]: I1205 12:50:03.492741 29936 generic.go:334] "Generic (PLEG): container finished" podID="c39d2089-d3bf-4556-b6ef-c362a08c21a2" containerID="fa4f02496398ccdc5c55acbb60e75e3c69d9850820e087e65cbe9d00bf63d07e" exitCode=0 Dec 05 12:50:03.495563 master-0 kubenswrapper[29936]: I1205 12:50:03.495503 29936 generic.go:334] "Generic (PLEG): container finished" podID="20a72c8b-0f12-446b-8a42-53d98864c8f8" containerID="b7483a678d691fbf8a3207dd7d6ed1c739a3647a4a630049897502326cc17230" exitCode=0 Dec 05 12:50:03.495713 master-0 kubenswrapper[29936]: I1205 12:50:03.495644 29936 generic.go:334] "Generic (PLEG): container finished" podID="20a72c8b-0f12-446b-8a42-53d98864c8f8" 
containerID="eb64e2421ed896450777b0f0f93d8bac59a879b9f30c7599b0e2a7c59b1f3be8" exitCode=0 Dec 05 12:50:03.499071 master-0 kubenswrapper[29936]: I1205 12:50:03.499020 29936 generic.go:334] "Generic (PLEG): container finished" podID="60327040-f782-4cda-a32d-52a4f183073c" containerID="5021d0ebd02a2ebd7ed1f4a980629b114fcca13491901c53a97391580abdd083" exitCode=0 Dec 05 12:50:03.500811 master-0 kubenswrapper[29936]: I1205 12:50:03.500773 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-78f758c7b9-5xg2k_a280c582-685e-47ac-bf6b-248aa0c129a9/cluster-baremetal-operator/0.log" Dec 05 12:50:03.501006 master-0 kubenswrapper[29936]: I1205 12:50:03.500932 29936 generic.go:334] "Generic (PLEG): container finished" podID="a280c582-685e-47ac-bf6b-248aa0c129a9" containerID="502462f2915d6fff82c1a557ec2a9e24c7fbeef3a6daff0dad2cf5862df79899" exitCode=1 Dec 05 12:50:03.502779 master-0 kubenswrapper[29936]: I1205 12:50:03.502122 29936 generic.go:334] "Generic (PLEG): container finished" podID="4d215811-6210-4ec2-8356-f1533dc43f65" containerID="419f6f30a7830337f1a96ed401ad15741b6815b1dc5b3d9cd59d5f9c8beb4aa8" exitCode=0 Dec 05 12:50:03.511593 master-0 kubenswrapper[29936]: I1205 12:50:03.511457 29936 generic.go:334] "Generic (PLEG): container finished" podID="b89698aa356a3bc32694e2b098f9a900" containerID="1bbd4f368bad5edbbd435da376ff1fe1a1eb948351d43f8a86c24d7830ed7a2a" exitCode=0 Dec 05 12:50:03.515626 master-0 kubenswrapper[29936]: I1205 12:50:03.515561 29936 generic.go:334] "Generic (PLEG): container finished" podID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerID="bbc65050e19c8e05efbd98764627a92089e068c4fefa760a423cea0a25acab48" exitCode=0 Dec 05 12:50:03.515732 master-0 kubenswrapper[29936]: I1205 12:50:03.515666 29936 generic.go:334] "Generic (PLEG): container finished" podID="d72b2b71-27b2-4aff-bf69-7054a9556318" containerID="836113a149a4eefb4c2ce8d65a7d2c1b43cd3294cab879526b98ff307bc6e81d" exitCode=0 Dec 05 12:50:03.519760 master-0 kubenswrapper[29936]: I1205 12:50:03.519701 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-xwx26_b8233dad-bd19-4842-a4d5-cfa84f1feb83/approver/1.log" Dec 05 12:50:03.520654 master-0 kubenswrapper[29936]: I1205 12:50:03.520246 29936 generic.go:334] "Generic (PLEG): container finished" podID="b8233dad-bd19-4842-a4d5-cfa84f1feb83" containerID="efd0af11329fc9886861d20bcf790f4afa476fb62b8a37aabf75eec470dca7ba" exitCode=1 Dec 05 12:50:03.528728 master-0 kubenswrapper[29936]: I1205 12:50:03.528681 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-7ff994598c-lgc7z_58187662-b502-4d90-95ce-2aa91a81d256/cluster-monitoring-operator/0.log" Dec 05 12:50:03.528918 master-0 kubenswrapper[29936]: I1205 12:50:03.528755 29936 generic.go:334] "Generic (PLEG): container finished" podID="58187662-b502-4d90-95ce-2aa91a81d256" containerID="d0c256d51be6b67ce11d61e05ebacd6a747bab028d852541d977f5d77734ba1a" exitCode=1 Dec 05 12:50:03.537847 master-0 kubenswrapper[29936]: I1205 12:50:03.537801 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f85974995-4vsjv_1871a9d6-6369-4d08-816f-9c6310b61ddf/kube-scheduler-operator-container/1.log" Dec 05 12:50:03.537952 master-0 kubenswrapper[29936]: I1205 12:50:03.537855 29936 generic.go:334] "Generic (PLEG): container finished" podID="1871a9d6-6369-4d08-816f-9c6310b61ddf" 
containerID="6017aa71f5b63d7cdd32da77565edb00ec8b8b5e5059d49e4246bd4f05d6b50b" exitCode=255 Dec 05 12:50:03.540758 master-0 kubenswrapper[29936]: I1205 12:50:03.540707 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-848f645654-g6nj5_594aaded-5615-4bed-87ee-6173059a73be/kube-controller-manager-operator/2.log" Dec 05 12:50:03.543328 master-0 kubenswrapper[29936]: I1205 12:50:03.543257 29936 generic.go:334] "Generic (PLEG): container finished" podID="594aaded-5615-4bed-87ee-6173059a73be" containerID="7a5d71f74727c7976f8b5ae99bcc9e973922d3b63fcffbddb6ae9dd46ae8aa22" exitCode=255 Dec 05 12:50:03.548399 master-0 kubenswrapper[29936]: I1205 12:50:03.548295 29936 generic.go:334] "Generic (PLEG): container finished" podID="4957e218-f580-41a9-866a-fd4f92a3c007" containerID="eed2e77d9f832d089463e6b1b5c8775d3273e95a2de91d82d1ec20f52035753f" exitCode=0 Dec 05 12:50:03.568734 master-0 kubenswrapper[29936]: I1205 12:50:03.568678 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-6c8676f99d-546vz_d53a4886-db25-43a1-825a-66a9a9a58590/openshift-controller-manager-operator/2.log" Dec 05 12:50:03.568965 master-0 kubenswrapper[29936]: I1205 12:50:03.568747 29936 generic.go:334] "Generic (PLEG): container finished" podID="d53a4886-db25-43a1-825a-66a9a9a58590" containerID="d299754c006efdd8044d5689f796048069701078c77f429aafcfe9eafc6522ec" exitCode=255 Dec 05 12:50:03.575738 master-0 kubenswrapper[29936]: I1205 12:50:03.575694 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-85cff47f46-p9xtc_a2acba71-b9dc-4b85-be35-c995b8be2f19/cluster-node-tuning-operator/0.log" Dec 05 12:50:03.576023 master-0 kubenswrapper[29936]: I1205 12:50:03.575993 29936 generic.go:334] "Generic (PLEG): container finished" podID="a2acba71-b9dc-4b85-be35-c995b8be2f19" containerID="0ef8d356dac19c1922c065326d7809108046e5a2cd059d5d50b5229acd7007ec" exitCode=1 Dec 05 12:50:03.586092 master-0 kubenswrapper[29936]: I1205 12:50:03.586039 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-7bf7f6b755-b2pxs_4b7f0d8d-a2bf-4550-b6e6-1c56adae827e/openshift-apiserver-operator/1.log" Dec 05 12:50:03.586390 master-0 kubenswrapper[29936]: I1205 12:50:03.586103 29936 generic.go:334] "Generic (PLEG): container finished" podID="4b7f0d8d-a2bf-4550-b6e6-1c56adae827e" containerID="45f66b524a7ae3f72102f1c1f147264a8f0120c9900c09db2778e03215486e8a" exitCode=255 Dec 05 12:50:03.599021 master-0 kubenswrapper[29936]: I1205 12:50:03.598966 29936 generic.go:334] "Generic (PLEG): container finished" podID="8b47694fcc32464ab24d09c23d6efb57" containerID="3287f56a58ec6df79eb961042eccb67f5309daab6cc145e4e1caa74cca9833e8" exitCode=0 Dec 05 12:50:03.615326 master-0 kubenswrapper[29936]: I1205 12:50:03.615286 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5nqhk_f725fa37-ef11-479a-8cf9-f4b90fe5e7a1/kube-multus/0.log" Dec 05 12:50:03.615549 master-0 kubenswrapper[29936]: I1205 12:50:03.615346 29936 generic.go:334] "Generic (PLEG): container finished" podID="f725fa37-ef11-479a-8cf9-f4b90fe5e7a1" containerID="60d32869d5d76c04555375fdfd9ab0f008a07a41f85b96737cde09fadc0deeb4" exitCode=1 Dec 05 12:50:03.618124 master-0 kubenswrapper[29936]: I1205 12:50:03.618088 29936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_565d5ef6-b0e7-4f04-9460-61f1d3903d37/installer/0.log" Dec 05 12:50:03.618246 master-0 kubenswrapper[29936]: I1205 12:50:03.618166 29936 generic.go:334] "Generic (PLEG): container finished" podID="565d5ef6-b0e7-4f04-9460-61f1d3903d37" containerID="1cb443e02b64a65178050b34e99e50f308c86d2ef5b4e7e730bfa0faf58cc53e" exitCode=1 Dec 05 12:50:03.621989 master-0 kubenswrapper[29936]: I1205 12:50:03.621953 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6b958b6f94-7r5wv_b9623eb8-55d2-4c5c-aa8d-74b6a27274d8/snapshot-controller/4.log" Dec 05 12:50:03.622079 master-0 kubenswrapper[29936]: I1205 12:50:03.622031 29936 generic.go:334] "Generic (PLEG): container finished" podID="b9623eb8-55d2-4c5c-aa8d-74b6a27274d8" containerID="33a709d9e47c123942b76dd36410ef83571393334a41e347b73753c3f8332654" exitCode=1 Dec 05 12:50:03.630792 master-0 kubenswrapper[29936]: I1205 12:50:03.630741 29936 generic.go:334] "Generic (PLEG): container finished" podID="a45f340c-0eca-4460-8961-4ca360467eeb" containerID="06eca27e0fe884f90bd62d903b17dde7161c7cd5f8bd04b4c9959d40b8706ecb" exitCode=0 Dec 05 12:50:03.643148 master-0 kubenswrapper[29936]: I1205 12:50:03.643099 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-56fcb6cc5f-q9njf_ce3d73c1-f4bd-4c91-936a-086dfa5e3460/cluster-olm-operator/1.log" Dec 05 12:50:03.643918 master-0 kubenswrapper[29936]: I1205 12:50:03.643846 29936 generic.go:334] "Generic (PLEG): container finished" podID="ce3d73c1-f4bd-4c91-936a-086dfa5e3460" containerID="551cae19e34d48fca769f493289bed8d81be6209860af5e4e43fa9850482cf12" exitCode=255 Dec 05 12:50:03.643918 master-0 kubenswrapper[29936]: I1205 12:50:03.643911 29936 generic.go:334] "Generic (PLEG): container finished" podID="ce3d73c1-f4bd-4c91-936a-086dfa5e3460" containerID="020f4fb4f4314f00ea400478b93e32903a1a30b5d332647ebe9614d7f944a537" exitCode=0 Dec 05 12:50:03.644065 master-0 kubenswrapper[29936]: I1205 12:50:03.643927 29936 generic.go:334] "Generic (PLEG): container finished" podID="ce3d73c1-f4bd-4c91-936a-086dfa5e3460" containerID="373b9eebb249846584e2d3e04b61f1d2ede61eec7ddbb37f633ff477767fcf89" exitCode=0 Dec 05 12:50:03.647313 master-0 kubenswrapper[29936]: I1205 12:50:03.647283 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-74d9cbffbc-r7kbd_db27bee9-3d33-4c4a-b38b-72f7cec77c7a/machine-approver-controller/0.log" Dec 05 12:50:03.648484 master-0 kubenswrapper[29936]: I1205 12:50:03.648416 29936 generic.go:334] "Generic (PLEG): container finished" podID="db27bee9-3d33-4c4a-b38b-72f7cec77c7a" containerID="e4b5bdd189732f9903e53555a7a61c0d10d37cd90596a4c760274c5cce948d5d" exitCode=255 Dec 05 12:50:03.651956 master-0 kubenswrapper[29936]: I1205 12:50:03.651858 29936 generic.go:334] "Generic (PLEG): container finished" podID="a757f807-e1bf-4f1e-9787-6b4acc8d09cf" containerID="11b2f7447856caa8c6cb51432e0d7392e86f64482a9fae5b398a57d71719f20e" exitCode=0 Dec 05 12:50:03.657099 master-0 kubenswrapper[29936]: I1205 12:50:03.657052 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-6c968fdfdf-xxmfp_ba095394-1873-4793-969d-3be979fa0771/authentication-operator/2.log" Dec 05 12:50:03.657283 master-0 kubenswrapper[29936]: I1205 12:50:03.657119 29936 generic.go:334] "Generic (PLEG): container finished" 
podID="ba095394-1873-4793-969d-3be979fa0771" containerID="23c3457474b764927ea9138fc09fe8b0080b3d5dcfc7a8c9d9bd33c7ad79d98a" exitCode=255 Dec 05 12:50:03.661130 master-0 kubenswrapper[29936]: I1205 12:50:03.661079 29936 generic.go:334] "Generic (PLEG): container finished" podID="b74e0607-6ed0-4119-8870-895b7d336830" containerID="ea572c6fcc8d460ca830971971bae224cadfb5879734d2e8d7b9add3c483a937" exitCode=0 Dec 05 12:50:03.661130 master-0 kubenswrapper[29936]: I1205 12:50:03.661115 29936 generic.go:334] "Generic (PLEG): container finished" podID="b74e0607-6ed0-4119-8870-895b7d336830" containerID="05c868179fe699a72c6f244f8706f4870b83c4369ed24818820567f21e6d96f4" exitCode=0 Dec 05 12:50:03.664489 master-0 kubenswrapper[29936]: I1205 12:50:03.664413 29936 generic.go:334] "Generic (PLEG): container finished" podID="954c5c79-a96c-4c47-a4bc-024aaf4dc789" containerID="6a64d74f0d5ef7e0f5020ef79722fa9a1cfa622ec3d5ca7d9169d099609498b7" exitCode=0 Dec 05 12:50:03.677491 master-0 kubenswrapper[29936]: I1205 12:50:03.672218 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-7cbd59c7f8-d9g7k_153fec1f-a10b-4c6c-a997-60fa80c13a86/manager/1.log" Dec 05 12:50:03.678105 master-0 kubenswrapper[29936]: I1205 12:50:03.678003 29936 generic.go:334] "Generic (PLEG): container finished" podID="153fec1f-a10b-4c6c-a997-60fa80c13a86" containerID="7b0e1392f4706a31c5e08db223b1244b230bf09a2ede6f19588e74a4a3860cf4" exitCode=1 Dec 05 12:50:03.853589 master-0 kubenswrapper[29936]: I1205 12:50:03.853533 29936 manager.go:324] Recovery completed Dec 05 12:50:03.891437 master-0 kubenswrapper[29936]: E1205 12:50:03.891307 29936 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 05 12:50:03.979211 master-0 kubenswrapper[29936]: I1205 12:50:03.979159 29936 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 05 12:50:03.979211 master-0 kubenswrapper[29936]: I1205 12:50:03.979201 29936 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 05 12:50:03.979211 master-0 kubenswrapper[29936]: I1205 12:50:03.979223 29936 state_mem.go:36] "Initialized new in-memory state store" Dec 05 12:50:03.979824 master-0 kubenswrapper[29936]: I1205 12:50:03.979399 29936 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 05 12:50:03.979824 master-0 kubenswrapper[29936]: I1205 12:50:03.979410 29936 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 05 12:50:03.979824 master-0 kubenswrapper[29936]: I1205 12:50:03.979428 29936 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Dec 05 12:50:03.979824 master-0 kubenswrapper[29936]: I1205 12:50:03.979434 29936 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Dec 05 12:50:03.979824 master-0 kubenswrapper[29936]: I1205 12:50:03.979440 29936 policy_none.go:49] "None policy: Start" Dec 05 12:50:03.985026 master-0 kubenswrapper[29936]: I1205 12:50:03.984967 29936 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 05 12:50:03.985097 master-0 kubenswrapper[29936]: I1205 12:50:03.985068 29936 state_mem.go:35] "Initializing new in-memory state store" Dec 05 12:50:03.985585 master-0 kubenswrapper[29936]: I1205 12:50:03.985552 29936 state_mem.go:75] "Updated machine memory state" Dec 05 12:50:03.985585 master-0 kubenswrapper[29936]: I1205 12:50:03.985573 29936 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Dec 05 12:50:04.007596 master-0 
kubenswrapper[29936]: I1205 12:50:04.007539 29936 manager.go:334] "Starting Device Plugin manager" Dec 05 12:50:04.007852 master-0 kubenswrapper[29936]: I1205 12:50:04.007639 29936 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 05 12:50:04.007852 master-0 kubenswrapper[29936]: I1205 12:50:04.007653 29936 server.go:79] "Starting device plugin registration server" Dec 05 12:50:04.008157 master-0 kubenswrapper[29936]: I1205 12:50:04.008125 29936 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 05 12:50:04.008251 master-0 kubenswrapper[29936]: I1205 12:50:04.008145 29936 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 05 12:50:04.008302 master-0 kubenswrapper[29936]: I1205 12:50:04.008290 29936 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 05 12:50:04.008383 master-0 kubenswrapper[29936]: I1205 12:50:04.008363 29936 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 05 12:50:04.008383 master-0 kubenswrapper[29936]: I1205 12:50:04.008377 29936 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 05 12:50:04.099058 master-0 kubenswrapper[29936]: I1205 12:50:04.097606 29936 apiserver.go:52] "Watching apiserver" Dec 05 12:50:04.108688 master-0 kubenswrapper[29936]: I1205 12:50:04.108641 29936 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 12:50:04.112453 master-0 kubenswrapper[29936]: I1205 12:50:04.112419 29936 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 05 12:50:04.112732 master-0 kubenswrapper[29936]: I1205 12:50:04.112718 29936 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 05 12:50:04.112824 master-0 kubenswrapper[29936]: I1205 12:50:04.112813 29936 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 05 12:50:04.113651 master-0 kubenswrapper[29936]: I1205 12:50:04.113634 29936 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 05 12:50:04.129794 master-0 kubenswrapper[29936]: I1205 12:50:04.129577 29936 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Dec 05 12:50:04.129794 master-0 kubenswrapper[29936]: I1205 12:50:04.129702 29936 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Dec 05 12:50:04.132433 master-0 kubenswrapper[29936]: I1205 12:50:04.131246 29936 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 12:50:04.692277 master-0 kubenswrapper[29936]: I1205 12:50:04.692171 29936 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0"] Dec 05 12:50:04.692843 master-0 kubenswrapper[29936]: I1205 12:50:04.692763 29936 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs","openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24","openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z","openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk","openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv","openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp","openshift-kube-apiserver/installer-1-master-0","openshift-marketplace/redhat-operators-wfk7f","openshift-marketplace/community-operators-2pp25","openshift-multus/multus-5nqhk","openshift-multus/network-metrics-daemon-99djw","openshift-network-diagnostics/network-check-target-qsggt","openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw","openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx","assisted-installer/assisted-installer-controller-m6pn4","openshift-apiserver/apiserver-845d4454f8-kcq9s","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc","openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf","openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c","openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-machine-config-operator/machine-config-server-4s89l","openshift-network-node-identity/network-node-identity-xwx26","openshift-network-operator/network-operator-79767b7ff9-h8qkj","openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz","openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c","openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x","openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm","openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv","openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw","openshift-kube-controller-manager/installer-3-master-0","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt","openshift-monitoring/kube-state-metrics-5857974f64-8p9n7","openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv","openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj","openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t","openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k","openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m","openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv","openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6","openshift-machine-api/machine-api-operator-88d48b57d-x947v","openshift-service-ca/service-ca-77c99c46b8-44qrw","openshift-dns/dns-default-rzl84","openshift-etcd/installer-2-master-0","openshift-insights/insights-operator-55965856b6-q9qdg","openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr","openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx","openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6","openshift-machine-config-operator/machine-config-daemon-45nwc","openshift-marketplace/marketplace-operator-f797b99b6-vwhxt","openshift
-monitoring/metrics-server-54c5748c8c-kqs7s","openshift-network-operator/iptables-alerter-nwplt","openshift-etcd/etcd-master-0","openshift-ingress-operator/ingress-operator-8649c48786-7xrk6","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-multus/multus-admission-controller-8dbbb5754-j7x5j","openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv","openshift-etcd/installer-1-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r","openshift-marketplace/certified-operators-4p8p6","openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp","openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj","openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5","openshift-monitoring/node-exporter-z2nmc","openshift-ovn-kubernetes/ovnkube-node-9vqtb","openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j","openshift-multus/multus-additional-cni-plugins-prt97","openshift-cluster-node-tuning-operator/tuned-dcvtr","openshift-kube-scheduler/installer-4-master-0","openshift-kube-scheduler/installer-5-master-0","openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k","openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z","openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2","openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5","openshift-kube-apiserver/installer-3-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0","openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb","openshift-network-diagnostics/network-check-source-85d8db45d4-kkllg","openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7","openshift-dns/node-resolver-f6j7m","openshift-ingress/router-default-5465c8b4db-dzlmb","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-controller-manager/installer-1-master-0","openshift-kube-controller-manager/installer-2-master-0","openshift-marketplace/redhat-marketplace-dmnvq"] Dec 05 12:50:04.692985 master-0 kubenswrapper[29936]: I1205 12:50:04.692894 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-m6pn4" Dec 05 12:50:04.705907 master-0 kubenswrapper[29936]: I1205 12:50:04.705841 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 12:50:04.705907 master-0 kubenswrapper[29936]: I1205 12:50:04.705882 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 05 12:50:04.706213 master-0 kubenswrapper[29936]: I1205 12:50:04.705911 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 12:50:04.706213 master-0 kubenswrapper[29936]: I1205 12:50:04.705977 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 12:50:04.706213 master-0 kubenswrapper[29936]: I1205 12:50:04.706155 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 05 12:50:04.706213 master-0 kubenswrapper[29936]: I1205 12:50:04.706208 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 05 12:50:04.706493 master-0 kubenswrapper[29936]: I1205 12:50:04.706440 29936 kubelet.go:2566] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="358b1a0e-7600-48d9-9639-b356d354dad2" Dec 05 12:50:04.707790 master-0 kubenswrapper[29936]: I1205 12:50:04.707758 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 05 12:50:04.708266 master-0 kubenswrapper[29936]: I1205 12:50:04.708236 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Dec 05 12:50:04.708901 master-0 kubenswrapper[29936]: I1205 12:50:04.708615 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 05 12:50:04.708901 master-0 kubenswrapper[29936]: I1205 12:50:04.708613 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 05 12:50:04.709725 master-0 kubenswrapper[29936]: I1205 12:50:04.709700 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 12:50:04.735258 master-0 kubenswrapper[29936]: I1205 12:50:04.734965 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 12:50:04.737009 master-0 kubenswrapper[29936]: I1205 12:50:04.736831 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 12:50:04.737542 master-0 kubenswrapper[29936]: I1205 12:50:04.737517 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 12:50:04.737775 master-0 kubenswrapper[29936]: I1205 12:50:04.737754 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 12:50:04.737819 master-0 kubenswrapper[29936]: I1205 12:50:04.737774 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Dec 05 12:50:04.737853 master-0 kubenswrapper[29936]: I1205 12:50:04.737812 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 12:50:04.737880 master-0 kubenswrapper[29936]: I1205 12:50:04.737847 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 12:50:04.737946 master-0 kubenswrapper[29936]: I1205 12:50:04.737888 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 12:50:04.738084 master-0 kubenswrapper[29936]: I1205 12:50:04.738055 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 12:50:04.738229 master-0 kubenswrapper[29936]: I1205 12:50:04.738197 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 12:50:04.738276 master-0 kubenswrapper[29936]: I1205 12:50:04.738234 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 12:50:04.738316 master-0 kubenswrapper[29936]: I1205 12:50:04.738260 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 12:50:04.738657 master-0 kubenswrapper[29936]: I1205 12:50:04.738605 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 12:50:04.738835 master-0 kubenswrapper[29936]: I1205 12:50:04.738621 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 12:50:04.739566 master-0 kubenswrapper[29936]: E1205 12:50:04.739532 29936 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-startup-monitor-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:04.746343 master-0 kubenswrapper[29936]: I1205 12:50:04.739800 29936 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 12:50:04.746343 master-0 kubenswrapper[29936]: I1205 12:50:04.738647 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 12:50:04.746343 master-0 kubenswrapper[29936]: I1205 12:50:04.738999 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 12:50:04.746343 master-0 kubenswrapper[29936]: I1205 12:50:04.739007 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 12:50:04.746343 master-0 kubenswrapper[29936]: I1205 12:50:04.739131 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 12:50:04.746343 master-0 kubenswrapper[29936]: I1205 12:50:04.739218 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 12:50:04.746343 master-0 kubenswrapper[29936]: I1205 12:50:04.739213 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 12:50:04.746343 master-0 kubenswrapper[29936]: I1205 12:50:04.739415 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 12:50:04.746343 master-0 kubenswrapper[29936]: I1205 12:50:04.744900 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Dec 05 12:50:04.746343 master-0 kubenswrapper[29936]: I1205 12:50:04.745047 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 05 12:50:04.746859 master-0 kubenswrapper[29936]: I1205 12:50:04.746464 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 05 12:50:04.781245 master-0 kubenswrapper[29936]: I1205 12:50:04.780588 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Dec 05 12:50:04.781461 master-0 kubenswrapper[29936]: I1205 12:50:04.781346 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" Dec 05 12:50:04.783036 master-0 kubenswrapper[29936]: I1205 12:50:04.782971 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-45nwc" event={"ID":"95d8fb27-8b2b-4749-add3-9e9b16edb693","Type":"ContainerStarted","Data":"8419f43f14005852c093325fa596baaf624f2da2d38299ead3523e1bbf468c70"} Dec 05 12:50:04.783105 master-0 kubenswrapper[29936]: I1205 12:50:04.783036 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-45nwc" event={"ID":"95d8fb27-8b2b-4749-add3-9e9b16edb693","Type":"ContainerStarted","Data":"6beaecf0540643cd8682361578d468ced3e3fd0c3495c281547ab1933154b6de"} Dec 05 12:50:04.783105 master-0 kubenswrapper[29936]: I1205 12:50:04.783051 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" event={"ID":"38941513-e968-45f1-9cb2-b63d40338f36","Type":"ContainerStarted","Data":"422041b3d2323dfdeb50d410c114367777627894d6f0b8ffccb3a7e50a46157a"} Dec 05 12:50:04.783105 master-0 kubenswrapper[29936]: I1205 12:50:04.783065 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" event={"ID":"38941513-e968-45f1-9cb2-b63d40338f36","Type":"ContainerDied","Data":"418f3d79b0988ff7f7ba36537b8459867264703e9c8f702bbd93e4dee2835882"} Dec 05 12:50:04.783105 master-0 kubenswrapper[29936]: I1205 12:50:04.783078 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" event={"ID":"38941513-e968-45f1-9cb2-b63d40338f36","Type":"ContainerStarted","Data":"065b5ff0754f03af8b21df75fad6ff50fe29b9c92ca5f839b6b57c232043c975"} Dec 05 12:50:04.783105 master-0 kubenswrapper[29936]: I1205 12:50:04.783088 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmnvq" event={"ID":"ebfbe878-1796-4a20-b3f0-76165038252e","Type":"ContainerStarted","Data":"a8cae8900ae7cce8ceb6b634d4c10f86d39b83230027bdc07c4c7d3d67f473e8"} Dec 05 12:50:04.783105 master-0 kubenswrapper[29936]: I1205 12:50:04.783099 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmnvq" event={"ID":"ebfbe878-1796-4a20-b3f0-76165038252e","Type":"ContainerDied","Data":"6ac9a49c2a57485ce32b61b5b230ca835325f9ead86b65416a1ed194a651372b"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783110 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmnvq" event={"ID":"ebfbe878-1796-4a20-b3f0-76165038252e","Type":"ContainerDied","Data":"1edf9b703ee13d520466151dc6a14b9861ec98cd381dcaaddc281b34b9755005"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783123 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dmnvq" event={"ID":"ebfbe878-1796-4a20-b3f0-76165038252e","Type":"ContainerStarted","Data":"fb396b2885c697fc62cb75681d56dacee81e32f235fe9f427b2f065f721f39f2"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783134 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4p8p6" 
event={"ID":"8defe125-1529-4091-adff-e9d17a2b298f","Type":"ContainerStarted","Data":"82b30fbccb7238f44d13f70da028d33f5b6e416081362f085ebb4ebdcea4d599"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783145 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4p8p6" event={"ID":"8defe125-1529-4091-adff-e9d17a2b298f","Type":"ContainerDied","Data":"5fa7eec5b7c19299d0ce6c87bba94066e186df6e6ba2162f840498cde3a19934"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783157 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4p8p6" event={"ID":"8defe125-1529-4091-adff-e9d17a2b298f","Type":"ContainerDied","Data":"539fe177f9c1deb8d425356a84818b5c05d811c1ab77b966156d70120d25eef1"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783167 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4p8p6" event={"ID":"8defe125-1529-4091-adff-e9d17a2b298f","Type":"ContainerStarted","Data":"010fcb3fd705e5d750eedd1adb06872aa524c08fbc85d2a921261129ee9bc96b"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783195 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" event={"ID":"dbe144b5-3b78-4946-bbf9-b825b0e47b07","Type":"ContainerStarted","Data":"a8d28b5a11dbbe0ce40e81abee57abc61672afe8fdac498f35db8f445d2e2f79"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783208 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" event={"ID":"dbe144b5-3b78-4946-bbf9-b825b0e47b07","Type":"ContainerStarted","Data":"85646088c9e7224de8ae7078ab75c2d8ce77a9f7abbf5c5fe5944cd4c31dc3b4"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783219 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" event={"ID":"dbe144b5-3b78-4946-bbf9-b825b0e47b07","Type":"ContainerStarted","Data":"dd11d2ba2786d3d9f0ecdca93a7646fa05672ce1b1d99750eaff80844a557871"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783229 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" event={"ID":"dbe144b5-3b78-4946-bbf9-b825b0e47b07","Type":"ContainerDied","Data":"88bb2ee05e17ca0ccc95842f8e427991824283668dc77c62b2a389be9423149d"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783241 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" event={"ID":"dbe144b5-3b78-4946-bbf9-b825b0e47b07","Type":"ContainerDied","Data":"b36175e01241a922ef57ef9968701e5af5fa8f55a7287b6d3fe1828d9e78254f"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783252 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" event={"ID":"dbe144b5-3b78-4946-bbf9-b825b0e47b07","Type":"ContainerStarted","Data":"95fb5697edafbf4a316d98f995b9941dad32b61de9fdb2705dcb30f672d4ab5b"} Dec 05 12:50:04.783436 master-0 
kubenswrapper[29936]: I1205 12:50:04.783263 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"af196a48-6fcc-47d1-95ac-7c0acd63dd21","Type":"ContainerDied","Data":"7ece5c46267ba23e02ffd4ff25da31145731b6190d3fd68ce186c1aedbb31e5d"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783278 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"af196a48-6fcc-47d1-95ac-7c0acd63dd21","Type":"ContainerDied","Data":"64823040ff2341cdbb93a061017009c800e15a20e0d4a4f9a76225782f444caf"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783307 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64823040ff2341cdbb93a061017009c800e15a20e0d4a4f9a76225782f444caf" Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783332 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"96fa3513-5467-4b0f-a03d-9279d36317bd","Type":"ContainerDied","Data":"0a7d145dbed8d32146e90821257e92134c8804dafe8896f59ec88530e6ad0c4e"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783344 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"96fa3513-5467-4b0f-a03d-9279d36317bd","Type":"ContainerDied","Data":"89350a747cdc136c0874cbdedf75ff768f3aa173665ba78cb7204afab7285a1e"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783352 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89350a747cdc136c0874cbdedf75ff768f3aa173665ba78cb7204afab7285a1e" Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783363 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr" event={"ID":"909ed395-8ad3-4350-95e3-b4b19c682f92","Type":"ContainerStarted","Data":"4f714e5931c4c8cc3b7ed7099b22570199ed80bdc76778cd2533b86f1ae3c6e0"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783376 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr" event={"ID":"909ed395-8ad3-4350-95e3-b4b19c682f92","Type":"ContainerStarted","Data":"c5997a9e57f36847e6cb187afed936a398d9d89f0a3c5fbdaa0cdcf0b16bbffd"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783387 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6" event={"ID":"480c1f6e-0e13-49f9-bc4e-07350842f16c","Type":"ContainerStarted","Data":"c3d80b69dcfc87067aaae63f00809fa404e99554c3b19017580f5646450199ef"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783397 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6" event={"ID":"480c1f6e-0e13-49f9-bc4e-07350842f16c","Type":"ContainerStarted","Data":"e30c55a9fab66df10956ae03c408ba3a127fd7d10e3d72d2eb92d23500a928bc"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783407 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6" 
event={"ID":"480c1f6e-0e13-49f9-bc4e-07350842f16c","Type":"ContainerStarted","Data":"54d1c55b3ab43714c6f9d30fae64742364176327ec7be4503594ab7c679b2007"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783420 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"c2415969-33ad-418b-9df0-4a6c7bb279db","Type":"ContainerDied","Data":"dcccd3d0ecb79fd6fed3cade29b8b0d3ae9e791a686c5e6c4f661b34fd1efb10"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783432 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"c2415969-33ad-418b-9df0-4a6c7bb279db","Type":"ContainerDied","Data":"cff910884ebcba45ebf5c933f29645e420a54c688eec01b58f8fb05ec723cbe8"} Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783441 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cff910884ebcba45ebf5c933f29645e420a54c688eec01b58f8fb05ec723cbe8" Dec 05 12:50:04.783436 master-0 kubenswrapper[29936]: I1205 12:50:04.783451 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" event={"ID":"e943438b-1de8-435c-8a19-accd6a6292a4","Type":"ContainerStarted","Data":"d9a47a4e65ab5cf4baf6b36c8ce1ba7fd5756eae201f48950bc988deec039fe0"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783462 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" event={"ID":"e943438b-1de8-435c-8a19-accd6a6292a4","Type":"ContainerStarted","Data":"77da36c6bf5d09d68dbf2de017a655a5a15b25fda32cba3288a3d8b2cc4b44c0"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783472 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" event={"ID":"5efad170-c154-42ec-a7c0-b36a98d2bfcc","Type":"ContainerStarted","Data":"0b797ac3c4b54a3959f9e93f6e0af3ca69c035c47e6f5d5a251314015696c012"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783485 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" event={"ID":"5efad170-c154-42ec-a7c0-b36a98d2bfcc","Type":"ContainerDied","Data":"3507046fb798191fc9af19cde11e6d29feed57ad8ee65fd82dade1b688773700"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783497 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" event={"ID":"5efad170-c154-42ec-a7c0-b36a98d2bfcc","Type":"ContainerStarted","Data":"38012800baf13255ee676c8bd3688f9cc8eb6dcd0e296ee14ea80782e75670a8"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783509 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-m6pn4" event={"ID":"e7807b90-1059-4c0d-9224-a0d57a572bfc","Type":"ContainerDied","Data":"ded126662555b11ef5f6022975feef5471a12cb6870d5933adf38dcb51422cc7"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783521 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-m6pn4" event={"ID":"e7807b90-1059-4c0d-9224-a0d57a572bfc","Type":"ContainerDied","Data":"75e9032b6429595bd1a6f97d2cb17682f851991bfdb1cc650ef529d5407494ac"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: 
I1205 12:50:04.783530 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75e9032b6429595bd1a6f97d2cb17682f851991bfdb1cc650ef529d5407494ac" Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783540 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" event={"ID":"3d96c85a-fc88-46af-83d5-6c71ec6e2c23","Type":"ContainerStarted","Data":"daa09ad85f2f2082378a9c295067a8cfb84e2945b5becb78e60f9a9831fd768e"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783553 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" event={"ID":"3d96c85a-fc88-46af-83d5-6c71ec6e2c23","Type":"ContainerStarted","Data":"0b836f01dcb43b6af667ba219b4059e3935a66980e122a92a279a33e963cb964"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783563 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"21de9318-06b4-42ba-8791-6d22055a04f2","Type":"ContainerDied","Data":"a6eeacf32c540b469027d242ad82a84ffbe4f8b8381d45f48601d0197961c30d"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783575 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"21de9318-06b4-42ba-8791-6d22055a04f2","Type":"ContainerDied","Data":"6cb38a8f7e475b51ec4e82d15e81123c84bcaa6f22b937b869d4c561cbe1b95c"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783584 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cb38a8f7e475b51ec4e82d15e81123c84bcaa6f22b937b869d4c561cbe1b95c" Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783593 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" event={"ID":"db2e54b6-4879-40f4-9359-a8b0c31e76c2","Type":"ContainerStarted","Data":"cdb8d0aeedb7fe170e4e369cb7ea0bb66c8248a41e81c31debeed5037170ef86"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783602 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" event={"ID":"db2e54b6-4879-40f4-9359-a8b0c31e76c2","Type":"ContainerStarted","Data":"abe43915cc1089507c40de3eaceadf732ca7d07e2f0e1b5a070959328db4199f"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783611 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" event={"ID":"f119ffe4-16bd-49eb-916d-b18ba0d79b54","Type":"ContainerStarted","Data":"b907a1534b6517c416df2085c7f1d267c7cb079929611943c0ef4097c8c96c8d"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783621 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" event={"ID":"f119ffe4-16bd-49eb-916d-b18ba0d79b54","Type":"ContainerDied","Data":"1313a091281e52fd967237a2bce92f6479da56820a8f33315dc25623f650fa7b"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783630 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" 
event={"ID":"f119ffe4-16bd-49eb-916d-b18ba0d79b54","Type":"ContainerStarted","Data":"b36190e4cf6d5a6244899784eca2665872c2f9d60ae3d454ea48fd9aa2aa3bab"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783638 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" event={"ID":"1e6babfe-724a-4eab-bb3b-bc318bf57b70","Type":"ContainerStarted","Data":"c57ae702507ae32ef2cc00a4261c94cb1e11a39f67dcebd33d947544fc98f957"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783649 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" event={"ID":"1e6babfe-724a-4eab-bb3b-bc318bf57b70","Type":"ContainerDied","Data":"9059626ad4510705fe438e1803257849f89596beec2662512048f0044416af28"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783659 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" event={"ID":"1e6babfe-724a-4eab-bb3b-bc318bf57b70","Type":"ContainerStarted","Data":"9cdc542e09a2b9f60d00a132f1101f8d7a3bb737b3bcc4086c2409ef17b05c7e"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783667 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"9ca4e2fef100193f66453d974a3aeb277071826fa1a2d4b62430670342a3f96c"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783676 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"bc3cbb74a19a81cb8c8983a0be4b89fe52a747534fa8fdd8c9d37eaea6fe9abf"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783685 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"870198accbd17689e843fe910fb099fc9b006845184f99c6f27de26a4f229484"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783693 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"add76824970c5035363490e23193d9c3951f4e6416f932bfbf753686a3f1c73d"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783702 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"76860118877eff4c9fad8d4e5b56521e5266564bc07f30738a05fcf57a7826c0"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783710 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"884ee3f24f9e08747275b422db9832356d773976d0d14992046903c0c0db05d6"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783718 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"0ad0842396c73a410e419040cdd43cae0eebe3ffda0c30e321c4b5837d83dd29"} Dec 05 
12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783727 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"5c460027d91dbde84ccd52b0f0ca6fe1dc4b67dcc928471783b6fff7ae2ff897"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783735 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerDied","Data":"1e1ba9d3a2cd6fc3c76c6b40cc81f5a9fa8707214a43505b547185529870eae9"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783754 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" event={"ID":"4492c55f-701b-4ec8-ada1-0a5dc126d405","Type":"ContainerStarted","Data":"ed1704f4a6522faa5c439c3ffd85686d7bb1d3595d2d60cd653dbed071367134"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783763 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" event={"ID":"7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f","Type":"ContainerStarted","Data":"6239eae44fcba24c87ffe8091386b03d73857f84777d4bb210653bc8e77e499d"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783773 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" event={"ID":"7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f","Type":"ContainerStarted","Data":"1b430712d22ac161924beaf5505ca8d2172d739daf63d6df7781a1aff1c1828b"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783783 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" event={"ID":"7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f","Type":"ContainerStarted","Data":"04a1540e033fc0d53be3a8dfa10cb49b28b11738b911cb185f8d919660d6db47"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783792 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerStarted","Data":"f9824f2538239be2916d2115cdd6e15355f5d12571e5c02316bdba7857f30ff8"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783802 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerDied","Data":"1071666e1f17d7cf23c890a67d65947ba5ea19368b6f26e80669ec0f695e375b"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783846 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerDied","Data":"419ef08e96de1310d58b89d9dc91be12123ef06b7cf2f7b293589e349077e04c"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783856 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"3169f44496ed8a28c6d6a15511ab0eec","Type":"ContainerStarted","Data":"5a89fdcb31a57b509eb73373840f305ff5d3039dc4adac822b9b40350179af76"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 
12:50:04.783865 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" event={"ID":"3b741029-0eb5-409b-b7f1-95e8385dc400","Type":"ContainerStarted","Data":"46664a4ee70d50343e807b0abdcc4556bf4a4ccc60c19f6748f2ddf921020853"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783875 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" event={"ID":"3b741029-0eb5-409b-b7f1-95e8385dc400","Type":"ContainerDied","Data":"712a042e7fad33cd815c939ca364362f7220e1f1ce6096f34de7bd5630509fb8"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783885 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" event={"ID":"3b741029-0eb5-409b-b7f1-95e8385dc400","Type":"ContainerStarted","Data":"48416d2553549ef2df4e4b21da938432c85035a334034a6b191574d20869a9df"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783895 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" event={"ID":"3b741029-0eb5-409b-b7f1-95e8385dc400","Type":"ContainerStarted","Data":"029b733e2c6ad9f0e336ec7c4af189bd8388fe0d1d5f30c3280c2f24f4c1e475"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783904 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerStarted","Data":"2c505d1745e5c41c810aeede53577e7297a75c5a2221af8e371f406e5004dcbf"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783914 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerStarted","Data":"ba110a7b76ad288df7047b8cf5908c2bd3487d9f6a715466f139c0f2eb3f27da"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783922 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerStarted","Data":"b24c1b8d78045ff86297a6b78ba71b900f89c5e046061babf21a495bd9bf95d3"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783933 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerDied","Data":"ce5bd605cc76993bca2c497ff38423a9bcba04863edec782efc7ee32483a630a"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783943 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"2cb8c983acca0c27a191b3f720d4b1e0","Type":"ContainerStarted","Data":"b0475df4d5336da05f2cdbc3f74e49ad376be174c9b01bb8c74b713bd60e7ac6"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783953 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" event={"ID":"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7","Type":"ContainerDied","Data":"a62572546062b2df435bc85f27bda94544b75d65580e59f21beaef134a43b821"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783964 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" event={"ID":"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7","Type":"ContainerStarted","Data":"8a2315b172a2f4696d36566ac0967bac2a393e7df33c410eb47c73827f2cb352"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783973 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" event={"ID":"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7","Type":"ContainerStarted","Data":"c44264ca51ad61ed3b05ffa4c975691fd7debf64dbafd9a640308d225a077e0b"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783983 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4s89l" event={"ID":"dc5db54b-094f-4c36-a0ad-042e9fc2b61d","Type":"ContainerStarted","Data":"2f534d2b0f28fdc73bcde7620f0608943fcec70ee43db7154e751cbea94562d5"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.783993 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-4s89l" event={"ID":"dc5db54b-094f-4c36-a0ad-042e9fc2b61d","Type":"ContainerStarted","Data":"164d69c4a697b3689889d3ab2e5a66ca6c9ed1089292b441ab9282cdde612925"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784002 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wfk7f" event={"ID":"9c31f89c-b01b-4853-a901-bccc25441a46","Type":"ContainerStarted","Data":"3d5a43caa0556605bbe96d059d3c904315c064da61dfc71414acf314ee5b2814"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784013 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wfk7f" event={"ID":"9c31f89c-b01b-4853-a901-bccc25441a46","Type":"ContainerDied","Data":"1f82b253446479fa5b79026be8aaeda5c0a2e403ecedf5e8aa0aa49e59d88903"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784025 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wfk7f" event={"ID":"9c31f89c-b01b-4853-a901-bccc25441a46","Type":"ContainerDied","Data":"ad4e3fece245cbae80f7af3bfbb0484b4d1681aae90d16b1b170f5d8af892edc"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784034 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wfk7f" event={"ID":"9c31f89c-b01b-4853-a901-bccc25441a46","Type":"ContainerStarted","Data":"46252d0271f63a839c2cf8d137d190a08ca8c85ab8a7cd49fe478dd080504839"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784044 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv" event={"ID":"49760d62-02e5-4882-b47f-663102b04946","Type":"ContainerStarted","Data":"987d6983e55310f76f89331773ed3d708557e669dfdedbfdd605e1afe8d494c4"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784053 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv" event={"ID":"49760d62-02e5-4882-b47f-663102b04946","Type":"ContainerStarted","Data":"6d7e84b5ce96cc743bb3392588c9efdf14f4afe467d9a7be36705ddbb090197e"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784063 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" event={"ID":"807d9093-aa67-4840-b5be-7f3abcc1beed","Type":"ContainerStarted","Data":"96432bd98ca024e492fc580cbc73eb38cd510787da2af19671c5dce6d570c07d"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784077 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" event={"ID":"807d9093-aa67-4840-b5be-7f3abcc1beed","Type":"ContainerDied","Data":"57b070ec273f7f37c3345984984d73c010928b9bb1f5746c1d4e18feee8dc2dc"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784089 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" event={"ID":"807d9093-aa67-4840-b5be-7f3abcc1beed","Type":"ContainerStarted","Data":"47731386c0cb9aab3894731b6143775966f36286ae6b54927bb926129b389c33"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784114 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" event={"ID":"0dda6d9b-cb3a-413a-85af-ef08f15ea42e","Type":"ContainerStarted","Data":"b34007475640228397f904792caa70766119deff9ff9ac4f7b367d7746a11f97"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784127 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" event={"ID":"0dda6d9b-cb3a-413a-85af-ef08f15ea42e","Type":"ContainerDied","Data":"494ce5c3826b0b94b974fd41d16b7ba6517fb0d007e27462a2d7b33e01aa4443"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784141 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" event={"ID":"0dda6d9b-cb3a-413a-85af-ef08f15ea42e","Type":"ContainerStarted","Data":"3de4ddaa09ada567848564877e7c542bbe9c6a292970b0f8cf886f5ba9fa75db"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784151 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" event={"ID":"0dda6d9b-cb3a-413a-85af-ef08f15ea42e","Type":"ContainerStarted","Data":"9a083a2de33da77d47cd60a3708aaf6bb8591ce81eba8d8e42788e2c8c58ecd3"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784162 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" event={"ID":"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb","Type":"ContainerStarted","Data":"d020dc1da875fa7050b8625e3ab5b871982f94b26fe855432e6788c518f5cf79"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784174 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" event={"ID":"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb","Type":"ContainerStarted","Data":"4ed24c6b6f900a1eeba45b567c2d9336f6c8e081eea3b175ce81e0e583f37f25"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784204 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" event={"ID":"d3e283fe-a474-4f83-ad66-62971945060a","Type":"ContainerStarted","Data":"515973c663cd824493ca1b981576a0161a6d2ecb1bc5aa4db6d64554c07e31d5"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 
12:50:04.784215 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" event={"ID":"d3e283fe-a474-4f83-ad66-62971945060a","Type":"ContainerStarted","Data":"7ad783bf372e01adf00c6e45c58b553edb8704c6a0612bb491f7869b46f9b52d"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784225 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" event={"ID":"d3e283fe-a474-4f83-ad66-62971945060a","Type":"ContainerStarted","Data":"7306701b7f1e349175a899928ef136fbd77aaa68bd4675a9b0f16eeeda9ca379"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784237 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-99djw" event={"ID":"fb7003a6-4341-49eb-bec3-76ba8610fa12","Type":"ContainerStarted","Data":"f7e070e3835422f37986b17613bb2a923a628ccc634c0641f7b2911fd3c07111"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784248 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-99djw" event={"ID":"fb7003a6-4341-49eb-bec3-76ba8610fa12","Type":"ContainerStarted","Data":"e148c0e6308743ecf579bef0b88df088d99461b29256ac158a317b333df0b195"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784260 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-99djw" event={"ID":"fb7003a6-4341-49eb-bec3-76ba8610fa12","Type":"ContainerStarted","Data":"e67f95f822c645d6f2dd2098e7e055983609569dd0acfdc0e0bea037bf8d6c03"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784272 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"076dafdf-a5d2-4e2d-9c38-6932910f7327","Type":"ContainerDied","Data":"f8dc47e77bee6411ef3a450c0123b8279b91a4729700211ae01112ac79fa1d1e"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784285 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"076dafdf-a5d2-4e2d-9c38-6932910f7327","Type":"ContainerDied","Data":"26722ad2bd6e7ca8bda35211d0d46cd57e0c0ba5a29870576dae6f8264697434"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784295 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26722ad2bd6e7ca8bda35211d0d46cd57e0c0ba5a29870576dae6f8264697434" Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784306 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rzl84" event={"ID":"ce9e2a6b-8ce7-477c-8bc7-24033243eabe","Type":"ContainerStarted","Data":"193b5b7aa7464f9332f9efd8e29d1c5efa1e26b3892e37be477fd5522ff1eff9"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784317 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rzl84" event={"ID":"ce9e2a6b-8ce7-477c-8bc7-24033243eabe","Type":"ContainerStarted","Data":"679937dc83b97301577d3c65750d4ebf2b527dc3eb9e1329443173e8480258f9"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784327 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rzl84" event={"ID":"ce9e2a6b-8ce7-477c-8bc7-24033243eabe","Type":"ContainerStarted","Data":"19edfec7b5dad95038c7d84a7af049f95270320317e900ea90d94c12477f0556"} Dec 05 12:50:04.784471 master-0 
kubenswrapper[29936]: I1205 12:50:04.784338 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" event={"ID":"531b8927-92db-4e9d-9a0a-12ff948cdaad","Type":"ContainerStarted","Data":"a1e1f964f61db578543e8bda36d3c26eb06dbcb3659a952a96708304cb1ba2a9"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784350 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" event={"ID":"531b8927-92db-4e9d-9a0a-12ff948cdaad","Type":"ContainerDied","Data":"5227d615ebfc1e16e53996d356380f47ad9e8fd55349d0658112ccb54f8ef1bb"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784362 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" event={"ID":"531b8927-92db-4e9d-9a0a-12ff948cdaad","Type":"ContainerStarted","Data":"9ca3179bcac9021f22c3e7255b372820926d29356fd67cac276625618bd240a6"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784374 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nwplt" event={"ID":"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6","Type":"ContainerStarted","Data":"00da80bef6b48eb1abc64fad064f00d41d97860ac5c2f760a1238efd21d8e70d"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784386 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nwplt" event={"ID":"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6","Type":"ContainerStarted","Data":"3a9d8373a41ae93e2045d1c0300d43339b0c915de4cad9048741918269853b51"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784399 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-85d8db45d4-kkllg" event={"ID":"e2e2d968-9946-4711-aaf0-3e3a03bff415","Type":"ContainerStarted","Data":"de8c9b1dc7ded42717fa7579b2074dc4f99da101c43fbc90332e93b93966800c"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784411 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-85d8db45d4-kkllg" event={"ID":"e2e2d968-9946-4711-aaf0-3e3a03bff415","Type":"ContainerStarted","Data":"130205999d123cc10c914ecc3cb22cde267becfbe33db09ccb0559c952bdc40f"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784423 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" event={"ID":"f2635f9f-219b-4d03-b5b3-496c0c836fae","Type":"ContainerStarted","Data":"9af8ab651bd63e8bc68f978bbf5aebe3be6cba36632826679028614cf841f7a7"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784435 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" event={"ID":"f2635f9f-219b-4d03-b5b3-496c0c836fae","Type":"ContainerStarted","Data":"b3ecec2aa414e2dc966b3af1e3db3667edb0ce30dd8be08c7dc1e26871633e6e"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784443 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" event={"ID":"f3792522-fec6-4022-90ac-0b8467fcd625","Type":"ContainerStarted","Data":"273fc7466339fc71dbff783d03d786641f9cff2c7e10ab401acbc6c674705b52"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 
12:50:04.784454 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" event={"ID":"f3792522-fec6-4022-90ac-0b8467fcd625","Type":"ContainerDied","Data":"e23a95f64400e0dbc7fc95b5f04dddc2d0290c35a1bcdf186ddbd3cd8314a14f"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784463 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" event={"ID":"f3792522-fec6-4022-90ac-0b8467fcd625","Type":"ContainerStarted","Data":"49f2f301b501743d7a4254bc3eeb040151fb199e2a4d9ec64ddce3a74ce66f5b"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784475 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prt97" event={"ID":"708bf629-9949-4b79-a88a-c73ba033475b","Type":"ContainerStarted","Data":"1c3530626c917433ac22bbadd19205f000313560085b5540423d4847a8993705"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784488 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prt97" event={"ID":"708bf629-9949-4b79-a88a-c73ba033475b","Type":"ContainerDied","Data":"48cc412fc0495a9b989b3163afe32a67e585bd82e370a59d4690f30fe1abc9dc"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784504 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prt97" event={"ID":"708bf629-9949-4b79-a88a-c73ba033475b","Type":"ContainerDied","Data":"d98d05970b7b2ac04c6af16edb9c07e4ea790e687fa82b42828f83752f9655a5"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784518 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prt97" event={"ID":"708bf629-9949-4b79-a88a-c73ba033475b","Type":"ContainerDied","Data":"678a3e3b29045fc802f2f4ea9939ca067adfe6ff12b24bb2dd5f895390e55a41"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784528 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prt97" event={"ID":"708bf629-9949-4b79-a88a-c73ba033475b","Type":"ContainerDied","Data":"46d777da61d52678086a53c15e814977a05f1e509e1945fa53a5e65cac047f51"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784537 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prt97" event={"ID":"708bf629-9949-4b79-a88a-c73ba033475b","Type":"ContainerDied","Data":"d81a6813a03e38c556e737371d737471f12aa2c77281926715e2cfe7ffc056aa"} Dec 05 12:50:04.784471 master-0 kubenswrapper[29936]: I1205 12:50:04.784545 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prt97" event={"ID":"708bf629-9949-4b79-a88a-c73ba033475b","Type":"ContainerDied","Data":"503a0b99be77d72f51d7afcf8403bc7d040b77fef62f126cd910c2ff4b520892"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784554 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-prt97" event={"ID":"708bf629-9949-4b79-a88a-c73ba033475b","Type":"ContainerStarted","Data":"b852dfb0ed7374453aa61f11c0df40cc142ce70b6943ce06b264cc249753a13b"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784565 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" event={"ID":"cf8247a1-703a-46b3-9a33-25a73b27ab99","Type":"ContainerStarted","Data":"65805ce826a6880e17ce2c571cd39f060976d0a8a6ae89fcede7232cc66bff52"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784575 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" event={"ID":"cf8247a1-703a-46b3-9a33-25a73b27ab99","Type":"ContainerDied","Data":"48561b92390271bf5bcb9ad8430184be011980a59e84d647901af23e4a1dea25"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784584 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" event={"ID":"cf8247a1-703a-46b3-9a33-25a73b27ab99","Type":"ContainerStarted","Data":"9f1e76d4f58fcd22a9b3bb1871e5fda992687c0e5181ed09e4aadbd1b7953465"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784593 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" event={"ID":"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9","Type":"ContainerStarted","Data":"6b78f4686886eb46c40366678ccd87c7785bc499aa4eabf81ddb13759dd9ebc7"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784603 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" event={"ID":"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9","Type":"ContainerDied","Data":"49b30bfe4873642e053b957d836ec7eddcac24e11ada8325b8fc8b72bfefafe8"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784617 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" event={"ID":"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9","Type":"ContainerStarted","Data":"df3031001bb8ce6924d98db7ed12f84815ddd5de33ab7d2a19bcefd503d510dd"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784629 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" event={"ID":"5f0c6889-0739-48a3-99cd-6db9d1f83242","Type":"ContainerStarted","Data":"d324f3be47b40d64f2eb275a06a3da375cc2d17a721be2f87def91dc6fec78c2"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784641 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" event={"ID":"5f0c6889-0739-48a3-99cd-6db9d1f83242","Type":"ContainerStarted","Data":"cac6f03a0427fe3f821f5cb9684613bbc6f78a43198e6a2ef1b43d626c97b8ba"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784653 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" event={"ID":"5f0c6889-0739-48a3-99cd-6db9d1f83242","Type":"ContainerStarted","Data":"92eddccae7e06f02f48401d5d5f367dae7b9b78b2bbc84b00b68ec03e90321c1"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784664 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" event={"ID":"1478a21e-b6ac-46fb-ad01-805ac71f0a79","Type":"ContainerStarted","Data":"784eea1d9740e9545a6ad492de89955c83abee11f7626ae20b597773d711dc88"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784676 29936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" event={"ID":"1478a21e-b6ac-46fb-ad01-805ac71f0a79","Type":"ContainerStarted","Data":"d3609111ab16a5bdb6bc2e1385cc4f52a698e20f663f1587bad8d27d757fd6be"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784689 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" event={"ID":"1478a21e-b6ac-46fb-ad01-805ac71f0a79","Type":"ContainerStarted","Data":"6a7ef281a34ccfa6602eba73eaa73316754fbb0bb6c1935a7c44c597fce5d077"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784699 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" event={"ID":"665c4362-e2e5-4f96-92c0-1746c63c7422","Type":"ContainerStarted","Data":"14e043dc9f8b3470df421fe84c1bd6ed6c94ede3d95b8d74893ae012e041f04e"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784708 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" event={"ID":"665c4362-e2e5-4f96-92c0-1746c63c7422","Type":"ContainerStarted","Data":"3db5e7135e78833b3a92c45746a15fb15863c2f0a43a694b41b9752901ee6fff"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784717 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" event={"ID":"665c4362-e2e5-4f96-92c0-1746c63c7422","Type":"ContainerStarted","Data":"24b6227b14f227965d3702a28c5ff0f7f572415f72495d988769ab39d10c0d8f"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784728 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"34c662fa07de0c08cafe82dd42ae1e0359fa3bbfb03c3cbf9cbec7bb72517328"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784740 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"b3860dcf136009c692df523a643ebd872d14983cf881ae6cecf3b72bd4c343db"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784751 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"835dec2af52fd7ec20588c9018988cf86c305d1989f96ea1549008fdb5e04109"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784761 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"f10cf9fb0238be4d34d1001638012d731272864867405100db90e54fd1b0489b"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784771 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"bca56bcd2d866b305199c7dd4a2eee615bb7722c74c3f021e5b1413c58454e2d"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784782 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerDied","Data":"2a8e7f38b128627544fcfe08f2d0eef9ae364770a9037f3dac3761d553a8ed98"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784794 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerDied","Data":"073375b200bc70b30fb0cad0a5ecce97a68446c026019d9c52074056ad94e0a7"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784804 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerDied","Data":"7fe2ad5243db75a4d0831218b0b4d047af3794e202e2009112af905d4919bd2b"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784815 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"58d12e893528ad53a994f10901a644ea","Type":"ContainerStarted","Data":"a484ee5e7b41d00e01ba54d4ad8789422ba018cb058ac26feb10517be87018de"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784825 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" event={"ID":"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb","Type":"ContainerStarted","Data":"47993b0f5c02b8432a4bbdcf73db57ba7e46c6e4e750f5d8d873140e16f0fa9e"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784845 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" event={"ID":"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb","Type":"ContainerStarted","Data":"943143ef3973188af4783dcf40be99b719a3294d28a086cd4ae91e7bc36161f4"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784854 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" event={"ID":"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb","Type":"ContainerDied","Data":"c9cbf8e5df58cf6c6aff3967b76368b2b683cdb47115f76abdee2db7c46ae76b"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784865 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" event={"ID":"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb","Type":"ContainerStarted","Data":"04f451fea9668a794e9e554df0005ce70f405943bf1c6d084959d7f333152fc6"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784874 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"d627fcf3-2a80-4739-add9-e21ad4efc6eb","Type":"ContainerDied","Data":"8654b600b7307ea1bcd3fe84275fb56084c5722cbe5ccf524025cea2bfa3d8cd"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784884 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"d627fcf3-2a80-4739-add9-e21ad4efc6eb","Type":"ContainerDied","Data":"f05e889af7263b1ba07fc81648bcfe8b4d672794681d2558bab4fed4dcbd28ca"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784893 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f05e889af7263b1ba07fc81648bcfe8b4d672794681d2558bab4fed4dcbd28ca" Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784901 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" 
event={"ID":"c39d2089-d3bf-4556-b6ef-c362a08c21a2","Type":"ContainerStarted","Data":"712d1506cd51ade164ab750d49a330bfc3046901061c8f5155346b2e5325a1c2"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784912 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" event={"ID":"c39d2089-d3bf-4556-b6ef-c362a08c21a2","Type":"ContainerDied","Data":"fa4f02496398ccdc5c55acbb60e75e3c69d9850820e087e65cbe9d00bf63d07e"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784923 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" event={"ID":"c39d2089-d3bf-4556-b6ef-c362a08c21a2","Type":"ContainerStarted","Data":"ae3644549c6caccb0e5b76cf093dd16f97c66829b7bc2c724be0d4328e24c56e"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784933 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" event={"ID":"20a72c8b-0f12-446b-8a42-53d98864c8f8","Type":"ContainerDied","Data":"b7483a678d691fbf8a3207dd7d6ed1c739a3647a4a630049897502326cc17230"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784946 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" event={"ID":"20a72c8b-0f12-446b-8a42-53d98864c8f8","Type":"ContainerDied","Data":"eb64e2421ed896450777b0f0f93d8bac59a879b9f30c7599b0e2a7c59b1f3be8"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784956 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" event={"ID":"20a72c8b-0f12-446b-8a42-53d98864c8f8","Type":"ContainerStarted","Data":"c13f876ed14f7005d250ab3203aedc5ac3d9bddbbff7570300b321a40f59bd5c"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784964 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" event={"ID":"a5338041-f213-46ef-9d81-248567ba958d","Type":"ContainerStarted","Data":"8f4c10e53fa9bdea151c26cb8da907a4175dbcda2ac105b3ac1ba5c0a0254853"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784974 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" event={"ID":"a5338041-f213-46ef-9d81-248567ba958d","Type":"ContainerStarted","Data":"1234ab8fb98aae2372aaa8236a21f36a20e417c28feeae32f634a7022c473171"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784983 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z2nmc" event={"ID":"60327040-f782-4cda-a32d-52a4f183073c","Type":"ContainerStarted","Data":"cdeb1ea490ea5701b2a95b0146cfc27c895466411f8fb26720209d1edb7876cb"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.784993 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z2nmc" event={"ID":"60327040-f782-4cda-a32d-52a4f183073c","Type":"ContainerStarted","Data":"cfb51eb4003c8f464c1a8f16b5c32a07dfa0f9b4a935f9263c448d3754ceed40"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785002 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z2nmc" event={"ID":"60327040-f782-4cda-a32d-52a4f183073c","Type":"ContainerDied","Data":"5021d0ebd02a2ebd7ed1f4a980629b114fcca13491901c53a97391580abdd083"} Dec 05 
12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785013 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z2nmc" event={"ID":"60327040-f782-4cda-a32d-52a4f183073c","Type":"ContainerStarted","Data":"fa27a4561538d102c835ff1b231e3510011f63fe691f54410ca3547822dc8742"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785023 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" event={"ID":"a280c582-685e-47ac-bf6b-248aa0c129a9","Type":"ContainerStarted","Data":"fb030ad34b9342fc42a80c2fdf5d7deaefdc07aa0ffbb47e24246b631e76fcfa"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785033 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" event={"ID":"a280c582-685e-47ac-bf6b-248aa0c129a9","Type":"ContainerStarted","Data":"4e53d72cb8b1cdc5f2650e124f1a5eb3f2376bad125be7582d7eaee220557d0e"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785043 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" event={"ID":"a280c582-685e-47ac-bf6b-248aa0c129a9","Type":"ContainerDied","Data":"502462f2915d6fff82c1a557ec2a9e24c7fbeef3a6daff0dad2cf5862df79899"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785053 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" event={"ID":"a280c582-685e-47ac-bf6b-248aa0c129a9","Type":"ContainerStarted","Data":"5404e1e33c358f139ce43aadf9014fd74254490d058389642b99e6aa71216243"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785063 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"4d215811-6210-4ec2-8356-f1533dc43f65","Type":"ContainerDied","Data":"419f6f30a7830337f1a96ed401ad15741b6815b1dc5b3d9cd59d5f9c8beb4aa8"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785073 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"4d215811-6210-4ec2-8356-f1533dc43f65","Type":"ContainerStarted","Data":"a8ddc41afaf0c618d55e894f2ce13b792424c9105a66a883a048089812798f25"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785083 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"cb45ac962cb69e933331ee0c856318dac5cab172e5a67fae300400790243fa83"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785093 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"9fe4c6502fdc4a5ad38e4d8943f30fb0dc815742902f56611365ebade961c543"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785103 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"b4c300d20c451ceb48a4dd631fddc00299b2cc310864a10d077327af520fb571"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785113 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"bb7873e2599e8ac76df6ef4a55a0c5149a92c7a11857e0aa4c586472148fc658"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785123 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"e6673b2b755060cf16e87e6c6406cf444f3e60221e1299617149fc286f9cbbb4"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785132 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerDied","Data":"1bbd4f368bad5edbbd435da376ff1fe1a1eb948351d43f8a86c24d7830ed7a2a"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785144 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"b89698aa356a3bc32694e2b098f9a900","Type":"ContainerStarted","Data":"915ebf41ba29fa9d0d989a762295214f873dc379b870a92fba77c1d033c014f1"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785152 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" event={"ID":"d72b2b71-27b2-4aff-bf69-7054a9556318","Type":"ContainerStarted","Data":"072364fce4151260cecc71f6271a4001a02ac2658d21c4a5606f1cd07f40e995"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785162 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" event={"ID":"d72b2b71-27b2-4aff-bf69-7054a9556318","Type":"ContainerDied","Data":"bbc65050e19c8e05efbd98764627a92089e068c4fefa760a423cea0a25acab48"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785171 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" event={"ID":"d72b2b71-27b2-4aff-bf69-7054a9556318","Type":"ContainerDied","Data":"836113a149a4eefb4c2ce8d65a7d2c1b43cd3294cab879526b98ff307bc6e81d"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785197 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" event={"ID":"d72b2b71-27b2-4aff-bf69-7054a9556318","Type":"ContainerStarted","Data":"9abf289d98169b2aa959495298e72df522e02a710723a8c85b99355af8b7eae3"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785207 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-xwx26" event={"ID":"b8233dad-bd19-4842-a4d5-cfa84f1feb83","Type":"ContainerStarted","Data":"247a3b2c777f8fe2f346367a22bb39c214fdb1a922b3b827dcfce8dd159f9390"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785221 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-xwx26" event={"ID":"b8233dad-bd19-4842-a4d5-cfa84f1feb83","Type":"ContainerDied","Data":"efd0af11329fc9886861d20bcf790f4afa476fb62b8a37aabf75eec470dca7ba"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785231 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-xwx26" 
event={"ID":"b8233dad-bd19-4842-a4d5-cfa84f1feb83","Type":"ContainerStarted","Data":"30c5b5c630bd02b5b3e82dbf4596b8a0300a9a9b3ba466ae6fca11dbd31d9aeb"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785241 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-xwx26" event={"ID":"b8233dad-bd19-4842-a4d5-cfa84f1feb83","Type":"ContainerStarted","Data":"ee402b16b01951f980b833d7daf2d0304b91018363304b2cfe0e79874029cf9d"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785251 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" event={"ID":"58187662-b502-4d90-95ce-2aa91a81d256","Type":"ContainerStarted","Data":"7c715e090bdbc7252a3de31126638fe765c309cab209969215dce8cf6f422ab7"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785264 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" event={"ID":"58187662-b502-4d90-95ce-2aa91a81d256","Type":"ContainerDied","Data":"d0c256d51be6b67ce11d61e05ebacd6a747bab028d852541d977f5d77734ba1a"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785275 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" event={"ID":"58187662-b502-4d90-95ce-2aa91a81d256","Type":"ContainerStarted","Data":"0b9e8ef8efad8c6e16cd6e6a39269d9f5b02a38a45cb5b422afaa90713381fcb"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785287 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" event={"ID":"e5dfcb1e-1231-4f07-8c21-748965718729","Type":"ContainerStarted","Data":"2b11a7092987ce9dc3415de6986fd3fb9e8cd98ab5789b4c5b5b61519d70650e"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785297 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" event={"ID":"e5dfcb1e-1231-4f07-8c21-748965718729","Type":"ContainerStarted","Data":"0881763cdee0ccdba8e5778bd81b5f22280f808126ce0c207bab6ce207f27343"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785305 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" event={"ID":"e5dfcb1e-1231-4f07-8c21-748965718729","Type":"ContainerStarted","Data":"12c707b6a686095bb6b918fa64b447ec88e080a7e32878fed57fd6822470f9a2"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785315 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" event={"ID":"1871a9d6-6369-4d08-816f-9c6310b61ddf","Type":"ContainerStarted","Data":"e133f763b4090ab8deffe912e58f36acd0db95abe046782abfe8e5f290664368"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785325 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" event={"ID":"1871a9d6-6369-4d08-816f-9c6310b61ddf","Type":"ContainerDied","Data":"6017aa71f5b63d7cdd32da77565edb00ec8b8b5e5059d49e4246bd4f05d6b50b"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785334 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" event={"ID":"1871a9d6-6369-4d08-816f-9c6310b61ddf","Type":"ContainerStarted","Data":"2a325da0f7b2c285fc4bf3a467e693950dfc8948d49a5740a004f6101e748cc4"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785344 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" event={"ID":"594aaded-5615-4bed-87ee-6173059a73be","Type":"ContainerStarted","Data":"fc976579761cd166f544b17e7e21a078085d48a1844a3caee0473f2393e3d972"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785353 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" event={"ID":"594aaded-5615-4bed-87ee-6173059a73be","Type":"ContainerDied","Data":"7a5d71f74727c7976f8b5ae99bcc9e973922d3b63fcffbddb6ae9dd46ae8aa22"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785362 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" event={"ID":"594aaded-5615-4bed-87ee-6173059a73be","Type":"ContainerStarted","Data":"44e741be030df14b7e9e415d32f4095c562d693609b8dc4bd8ec51c21503bbca"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785372 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"4957e218-f580-41a9-866a-fd4f92a3c007","Type":"ContainerDied","Data":"eed2e77d9f832d089463e6b1b5c8775d3273e95a2de91d82d1ec20f52035753f"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785403 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"4957e218-f580-41a9-866a-fd4f92a3c007","Type":"ContainerDied","Data":"a533643c1db6301ec0bbc4a1c931a3217a3a4fe2dc4e69292c8e2163d1c11951"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785412 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a533643c1db6301ec0bbc4a1c931a3217a3a4fe2dc4e69292c8e2163d1c11951" Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785420 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" event={"ID":"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76","Type":"ContainerStarted","Data":"26d489dc1c2c45db44d32cda974f33865505518c1962dcd743305c17c6b0f5bb"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785430 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" event={"ID":"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76","Type":"ContainerStarted","Data":"3bd2ebecae58df5657c5f3c9fc768f7de5c16550901b835bef03d24d93582761"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785440 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" event={"ID":"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76","Type":"ContainerStarted","Data":"6e36fea081a65f76f7b44518c2dc8f9952033f7a8d733e7f0dc464daba9c2867"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785449 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" event={"ID":"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76","Type":"ContainerStarted","Data":"86aa525c2c153f5cbd8c5b3603c3c0fdcde107672a7bd7aeacc117267683bb33"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785457 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" event={"ID":"7d0792bf-e2da-4ee7-91fe-032299cea42f","Type":"ContainerStarted","Data":"b0b6c6e5845f21451ae31807803c6c6a8522e211f03654dc5026b22ef249bb34"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785468 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" event={"ID":"7d0792bf-e2da-4ee7-91fe-032299cea42f","Type":"ContainerStarted","Data":"62b006cd51c7d10f8e6f8e36ec2fbd7c2b472a5db5854f2056fdbe13f97f07e2"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785481 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" event={"ID":"d53a4886-db25-43a1-825a-66a9a9a58590","Type":"ContainerStarted","Data":"7300fdd0ccd012b07cc22015385845a110863d45bf0c343844c7aeba0c0cd40b"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785494 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" event={"ID":"d53a4886-db25-43a1-825a-66a9a9a58590","Type":"ContainerDied","Data":"d299754c006efdd8044d5689f796048069701078c77f429aafcfe9eafc6522ec"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785507 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" event={"ID":"d53a4886-db25-43a1-825a-66a9a9a58590","Type":"ContainerStarted","Data":"bba3aa271baddd92ed5881d6af79fb82b3a45fce07083a5cd051cbeeb1a01428"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785517 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" event={"ID":"a2acba71-b9dc-4b85-be35-c995b8be2f19","Type":"ContainerStarted","Data":"0e7509a9d39d3092ec9ec8e1b908f1fa3448275694e027b1ba9f70fa93878312"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785526 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" event={"ID":"a2acba71-b9dc-4b85-be35-c995b8be2f19","Type":"ContainerDied","Data":"0ef8d356dac19c1922c065326d7809108046e5a2cd059d5d50b5229acd7007ec"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785535 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" event={"ID":"a2acba71-b9dc-4b85-be35-c995b8be2f19","Type":"ContainerStarted","Data":"743ece8bb6e404056a2fb9957949cb0a30330d99bb6dbc633553c08d0fb45759"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785545 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5863ee594cde86aeedce8416be9b249f569b2f49267eb70875c7f8a2e451e4e" Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785553 29936 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-diagnostics/network-check-target-qsggt" event={"ID":"f4a70855-80b5-4d6a-bed1-b42364940de0","Type":"ContainerStarted","Data":"91f93abed058375a2f9d971d7119339c27c4857eb8ea956d8ecc7aeb14fabe54"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785561 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qsggt" event={"ID":"f4a70855-80b5-4d6a-bed1-b42364940de0","Type":"ContainerStarted","Data":"c9238078b14a694c40b63db5c3f18b28faafcb8ecbd14ef862a7acac34f2ffa6"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785571 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" event={"ID":"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e","Type":"ContainerStarted","Data":"95843f9193f5780326180ee5d96855091da4ac76a7b08a6ce8f5f391baac0caf"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785583 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" event={"ID":"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e","Type":"ContainerDied","Data":"45f66b524a7ae3f72102f1c1f147264a8f0120c9900c09db2778e03215486e8a"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785593 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" event={"ID":"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e","Type":"ContainerStarted","Data":"9fd6db41eb8dc90e6efffc25bb3c93739722e6824dad0dcb9a786720bc6514c4"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785601 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a906debd0c35952850935aee2d607cce","Type":"ContainerStarted","Data":"4d7c7fd9f6be698bd81fc9eb6c8b4d1eab76e44ec95ef9874a47a2596768ed58"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785611 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a906debd0c35952850935aee2d607cce","Type":"ContainerStarted","Data":"02429253d825f84b6e8d3688d755956761f60c5751e2535a15ebb536be6b1f94"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785621 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" event={"ID":"365bf663-fd5b-44df-a327-0438995c015d","Type":"ContainerStarted","Data":"2ddf5b913c8def77b7ff031d3fc7b9bf753cce46d08c9770d77762f9cc280fa4"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785632 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" event={"ID":"365bf663-fd5b-44df-a327-0438995c015d","Type":"ContainerStarted","Data":"667cd6e2494d3e418da699cd959c521ed7b9fd7b51dbacbf2b69ac4e7e52a0ee"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785642 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" event={"ID":"365bf663-fd5b-44df-a327-0438995c015d","Type":"ContainerStarted","Data":"1e7e859b537def1a21239198a62664ddf26773c1c6f156f411606722ed8cb4e6"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785650 29936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"ab1992e269496bc39c1df6084e6e60fd","Type":"ContainerStarted","Data":"0a16bc5dbf4947d3592d7a160d069d5ae407c8eecca6478799c03089401c073c"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785659 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"ab1992e269496bc39c1df6084e6e60fd","Type":"ContainerStarted","Data":"1b3283d0fac22ca78f337b1d5e3afe8d01431a02a7bb6f2fb90c61b14196aefb"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785669 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"ab1992e269496bc39c1df6084e6e60fd","Type":"ContainerStarted","Data":"8d14f1413c8e8a2ef6cd9ab523725814ba9ff7a6021dd1c6a68ef759cfabfdf3"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785677 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"ab1992e269496bc39c1df6084e6e60fd","Type":"ContainerStarted","Data":"91dbe5959251acff62db45931eb5a5e1e4e7af9bb363ef308eee803d4237a389"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785685 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"ab1992e269496bc39c1df6084e6e60fd","Type":"ContainerStarted","Data":"78eb0a378ee87ec426723278f27c3f8944db139eff4ee08e81e705d48c517d58"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785696 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34418050489e8b48781fa5128a0548228f5bdb58f7e6a5f88226bbd7dacf7bb5" Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785704 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f6j7m" event={"ID":"bc18a83a-998e-458e-87f0-d5368da52e1b","Type":"ContainerStarted","Data":"07e07e8abe6c713822a9a9f9d007d69e82226fff5293360065d48b0d20066a24"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785713 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f6j7m" event={"ID":"bc18a83a-998e-458e-87f0-d5368da52e1b","Type":"ContainerStarted","Data":"ed538f41551e0e7b372ee4dcc843f84e56fe8d6677fe847816efda02bfd61218"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785722 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" event={"ID":"b13885ef-d2b5-4591-825d-446cf8729bc1","Type":"ContainerStarted","Data":"8bd8ff38f53cf4940c1efaa7c62de04a6ed00058d3624f9a76bc40b03dd26c9f"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785731 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" event={"ID":"b13885ef-d2b5-4591-825d-446cf8729bc1","Type":"ContainerStarted","Data":"0f8a1e4d8de6a06f67857b43e08d70d6ce0e19926ff25b49cbea007cf56e4e61"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785740 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5nqhk" 
event={"ID":"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1","Type":"ContainerStarted","Data":"714e28f97e2ec6d00e1683c4d2537a164bb01931e5ad5b6860350da680801a09"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785749 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5nqhk" event={"ID":"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1","Type":"ContainerDied","Data":"60d32869d5d76c04555375fdfd9ab0f008a07a41f85b96737cde09fadc0deeb4"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785759 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5nqhk" event={"ID":"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1","Type":"ContainerStarted","Data":"7fe4976a702070d88ebc0b91a8c147521b2f0d81e1e2131e752211b96529d448"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785768 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"565d5ef6-b0e7-4f04-9460-61f1d3903d37","Type":"ContainerDied","Data":"1cb443e02b64a65178050b34e99e50f308c86d2ef5b4e7e730bfa0faf58cc53e"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785783 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"565d5ef6-b0e7-4f04-9460-61f1d3903d37","Type":"ContainerDied","Data":"ffedcf7b097d85236cfda3f347741e8721f2f2b5597465d279b812038c00b460"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785793 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffedcf7b097d85236cfda3f347741e8721f2f2b5597465d279b812038c00b460" Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785804 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" event={"ID":"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8","Type":"ContainerStarted","Data":"f31f1f557e375896231e731e60b18a48878ddaf2be696f8a53d9f13550375166"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785817 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" event={"ID":"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8","Type":"ContainerDied","Data":"33a709d9e47c123942b76dd36410ef83571393334a41e347b73753c3f8332654"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785830 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" event={"ID":"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8","Type":"ContainerStarted","Data":"9cfdae6ccb167d4a6f250b34ce3b8d4ec56326be1aca0a0b497bcb1caa6ac3cf"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785842 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" event={"ID":"a45f340c-0eca-4460-8961-4ca360467eeb","Type":"ContainerStarted","Data":"3a00979f1a40fa874a9b8220fac00b5191a3cf77eaa5880a179ac86b435ff29f"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785854 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" event={"ID":"a45f340c-0eca-4460-8961-4ca360467eeb","Type":"ContainerDied","Data":"06eca27e0fe884f90bd62d903b17dde7161c7cd5f8bd04b4c9959d40b8706ecb"} Dec 05 12:50:04.794913 master-0 
kubenswrapper[29936]: I1205 12:50:04.785866 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" event={"ID":"a45f340c-0eca-4460-8961-4ca360467eeb","Type":"ContainerStarted","Data":"bd884dd8fbf0cb13a01d3369dc09dbcaf952157e210620f5c83187eab601232c"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785877 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" event={"ID":"1ee7a76b-cf1d-4513-b314-5aa314da818d","Type":"ContainerStarted","Data":"be906a53f820b21555f2880c815b5a7120f14a015e27df21706cfb62d2b36ef4"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785890 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" event={"ID":"1ee7a76b-cf1d-4513-b314-5aa314da818d","Type":"ContainerStarted","Data":"fede23ee661b7ea969175a9ba409eaa0d47e0f9069332c22e94196ac525e392e"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785902 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" event={"ID":"1ee7a76b-cf1d-4513-b314-5aa314da818d","Type":"ContainerStarted","Data":"12b2377bacbd62ee93e11591af977d559716347304347ca9deca90451df150b7"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785916 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" event={"ID":"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2","Type":"ContainerStarted","Data":"acc93650e1b2b844988ef7bf696d586f1fa71b30b85d3a240aab334218886cb9"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785929 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" event={"ID":"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2","Type":"ContainerStarted","Data":"02cbd5be726f383c3fff717aa896f2f5f9edd3ef8d5f5444366eb4982a31e95b"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785940 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" event={"ID":"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2","Type":"ContainerStarted","Data":"6156788076b5ad36c99009735d59fdd236497a34b3d72e5f7f9563e6ec82cdb6"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785951 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" event={"ID":"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2","Type":"ContainerStarted","Data":"fafe50d6690c2fbac658b4db9e7e7d0a871a9941f8ee2fd5f2fce340df7fd5f6"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785962 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-55965856b6-q9qdg" event={"ID":"a14df948-1ec4-4785-ad33-28d1e7063959","Type":"ContainerStarted","Data":"e49b3ffc20b79d61c5da13ba14b717c8ba7cb68b1431994e06693ed50d8cbba7"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785973 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-55965856b6-q9qdg" event={"ID":"a14df948-1ec4-4785-ad33-28d1e7063959","Type":"ContainerStarted","Data":"dac2262b7105102ce37a8db95766fbd5753d50bed12fb86441b8247f4653fc04"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785986 29936 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" event={"ID":"ce3d73c1-f4bd-4c91-936a-086dfa5e3460","Type":"ContainerStarted","Data":"5911e299ea12124949df6b53fe6e36667af26bd5976d0d79c6027eddac8ef8b5"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.785997 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" event={"ID":"ce3d73c1-f4bd-4c91-936a-086dfa5e3460","Type":"ContainerDied","Data":"551cae19e34d48fca769f493289bed8d81be6209860af5e4e43fa9850482cf12"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786011 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" event={"ID":"ce3d73c1-f4bd-4c91-936a-086dfa5e3460","Type":"ContainerDied","Data":"020f4fb4f4314f00ea400478b93e32903a1a30b5d332647ebe9614d7f944a537"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786024 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" event={"ID":"ce3d73c1-f4bd-4c91-936a-086dfa5e3460","Type":"ContainerDied","Data":"373b9eebb249846584e2d3e04b61f1d2ede61eec7ddbb37f633ff477767fcf89"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786035 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" event={"ID":"ce3d73c1-f4bd-4c91-936a-086dfa5e3460","Type":"ContainerStarted","Data":"3d66257a9a5cc16c308a04623948fb3eceefd2f34694e08267e4f17ec43d3782"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786045 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" event={"ID":"99996137-2621-458b-980d-584b3640d4ad","Type":"ContainerStarted","Data":"4d39f87ec0ab3e7d43386497849fc0b62dfc1564ab50782064167f0cb951ca1d"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786056 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" event={"ID":"99996137-2621-458b-980d-584b3640d4ad","Type":"ContainerStarted","Data":"d8e8a1073abbdf051f404a2a4f1613aeacac287ee90a5af14a8006b5d070a015"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786066 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" event={"ID":"99996137-2621-458b-980d-584b3640d4ad","Type":"ContainerStarted","Data":"570d4cae37b4f398ab8be13ab3899c325813f0073ace4d7fbe1d38d0fbd654b9"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786076 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" event={"ID":"db27bee9-3d33-4c4a-b38b-72f7cec77c7a","Type":"ContainerStarted","Data":"fb6a19c9ac322ac2e35728dc364dd53920d781096971f1443cf23fd5196c363e"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786087 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" event={"ID":"db27bee9-3d33-4c4a-b38b-72f7cec77c7a","Type":"ContainerDied","Data":"e4b5bdd189732f9903e53555a7a61c0d10d37cd90596a4c760274c5cce948d5d"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786096 29936 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" event={"ID":"db27bee9-3d33-4c4a-b38b-72f7cec77c7a","Type":"ContainerStarted","Data":"0ddc2093e9ac31dcb8fccf79117cc3631c474d52d69ff0ebdea12838c0ae6a82"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786106 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" event={"ID":"db27bee9-3d33-4c4a-b38b-72f7cec77c7a","Type":"ContainerStarted","Data":"3fc55735af7e7e6d6e15c1fa34cd05fc0468a74822467cb4ea7df9c2efc6cd2f"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786114 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" event={"ID":"a757f807-e1bf-4f1e-9787-6b4acc8d09cf","Type":"ContainerStarted","Data":"c3f1ecb329d73d055806e0a97968047a0f0996cc11b92a4b13a31f4dd631d1b9"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786123 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" event={"ID":"a757f807-e1bf-4f1e-9787-6b4acc8d09cf","Type":"ContainerDied","Data":"11b2f7447856caa8c6cb51432e0d7392e86f64482a9fae5b398a57d71719f20e"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786133 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" event={"ID":"a757f807-e1bf-4f1e-9787-6b4acc8d09cf","Type":"ContainerStarted","Data":"b1ed868ac971480d433bc214f55b6262c1c9875a557884ba05c4f9ee33a0c3dc"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786141 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" event={"ID":"a757f807-e1bf-4f1e-9787-6b4acc8d09cf","Type":"ContainerStarted","Data":"a34af96221abd2b9bf387305f2624222004ffa4b53496a2a4e5584e580bd9733"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786151 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" event={"ID":"ba095394-1873-4793-969d-3be979fa0771","Type":"ContainerStarted","Data":"400a9419d33e5072253e6a099476c2c681d982530672b0c4be40561f95d01978"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786163 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" event={"ID":"ba095394-1873-4793-969d-3be979fa0771","Type":"ContainerDied","Data":"23c3457474b764927ea9138fc09fe8b0080b3d5dcfc7a8c9d9bd33c7ad79d98a"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786172 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" event={"ID":"ba095394-1873-4793-969d-3be979fa0771","Type":"ContainerStarted","Data":"ecdffd0c2fc8d747077d4ca5dcb541da82682f6d035455ac42566e8514bfadc3"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786199 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2pp25" event={"ID":"b74e0607-6ed0-4119-8870-895b7d336830","Type":"ContainerStarted","Data":"de5c01ef20eb6b4a7a0d0edd765eb5a0d5c99c96508f7cefbb6d4334d267cd81"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786209 29936 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2pp25" event={"ID":"b74e0607-6ed0-4119-8870-895b7d336830","Type":"ContainerDied","Data":"ea572c6fcc8d460ca830971971bae224cadfb5879734d2e8d7b9add3c483a937"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786219 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2pp25" event={"ID":"b74e0607-6ed0-4119-8870-895b7d336830","Type":"ContainerDied","Data":"05c868179fe699a72c6f244f8706f4870b83c4369ed24818820567f21e6d96f4"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786228 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2pp25" event={"ID":"b74e0607-6ed0-4119-8870-895b7d336830","Type":"ContainerStarted","Data":"ce6d6f50d1ea16153d0bcd0e4641d90ef903c01636f33ef60f26b9dcbbaecad8"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786239 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" event={"ID":"954c5c79-a96c-4c47-a4bc-024aaf4dc789","Type":"ContainerDied","Data":"6a64d74f0d5ef7e0f5020ef79722fa9a1cfa622ec3d5ca7d9169d099609498b7"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786249 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv" event={"ID":"954c5c79-a96c-4c47-a4bc-024aaf4dc789","Type":"ContainerDied","Data":"c1f8d00525a746947cf993ebf0bd13cbdeabfcd9444c040d4018d1355c19f19f"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786258 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1f8d00525a746947cf993ebf0bd13cbdeabfcd9444c040d4018d1355c19f19f" Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786266 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" event={"ID":"153fec1f-a10b-4c6c-a997-60fa80c13a86","Type":"ContainerStarted","Data":"ba9a9971d6a0e8e47787750aed5178bf0427946fa6537ae74aff0ff8a94d2c5c"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786275 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" event={"ID":"153fec1f-a10b-4c6c-a997-60fa80c13a86","Type":"ContainerDied","Data":"7b0e1392f4706a31c5e08db223b1244b230bf09a2ede6f19588e74a4a3860cf4"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786286 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" event={"ID":"153fec1f-a10b-4c6c-a997-60fa80c13a86","Type":"ContainerStarted","Data":"919a5a586c053c933b88b4faaf4716d63b0ce72dea0802a0de12305677effe13"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786294 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" event={"ID":"153fec1f-a10b-4c6c-a997-60fa80c13a86","Type":"ContainerStarted","Data":"07bd9adb3dd2a54b1348564cac3ab912144772686d957ab49d9bf60d68718f5e"} Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786499 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:50:04.794913 master-0 kubenswrapper[29936]: I1205 12:50:04.786529 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a14df948-1ec4-4785-ad33-28d1e7063959-snapshots\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.786554 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-config\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.786575 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d53a4886-db25-43a1-825a-66a9a9a58590-config\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.786597 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-996h9\" (UniqueName: \"kubernetes.io/projected/5efad170-c154-42ec-a7c0-b36a98d2bfcc-kube-api-access-996h9\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.786624 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/665c4362-e2e5-4f96-92c0-1746c63c7422-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" (UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.786665 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ab1992e269496bc39c1df6084e6e60fd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.786690 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-422c9\" (UniqueName: \"kubernetes.io/projected/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-kube-api-access-422c9\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.786695 29936 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.786854 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.787029 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.787212 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.787263 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.787402 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.787226 29936 scope.go:117] "RemoveContainer" containerID="eb64e2421ed896450777b0f0f93d8bac59a879b9f30c7599b0e2a7c59b1f3be8" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.787521 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.787792 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: E1205 12:50:04.787843 29936 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"openshift-kube-scheduler-master-0\" already exists" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.788027 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.788062 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.788169 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.786714 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72wst\" (UniqueName: \"kubernetes.io/projected/b74e0607-6ed0-4119-8870-895b7d336830-kube-api-access-72wst\") pod \"community-operators-2pp25\" (UID: \"b74e0607-6ed0-4119-8870-895b7d336830\") " pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.792970 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74e0607-6ed0-4119-8870-895b7d336830-catalog-content\") pod \"community-operators-2pp25\" (UID: \"b74e0607-6ed0-4119-8870-895b7d336830\") " pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.792991 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/3b741029-0eb5-409b-b7f1-95e8385dc400-cache\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793016 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tngh\" (UniqueName: \"kubernetes.io/projected/d53a4886-db25-43a1-825a-66a9a9a58590-kube-api-access-2tngh\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793036 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-iptables-alerter-script\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793055 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-server-tls\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793071 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-serving-cert\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793088 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-client\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793104 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ee7a76b-cf1d-4513-b314-5aa314da818d-machine-api-operator-tls\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793123 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793144 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1ee7a76b-cf1d-4513-b314-5aa314da818d-config\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793162 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793197 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-host-slash\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793217 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-metrics-server-audit-profiles\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793235 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-client-ca\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793254 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/58187662-b502-4d90-95ce-2aa91a81d256-telemetry-config\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: E1205 12:50:04.791305 29936 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793547 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-iptables-alerter-script\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793684 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-serving-cert\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793768 29936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/58187662-b502-4d90-95ce-2aa91a81d256-telemetry-config\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.788228 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.793884 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-client\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.788781 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794015 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74e0607-6ed0-4119-8870-895b7d336830-catalog-content\") pod \"community-operators-2pp25\" (UID: \"b74e0607-6ed0-4119-8870-895b7d336830\") " pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.788850 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a14df948-1ec4-4785-ad33-28d1e7063959-snapshots\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794139 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fb7003a6-4341-49eb-bec3-76ba8610fa12-metrics-certs\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: E1205 12:50:04.791288 29936 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: E1205 12:50:04.791268 29936 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-master-0\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.788358 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.788400 29936 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794600 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqdxl\" (UniqueName: \"kubernetes.io/projected/cf8247a1-703a-46b3-9a33-25a73b27ab99-kube-api-access-fqdxl\") pod \"service-ca-77c99c46b8-44qrw\" (UID: \"cf8247a1-703a-46b3-9a33-25a73b27ab99\") " pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.788409 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794665 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-config\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794686 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1871a9d6-6369-4d08-816f-9c6310b61ddf-serving-cert\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.788458 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794708 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794738 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-stats-auth\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.788473 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794756 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-env-overrides\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794778 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794799 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/3b741029-0eb5-409b-b7f1-95e8385dc400-etc-docker\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794821 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-profile-collector-cert\") pod \"olm-operator-7cd7dbb44c-xdbtz\" (UID: \"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794839 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-ca\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794884 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14df948-1ec4-4785-ad33-28d1e7063959-trusted-ca-bundle\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794920 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74e0607-6ed0-4119-8870-895b7d336830-utilities\") pod \"community-operators-2pp25\" (UID: \"b74e0607-6ed0-4119-8870-895b7d336830\") " pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794942 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfl8f\" (UniqueName: \"kubernetes.io/projected/b9623eb8-55d2-4c5c-aa8d-74b6a27274d8-kube-api-access-hfl8f\") pod \"csi-snapshot-controller-6b958b6f94-7r5wv\" (UID: \"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794962 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc18a83a-998e-458e-87f0-d5368da52e1b-hosts-file\") pod \"node-resolver-f6j7m\" (UID: \"bc18a83a-998e-458e-87f0-d5368da52e1b\") " pod="openshift-dns/node-resolver-f6j7m" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794980 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dwm5\" (UniqueName: 
\"kubernetes.io/projected/20a72c8b-0f12-446b-8a42-53d98864c8f8-kube-api-access-6dwm5\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.794999 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovnkube-config\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795017 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795036 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxwwh\" (UniqueName: \"kubernetes.io/projected/e2e2d968-9946-4711-aaf0-3e3a03bff415-kube-api-access-pxwwh\") pod \"network-check-source-85d8db45d4-kkllg\" (UID: \"e2e2d968-9946-4711-aaf0-3e3a03bff415\") " pod="openshift-network-diagnostics/network-check-source-85d8db45d4-kkllg" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795053 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/665c4362-e2e5-4f96-92c0-1746c63c7422-cco-trusted-ca\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" (UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795070 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-config\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795088 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr2r9\" (UniqueName: \"kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-kube-api-access-dr2r9\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795107 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5dfcb1e-1231-4f07-8c21-748965718729-cert\") pod \"cluster-autoscaler-operator-5f49d774cd-vdb8r\" (UID: \"e5dfcb1e-1231-4f07-8c21-748965718729\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795129 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1871a9d6-6369-4d08-816f-9c6310b61ddf-config\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795147 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkdtr\" (UniqueName: \"kubernetes.io/projected/1ee7a76b-cf1d-4513-b314-5aa314da818d-kube-api-access-lkdtr\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795158 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1871a9d6-6369-4d08-816f-9c6310b61ddf-serving-cert\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795165 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ab1992e269496bc39c1df6084e6e60fd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795239 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkqq7\" (UniqueName: \"kubernetes.io/projected/a280c582-685e-47ac-bf6b-248aa0c129a9-kube-api-access-xkqq7\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795266 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cf8247a1-703a-46b3-9a33-25a73b27ab99-signing-key\") pod \"service-ca-77c99c46b8-44qrw\" (UID: \"cf8247a1-703a-46b3-9a33-25a73b27ab99\") " pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795285 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g7n7\" (UniqueName: \"kubernetes.io/projected/a14df948-1ec4-4785-ad33-28d1e7063959-kube-api-access-2g7n7\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795306 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795329 29936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-config\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795354 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/153fec1f-a10b-4c6c-a997-60fa80c13a86-cache\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795376 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/909ed395-8ad3-4350-95e3-b4b19c682f92-tls-certificates\") pod \"prometheus-operator-admission-webhook-7c85c4dffd-sbvlr\" (UID: \"909ed395-8ad3-4350-95e3-b4b19c682f92\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.788537 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795394 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-service-ca\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795411 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-ca-certs\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795431 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c31f89c-b01b-4853-a901-bccc25441a46-utilities\") pod \"redhat-operators-wfk7f\" (UID: \"9c31f89c-b01b-4853-a901-bccc25441a46\") " pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795448 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnwdh\" (UniqueName: \"kubernetes.io/projected/a5338041-f213-46ef-9d81-248567ba958d-kube-api-access-bnwdh\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795466 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cf8247a1-703a-46b3-9a33-25a73b27ab99-signing-cabundle\") pod \"service-ca-77c99c46b8-44qrw\" (UID: \"cf8247a1-703a-46b3-9a33-25a73b27ab99\") " 
pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795490 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/153fec1f-a10b-4c6c-a997-60fa80c13a86-etc-docker\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795511 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d53a4886-db25-43a1-825a-66a9a9a58590-serving-cert\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795531 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a14df948-1ec4-4785-ad33-28d1e7063959-serving-cert\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795553 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht5kr\" (UniqueName: \"kubernetes.io/projected/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-kube-api-access-ht5kr\") pod \"olm-operator-7cd7dbb44c-xdbtz\" (UID: \"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795572 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20a72c8b-0f12-446b-8a42-53d98864c8f8-service-ca-bundle\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795592 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ee7a76b-cf1d-4513-b314-5aa314da818d-images\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795610 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14df948-1ec4-4785-ad33-28d1e7063959-service-ca-bundle\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795630 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b13885ef-d2b5-4591-825d-446cf8729bc1-tmpfs\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " 
pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795634 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-ca\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795665 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmjn7\" (UniqueName: \"kubernetes.io/projected/bc18a83a-998e-458e-87f0-d5368da52e1b-kube-api-access-bmjn7\") pod \"node-resolver-f6j7m\" (UID: \"bc18a83a-998e-458e-87f0-d5368da52e1b\") " pod="openshift-dns/node-resolver-f6j7m" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795683 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-default-certificate\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795701 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g7mj\" (UniqueName: \"kubernetes.io/projected/3b741029-0eb5-409b-b7f1-95e8385dc400-kube-api-access-5g7mj\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795711 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b13885ef-d2b5-4591-825d-446cf8729bc1-tmpfs\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795721 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb46q\" (UniqueName: \"kubernetes.io/projected/e5dfcb1e-1231-4f07-8c21-748965718729-kube-api-access-pb46q\") pod \"cluster-autoscaler-operator-5f49d774cd-vdb8r\" (UID: \"e5dfcb1e-1231-4f07-8c21-748965718729\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795753 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/dbe144b5-3b78-4946-bbf9-b825b0e47b07-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795785 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps4ws\" (UniqueName: \"kubernetes.io/projected/58187662-b502-4d90-95ce-2aa91a81d256-kube-api-access-ps4ws\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " 
pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795811 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795871 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwcr9\" (UniqueName: \"kubernetes.io/projected/f119ffe4-16bd-49eb-916d-b18ba0d79b54-kube-api-access-wwcr9\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795895 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74e0607-6ed0-4119-8870-895b7d336830-utilities\") pod \"community-operators-2pp25\" (UID: \"b74e0607-6ed0-4119-8870-895b7d336830\") " pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795896 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c39d2089-d3bf-4556-b6ef-c362a08c21a2-serving-cert\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795937 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czcmr\" (UniqueName: \"kubernetes.io/projected/9c31f89c-b01b-4853-a901-bccc25441a46-kube-api-access-czcmr\") pod \"redhat-operators-wfk7f\" (UID: \"9c31f89c-b01b-4853-a901-bccc25441a46\") " pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795982 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-client-certs\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796001 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a5338041-f213-46ef-9d81-248567ba958d-audit-log\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796020 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:50:04.802970 master-0 
kubenswrapper[29936]: I1205 12:50:04.796042 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/3b741029-0eb5-409b-b7f1-95e8385dc400-etc-containers\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796068 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69n5s\" (UniqueName: \"kubernetes.io/projected/fb7003a6-4341-49eb-bec3-76ba8610fa12-kube-api-access-69n5s\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796088 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cert\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796110 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmjkp\" (UniqueName: \"kubernetes.io/projected/b13885ef-d2b5-4591-825d-446cf8729bc1-kube-api-access-xmjkp\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796128 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796148 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-service-ca-bundle\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796167 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-srv-cert\") pod \"olm-operator-7cd7dbb44c-xdbtz\" (UID: \"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796200 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-trusted-ca-bundle\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 
12:50:04.796217 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z8h9\" (UniqueName: \"kubernetes.io/projected/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-kube-api-access-9z8h9\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796235 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796250 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796269 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55qpg\" (UniqueName: \"kubernetes.io/projected/ba095394-1873-4793-969d-3be979fa0771-kube-api-access-55qpg\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796286 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1871a9d6-6369-4d08-816f-9c6310b61ddf-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796295 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-package-server-manager-serving-cert\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796304 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-config-volume\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796327 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69z2l\" (UniqueName: \"kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l\") pod \"network-check-target-qsggt\" (UID: \"f4a70855-80b5-4d6a-bed1-b42364940de0\") " pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:50:04.802970 master-0 
kubenswrapper[29936]: I1205 12:50:04.796343 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-metrics-certs\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796361 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/3b741029-0eb5-409b-b7f1-95e8385dc400-ca-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796378 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/153fec1f-a10b-4c6c-a997-60fa80c13a86-etc-containers\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796395 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796398 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/153fec1f-a10b-4c6c-a997-60fa80c13a86-cache\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.795354 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-env-overrides\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796500 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a5338041-f213-46ef-9d81-248567ba958d-audit-log\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.788557 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.796414 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lckv7\" (UniqueName: \"kubernetes.io/projected/665c4362-e2e5-4f96-92c0-1746c63c7422-kube-api-access-lckv7\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" 
(UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.788624 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.788628 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 12:50:04.802970 master-0 kubenswrapper[29936]: I1205 12:50:04.788630 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.788665 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.796913 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-ovnkube-config\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.788875 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.796984 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/58187662-b502-4d90-95ce-2aa91a81d256-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.797115 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-config\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.797200 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/5efad170-c154-42ec-a7c0-b36a98d2bfcc-host-etc-kube\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.797212 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.788964 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.789027 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 
12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.789100 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.789248 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.789266 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.789302 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.789351 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.789358 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.789390 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.789402 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.789457 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.790694 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.790817 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.790909 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.791166 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.791301 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.791352 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.791365 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.791366 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 
12:50:04.791421 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.791461 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.791463 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.791507 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.791537 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.791549 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.791620 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.791625 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.792151 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.792236 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.792324 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.792423 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.797945 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3b741029-0eb5-409b-b7f1-95e8385dc400-cache\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.798148 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.798128 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f119ffe4-16bd-49eb-916d-b18ba0d79b54-etcd-service-ca\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:50:04.809353 
master-0 kubenswrapper[29936]: I1205 12:50:04.798248 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c31f89c-b01b-4853-a901-bccc25441a46-utilities\") pod \"redhat-operators-wfk7f\" (UID: \"9c31f89c-b01b-4853-a901-bccc25441a46\") " pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.798566 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-service-ca-bundle\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.798714 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.798835 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d53a4886-db25-43a1-825a-66a9a9a58590-serving-cert\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.798854 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.798931 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.799034 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.799054 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1871a9d6-6369-4d08-816f-9c6310b61ddf-config\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.799239 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbg7w\" (UniqueName: \"kubernetes.io/projected/dbe144b5-3b78-4946-bbf9-b825b0e47b07-kube-api-access-mbg7w\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.800903 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62nqj\" (UniqueName: \"kubernetes.io/projected/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-kube-api-access-62nqj\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 
12:50:04.800945 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-apiservice-cert\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.801010 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-webhook-cert\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.801040 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlnqb\" (UniqueName: \"kubernetes.io/projected/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-kube-api-access-mlnqb\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.801067 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-images\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.801092 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-metrics-tls\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.801203 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.801230 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5efad170-c154-42ec-a7c0-b36a98d2bfcc-metrics-tls\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.801256 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba095394-1873-4793-969d-3be979fa0771-serving-cert\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.801283 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-images\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.801309 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/dbe144b5-3b78-4946-bbf9-b825b0e47b07-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.801341 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c31f89c-b01b-4853-a901-bccc25441a46-catalog-content\") pod \"redhat-operators-wfk7f\" (UID: \"9c31f89c-b01b-4853-a901-bccc25441a46\") " pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.801361 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-proxy-ca-bundles\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.801458 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr9jd\" (UniqueName: \"kubernetes.io/projected/c39d2089-d3bf-4556-b6ef-c362a08c21a2-kube-api-access-mr9jd\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.801537 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e5dfcb1e-1231-4f07-8c21-748965718729-auth-proxy-config\") pod \"cluster-autoscaler-operator-5f49d774cd-vdb8r\" (UID: \"e5dfcb1e-1231-4f07-8c21-748965718729\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.801958 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c31f89c-b01b-4853-a901-bccc25441a46-catalog-content\") pod \"redhat-operators-wfk7f\" (UID: \"9c31f89c-b01b-4853-a901-bccc25441a46\") " pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.802461 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba095394-1873-4793-969d-3be979fa0771-serving-cert\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.802803 29936 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5efad170-c154-42ec-a7c0-b36a98d2bfcc-metrics-tls\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.803625 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.804719 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-config\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.804790 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d53a4886-db25-43a1-825a-66a9a9a58590-config\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.804945 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.805311 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.808149 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba095394-1873-4793-969d-3be979fa0771-trusted-ca-bundle\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:50:04.809353 master-0 kubenswrapper[29936]: I1205 12:50:04.808747 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Dec 05 12:50:04.812715 master-0 kubenswrapper[29936]: I1205 12:50:04.812245 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/3b741029-0eb5-409b-b7f1-95e8385dc400-ca-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:50:04.813038 master-0 kubenswrapper[29936]: I1205 12:50:04.813007 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 12:50:04.813518 master-0 kubenswrapper[29936]: I1205 12:50:04.813460 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 12:50:04.814985 master-0 kubenswrapper[29936]: I1205 12:50:04.814939 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 12:50:04.815328 master-0 kubenswrapper[29936]: I1205 12:50:04.815275 29936 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Dec 05 12:50:04.818709 master-0 kubenswrapper[29936]: I1205 12:50:04.818670 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/3b741029-0eb5-409b-b7f1-95e8385dc400-catalogserver-certs\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:50:04.825286 master-0 kubenswrapper[29936]: I1205 12:50:04.825240 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Dec 05 12:50:04.835171 master-0 kubenswrapper[29936]: I1205 12:50:04.835104 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Dec 05 12:50:04.840104 master-0 kubenswrapper[29936]: I1205 12:50:04.840020 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-ca-certs\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:50:04.849965 master-0 kubenswrapper[29936]: I1205 12:50:04.849911 29936 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Dec 05 12:50:04.860571 master-0 kubenswrapper[29936]: E1205 12:50:04.858920 29936 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0" Dec 05 12:50:04.860571 master-0 kubenswrapper[29936]: I1205 12:50:04.860353 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Dec 05 12:50:04.871777 master-0 kubenswrapper[29936]: I1205 12:50:04.871486 29936 scope.go:117] "RemoveContainer" containerID="eb64e2421ed896450777b0f0f93d8bac59a879b9f30c7599b0e2a7c59b1f3be8" Dec 05 12:50:04.872741 master-0 kubenswrapper[29936]: E1205 12:50:04.872694 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb64e2421ed896450777b0f0f93d8bac59a879b9f30c7599b0e2a7c59b1f3be8\": container with ID starting with eb64e2421ed896450777b0f0f93d8bac59a879b9f30c7599b0e2a7c59b1f3be8 not found: ID does not exist" containerID="eb64e2421ed896450777b0f0f93d8bac59a879b9f30c7599b0e2a7c59b1f3be8" Dec 05 12:50:04.872819 master-0 kubenswrapper[29936]: I1205 12:50:04.872757 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb64e2421ed896450777b0f0f93d8bac59a879b9f30c7599b0e2a7c59b1f3be8"} err="failed to get container status \"eb64e2421ed896450777b0f0f93d8bac59a879b9f30c7599b0e2a7c59b1f3be8\": rpc error: code = NotFound desc = could not find container \"eb64e2421ed896450777b0f0f93d8bac59a879b9f30c7599b0e2a7c59b1f3be8\": container with ID starting with eb64e2421ed896450777b0f0f93d8bac59a879b9f30c7599b0e2a7c59b1f3be8 not found: ID does not exist" Dec 05 12:50:04.880545 master-0 kubenswrapper[29936]: I1205 12:50:04.880457 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 12:50:04.902656 master-0 kubenswrapper[29936]: I1205 12:50:04.902590 29936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-netns\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.902881 master-0 kubenswrapper[29936]: I1205 12:50:04.902669 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:04.902881 master-0 kubenswrapper[29936]: I1205 12:50:04.902697 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.902881 master-0 kubenswrapper[29936]: I1205 12:50:04.902718 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-root\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:04.902881 master-0 kubenswrapper[29936]: I1205 12:50:04.902740 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-serving-cert\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:04.902881 master-0 kubenswrapper[29936]: I1205 12:50:04.902764 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1478a21e-b6ac-46fb-ad01-805ac71f0a79-proxy-tls\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:50:04.902881 master-0 kubenswrapper[29936]: I1205 12:50:04.902794 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d0792bf-e2da-4ee7-91fe-032299cea42f-serving-cert\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:50:04.902881 master-0 kubenswrapper[29936]: I1205 12:50:04.902817 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3e283fe-a474-4f83-ad66-62971945060a-webhook-certs\") pod \"multus-admission-controller-8dbbb5754-j7x5j\" (UID: \"d3e283fe-a474-4f83-ad66-62971945060a\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" Dec 05 12:50:04.902881 master-0 kubenswrapper[29936]: I1205 12:50:04.902843 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-static-pod-dir\") pod \"etcd-master-0\" 
(UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:04.902881 master-0 kubenswrapper[29936]: I1205 12:50:04.902865 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a45f340c-0eca-4460-8961-4ca360467eeb-serving-cert\") pod \"openshift-config-operator-68758cbcdb-tmzpj\" (UID: \"a45f340c-0eca-4460-8961-4ca360467eeb\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:50:04.903132 master-0 kubenswrapper[29936]: I1205 12:50:04.902887 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-sys\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:04.903132 master-0 kubenswrapper[29936]: I1205 12:50:04.902930 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebfbe878-1796-4a20-b3f0-76165038252e-utilities\") pod \"redhat-marketplace-dmnvq\" (UID: \"ebfbe878-1796-4a20-b3f0-76165038252e\") " pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:50:04.903132 master-0 kubenswrapper[29936]: I1205 12:50:04.902952 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-system-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.903132 master-0 kubenswrapper[29936]: I1205 12:50:04.902976 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-config\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:50:04.903132 master-0 kubenswrapper[29936]: I1205 12:50:04.903006 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1478a21e-b6ac-46fb-ad01-805ac71f0a79-mcc-auth-proxy-config\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:50:04.903132 master-0 kubenswrapper[29936]: I1205 12:50:04.903046 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60327040-f782-4cda-a32d-52a4f183073c-metrics-client-ca\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:04.903132 master-0 kubenswrapper[29936]: I1205 12:50:04.903085 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:50:04.903132 master-0 
kubenswrapper[29936]: I1205 12:50:04.903111 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:50:04.903132 master-0 kubenswrapper[29936]: I1205 12:50:04.903136 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-encryption-config\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:04.903429 master-0 kubenswrapper[29936]: I1205 12:50:04.903159 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-kubernetes\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:04.903429 master-0 kubenswrapper[29936]: I1205 12:50:04.903206 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-ovn\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.903429 master-0 kubenswrapper[29936]: I1205 12:50:04.903235 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqblj\" (UniqueName: \"kubernetes.io/projected/531b8927-92db-4e9d-9a0a-12ff948cdaad-kube-api-access-xqblj\") pod \"control-plane-machine-set-operator-7df95c79b5-ldg5j\" (UID: \"531b8927-92db-4e9d-9a0a-12ff948cdaad\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" Dec 05 12:50:04.903429 master-0 kubenswrapper[29936]: I1205 12:50:04.903261 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-tuned\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:04.903429 master-0 kubenswrapper[29936]: I1205 12:50:04.903293 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-systemd\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:04.903429 master-0 kubenswrapper[29936]: I1205 12:50:04.903321 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tncxt\" (UniqueName: \"kubernetes.io/projected/ebfbe878-1796-4a20-b3f0-76165038252e-kube-api-access-tncxt\") pod \"redhat-marketplace-dmnvq\" (UID: \"ebfbe878-1796-4a20-b3f0-76165038252e\") " pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:50:04.903429 master-0 kubenswrapper[29936]: I1205 12:50:04.903344 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8defe125-1529-4091-adff-e9d17a2b298f-utilities\") pod \"certified-operators-4p8p6\" (UID: \"8defe125-1529-4091-adff-e9d17a2b298f\") " pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:50:04.903429 master-0 kubenswrapper[29936]: I1205 12:50:04.903374 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807d9093-aa67-4840-b5be-7f3abcc1beed-config\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:50:04.903429 master-0 kubenswrapper[29936]: I1205 12:50:04.903396 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-etcd-client\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:04.903429 master-0 kubenswrapper[29936]: I1205 12:50:04.903419 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/95d8fb27-8b2b-4749-add3-9e9b16edb693-rootfs\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:50:04.903702 master-0 kubenswrapper[29936]: I1205 12:50:04.903443 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-system-cni-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:04.903702 master-0 kubenswrapper[29936]: I1205 12:50:04.903485 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-config\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:50:04.903702 master-0 kubenswrapper[29936]: I1205 12:50:04.903508 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:50:04.903702 master-0 kubenswrapper[29936]: I1205 12:50:04.903530 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-sys\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:04.903702 master-0 kubenswrapper[29936]: I1205 12:50:04.903551 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:04.903702 master-0 kubenswrapper[29936]: I1205 12:50:04.903592 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/594aaded-5615-4bed-87ee-6173059a73be-kube-api-access\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:50:04.903702 master-0 kubenswrapper[29936]: I1205 12:50:04.903638 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-tls\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:50:04.903702 master-0 kubenswrapper[29936]: I1205 12:50:04.903663 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-resource-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:04.903934 master-0 kubenswrapper[29936]: I1205 12:50:04.903708 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-etc-kubernetes\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.903934 master-0 kubenswrapper[29936]: I1205 12:50:04.903865 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:04.904097 master-0 kubenswrapper[29936]: I1205 12:50:04.904062 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d215811-6210-4ec2-8356-f1533dc43f65-kube-api-access\") pod \"installer-3-master-0\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:50:04.904097 master-0 kubenswrapper[29936]: I1205 12:50:04.904078 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebfbe878-1796-4a20-b3f0-76165038252e-utilities\") pod \"redhat-marketplace-dmnvq\" (UID: \"ebfbe878-1796-4a20-b3f0-76165038252e\") " pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:50:04.904284 master-0 kubenswrapper[29936]: I1205 12:50:04.904254 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-config\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:50:04.904330 
master-0 kubenswrapper[29936]: I1205 12:50:04.904295 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807d9093-aa67-4840-b5be-7f3abcc1beed-config\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:50:04.904366 master-0 kubenswrapper[29936]: I1205 12:50:04.904347 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-modprobe-d\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:04.904727 master-0 kubenswrapper[29936]: I1205 12:50:04.904372 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-netns\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.904727 master-0 kubenswrapper[29936]: I1205 12:50:04.904394 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:50:04.904727 master-0 kubenswrapper[29936]: I1205 12:50:04.904420 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-config\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:50:04.904727 master-0 kubenswrapper[29936]: I1205 12:50:04.904459 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/3b741029-0eb5-409b-b7f1-95e8385dc400-etc-containers\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:50:04.904727 master-0 kubenswrapper[29936]: I1205 12:50:04.904477 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/594aaded-5615-4bed-87ee-6173059a73be-serving-cert\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:50:04.904727 master-0 kubenswrapper[29936]: I1205 12:50:04.904495 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-tuned\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:04.904727 master-0 kubenswrapper[29936]: I1205 12:50:04.904497 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.904727 master-0 kubenswrapper[29936]: I1205 12:50:04.904597 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/3b741029-0eb5-409b-b7f1-95e8385dc400-etc-containers\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:50:04.904727 master-0 kubenswrapper[29936]: I1205 12:50:04.904730 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8defe125-1529-4091-adff-e9d17a2b298f-utilities\") pod \"certified-operators-4p8p6\" (UID: \"8defe125-1529-4091-adff-e9d17a2b298f\") " pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.904733 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/594aaded-5615-4bed-87ee-6173059a73be-serving-cert\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.904790 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7d0792bf-e2da-4ee7-91fe-032299cea42f-etc-ssl-certs\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.904827 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.904851 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-config\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.904874 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.904900 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log\") pod 
\"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.904910 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/db2e54b6-4879-40f4-9359-a8b0c31e76c2-profile-collector-cert\") pod \"catalog-operator-fbc6455c4-jmn7x\" (UID: \"db2e54b6-4879-40f4-9359-a8b0c31e76c2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.904952 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.904975 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-etcd-client\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.905022 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.905062 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f-samples-operator-tls\") pod \"cluster-samples-operator-797cfd8b47-6v84m\" (UID: \"7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.905102 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-multus-certs\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.905124 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-wtmp\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.905150 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.905158 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-metrics-tls\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.905227 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/153fec1f-a10b-4c6c-a997-60fa80c13a86-etc-containers\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.905253 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-conf-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.905294 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38941513-e968-45f1-9cb2-b63d40338f36-trusted-ca\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:50:04.905461 master-0 kubenswrapper[29936]: I1205 12:50:04.905461 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-apiservice-cert\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:50:04.905970 master-0 kubenswrapper[29936]: I1205 12:50:04.905538 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/153fec1f-a10b-4c6c-a997-60fa80c13a86-etc-containers\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:50:04.905970 master-0 kubenswrapper[29936]: I1205 12:50:04.905568 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d96c85a-fc88-46af-83d5-6c71ec6e2c23-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-dcf7fc84b-vxmv7\" (UID: \"3d96c85a-fc88-46af-83d5-6c71ec6e2c23\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" Dec 05 12:50:04.905970 master-0 kubenswrapper[29936]: I1205 12:50:04.905567 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38941513-e968-45f1-9cb2-b63d40338f36-trusted-ca\") pod 
\"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:50:04.905970 master-0 kubenswrapper[29936]: I1205 12:50:04.905602 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-log-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:04.905970 master-0 kubenswrapper[29936]: I1205 12:50:04.905663 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cni-binary-copy\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.905970 master-0 kubenswrapper[29936]: I1205 12:50:04.905734 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-tls\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:04.905970 master-0 kubenswrapper[29936]: I1205 12:50:04.905863 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-node-pullsecrets\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:04.905970 master-0 kubenswrapper[29936]: I1205 12:50:04.905895 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysctl-conf\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:04.905970 master-0 kubenswrapper[29936]: I1205 12:50:04.905922 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwrwm\" (UniqueName: \"kubernetes.io/projected/f2635f9f-219b-4d03-b5b3-496c0c836fae-kube-api-access-fwrwm\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:04.905970 master-0 kubenswrapper[29936]: I1205 12:50:04.905959 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7ftf\" (UniqueName: \"kubernetes.io/projected/a45f340c-0eca-4460-8961-4ca360467eeb-kube-api-access-r7ftf\") pod \"openshift-config-operator-68758cbcdb-tmzpj\" (UID: \"a45f340c-0eca-4460-8961-4ca360467eeb\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:50:04.906288 master-0 kubenswrapper[29936]: I1205 12:50:04.905985 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:50:04.906288 master-0 
kubenswrapper[29936]: I1205 12:50:04.906010 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2gd8\" (UniqueName: \"kubernetes.io/projected/1e6babfe-724a-4eab-bb3b-bc318bf57b70-kube-api-access-c2gd8\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:50:04.906288 master-0 kubenswrapper[29936]: I1205 12:50:04.906064 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95d8fb27-8b2b-4749-add3-9e9b16edb693-mcd-auth-proxy-config\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:50:04.906288 master-0 kubenswrapper[29936]: I1205 12:50:04.906088 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/365bf663-fd5b-44df-a327-0438995c015d-proxy-tls\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:50:04.906288 master-0 kubenswrapper[29936]: I1205 12:50:04.906114 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.906288 master-0 kubenswrapper[29936]: I1205 12:50:04.906137 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7d0792bf-e2da-4ee7-91fe-032299cea42f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:50:04.906288 master-0 kubenswrapper[29936]: I1205 12:50:04.906161 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-host\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:04.906288 master-0 kubenswrapper[29936]: I1205 12:50:04.906232 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a45f340c-0eca-4460-8961-4ca360467eeb-available-featuregates\") pod \"openshift-config-operator-68758cbcdb-tmzpj\" (UID: \"a45f340c-0eca-4460-8961-4ca360467eeb\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:50:04.906288 master-0 kubenswrapper[29936]: I1205 12:50:04.906259 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-env-overrides\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:50:04.906288 master-0 kubenswrapper[29936]: I1205 12:50:04.906284 29936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz4q6\" (UniqueName: \"kubernetes.io/projected/1478a21e-b6ac-46fb-ad01-805ac71f0a79-kube-api-access-fz4q6\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:50:04.906562 master-0 kubenswrapper[29936]: I1205 12:50:04.906315 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807d9093-aa67-4840-b5be-7f3abcc1beed-serving-cert\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:50:04.906562 master-0 kubenswrapper[29936]: I1205 12:50:04.906338 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwz29\" (UniqueName: \"kubernetes.io/projected/db2e54b6-4879-40f4-9359-a8b0c31e76c2-kube-api-access-nwz29\") pod \"catalog-operator-fbc6455c4-jmn7x\" (UID: \"db2e54b6-4879-40f4-9359-a8b0c31e76c2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:50:04.906562 master-0 kubenswrapper[29936]: I1205 12:50:04.906363 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb42t\" (UniqueName: \"kubernetes.io/projected/95d8fb27-8b2b-4749-add3-9e9b16edb693-kube-api-access-fb42t\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:50:04.906562 master-0 kubenswrapper[29936]: I1205 12:50:04.906386 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-os-release\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.906562 master-0 kubenswrapper[29936]: I1205 12:50:04.906407 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2acba71-b9dc-4b85-be35-c995b8be2f19-trusted-ca\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:50:04.906562 master-0 kubenswrapper[29936]: I1205 12:50:04.906431 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-os-release\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:04.906711 master-0 kubenswrapper[29936]: I1205 12:50:04.906583 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cni-binary-copy\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.906778 master-0 kubenswrapper[29936]: I1205 12:50:04.906753 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-operator-metrics\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:50:04.906909 master-0 kubenswrapper[29936]: I1205 12:50:04.906874 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a45f340c-0eca-4460-8961-4ca360467eeb-available-featuregates\") pod \"openshift-config-operator-68758cbcdb-tmzpj\" (UID: \"a45f340c-0eca-4460-8961-4ca360467eeb\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:50:04.906981 master-0 kubenswrapper[29936]: I1205 12:50:04.906959 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807d9093-aa67-4840-b5be-7f3abcc1beed-serving-cert\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:50:04.907853 master-0 kubenswrapper[29936]: I1205 12:50:04.907015 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-env-overrides\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:50:04.907853 master-0 kubenswrapper[29936]: I1205 12:50:04.907159 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ab1992e269496bc39c1df6084e6e60fd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:50:04.907853 master-0 kubenswrapper[29936]: I1205 12:50:04.907237 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2acba71-b9dc-4b85-be35-c995b8be2f19-trusted-ca\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:50:04.907853 master-0 kubenswrapper[29936]: I1205 12:50:04.907246 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bjs8\" (UniqueName: \"kubernetes.io/projected/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-api-access-4bjs8\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:50:04.907853 master-0 kubenswrapper[29936]: I1205 12:50:04.907281 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6wsq\" (UniqueName: \"kubernetes.io/projected/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f-kube-api-access-b6wsq\") pod \"cluster-samples-operator-797cfd8b47-6v84m\" (UID: \"7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" Dec 05 12:50:04.907853 master-0 kubenswrapper[29936]: I1205 12:50:04.907497 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ab1992e269496bc39c1df6084e6e60fd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:50:04.907853 master-0 kubenswrapper[29936]: I1205 12:50:04.907505 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d0792bf-e2da-4ee7-91fe-032299cea42f-kube-api-access\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:50:04.907853 master-0 kubenswrapper[29936]: I1205 12:50:04.907584 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp957\" (UniqueName: \"kubernetes.io/projected/60327040-f782-4cda-a32d-52a4f183073c-kube-api-access-zp957\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:04.907853 master-0 kubenswrapper[29936]: I1205 12:50:04.907614 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-systemd-units\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.907853 master-0 kubenswrapper[29936]: I1205 12:50:04.907641 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-env-overrides\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.907853 master-0 kubenswrapper[29936]: I1205 12:50:04.907685 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-daemon-config\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.907853 master-0 kubenswrapper[29936]: I1205 12:50:04.907714 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-bound-sa-token\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:50:04.907853 master-0 kubenswrapper[29936]: I1205 12:50:04.907742 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nml2g\" (UniqueName: \"kubernetes.io/projected/a2acba71-b9dc-4b85-be35-c995b8be2f19-kube-api-access-nml2g\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:50:04.907853 master-0 kubenswrapper[29936]: I1205 12:50:04.907773 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-whereabouts-configmap\") pod \"multus-additional-cni-plugins-prt97\" (UID: 
\"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:04.908348 master-0 kubenswrapper[29936]: I1205 12:50:04.907945 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-daemon-config\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.908348 master-0 kubenswrapper[29936]: I1205 12:50:04.907959 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-audit-dir\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:04.908348 master-0 kubenswrapper[29936]: I1205 12:50:04.908026 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48ns8\" (UniqueName: \"kubernetes.io/projected/480c1f6e-0e13-49f9-bc4e-07350842f16c-kube-api-access-48ns8\") pod \"migrator-74b7b57c65-fp4s6\" (UID: \"480c1f6e-0e13-49f9-bc4e-07350842f16c\") " pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6" Dec 05 12:50:04.908348 master-0 kubenswrapper[29936]: I1205 12:50:04.908052 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d0792bf-e2da-4ee7-91fe-032299cea42f-service-ca\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:50:04.908348 master-0 kubenswrapper[29936]: I1205 12:50:04.908078 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-bin\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.908348 master-0 kubenswrapper[29936]: I1205 12:50:04.908091 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-whereabouts-configmap\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:04.908348 master-0 kubenswrapper[29936]: I1205 12:50:04.908108 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5p5d\" (UniqueName: \"kubernetes.io/projected/5f0c6889-0739-48a3-99cd-6db9d1f83242-kube-api-access-p5p5d\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:50:04.908348 master-0 kubenswrapper[29936]: I1205 12:50:04.908134 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:50:04.908348 master-0 kubenswrapper[29936]: I1205 
12:50:04.908201 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/3b741029-0eb5-409b-b7f1-95e8385dc400-etc-docker\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:50:04.908348 master-0 kubenswrapper[29936]: I1205 12:50:04.908227 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-usr-local-bin\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:04.908348 master-0 kubenswrapper[29936]: I1205 12:50:04.908263 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-volume-directive-shadow\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:50:04.908348 master-0 kubenswrapper[29936]: I1205 12:50:04.908292 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss5kh\" (UniqueName: \"kubernetes.io/projected/3d96c85a-fc88-46af-83d5-6c71ec6e2c23-kube-api-access-ss5kh\") pod \"cluster-storage-operator-dcf7fc84b-vxmv7\" (UID: \"3d96c85a-fc88-46af-83d5-6c71ec6e2c23\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" Dec 05 12:50:04.908348 master-0 kubenswrapper[29936]: I1205 12:50:04.908312 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-env-overrides\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.908701 master-0 kubenswrapper[29936]: I1205 12:50:04.908317 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-var-lock\") pod \"installer-3-master-0\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:50:04.908701 master-0 kubenswrapper[29936]: I1205 12:50:04.908362 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/3b741029-0eb5-409b-b7f1-95e8385dc400-etc-docker\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:50:04.908701 master-0 kubenswrapper[29936]: I1205 12:50:04.908397 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-volume-directive-shadow\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:50:04.908701 master-0 kubenswrapper[29936]: I1205 12:50:04.908397 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-binary-copy\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:04.908701 master-0 kubenswrapper[29936]: I1205 12:50:04.908576 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc18a83a-998e-458e-87f0-d5368da52e1b-hosts-file\") pod \"node-resolver-f6j7m\" (UID: \"bc18a83a-998e-458e-87f0-d5368da52e1b\") " pod="openshift-dns/node-resolver-f6j7m" Dec 05 12:50:04.908701 master-0 kubenswrapper[29936]: I1205 12:50:04.908632 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-binary-copy\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:04.908701 master-0 kubenswrapper[29936]: I1205 12:50:04.908666 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc18a83a-998e-458e-87f0-d5368da52e1b-hosts-file\") pod \"node-resolver-f6j7m\" (UID: \"bc18a83a-998e-458e-87f0-d5368da52e1b\") " pod="openshift-dns/node-resolver-f6j7m" Dec 05 12:50:04.908985 master-0 kubenswrapper[29936]: I1205 12:50:04.908712 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-client-ca\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:50:04.908985 master-0 kubenswrapper[29936]: I1205 12:50:04.908734 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:50:04.908985 master-0 kubenswrapper[29936]: I1205 12:50:04.908774 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqvfm\" (UniqueName: \"kubernetes.io/projected/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-kube-api-access-nqvfm\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:50:04.908985 master-0 kubenswrapper[29936]: I1205 12:50:04.908798 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cnibin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.908985 master-0 kubenswrapper[29936]: I1205 12:50:04.908889 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x59kd\" (UniqueName: \"kubernetes.io/projected/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-kube-api-access-x59kd\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " 
pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.908985 master-0 kubenswrapper[29936]: I1205 12:50:04.908951 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ab1992e269496bc39c1df6084e6e60fd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:50:04.908985 master-0 kubenswrapper[29936]: I1205 12:50:04.908983 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:50:04.909212 master-0 kubenswrapper[29936]: I1205 12:50:04.909002 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:50:04.909212 master-0 kubenswrapper[29936]: I1205 12:50:04.909037 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"ab1992e269496bc39c1df6084e6e60fd\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:50:04.909212 master-0 kubenswrapper[29936]: I1205 12:50:04.909069 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovn-node-metrics-cert\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.909212 master-0 kubenswrapper[29936]: I1205 12:50:04.909128 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:50:04.909376 master-0 kubenswrapper[29936]: I1205 12:50:04.909232 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-encryption-config\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:04.909376 master-0 kubenswrapper[29936]: I1205 12:50:04.909245 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" 
Dec 05 12:50:04.909376 master-0 kubenswrapper[29936]: I1205 12:50:04.909252 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d72b2b71-27b2-4aff-bf69-7054a9556318-audit-dir\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:04.909466 master-0 kubenswrapper[29936]: I1205 12:50:04.909370 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovn-node-metrics-cert\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.909466 master-0 kubenswrapper[29936]: I1205 12:50:04.909389 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95d8fb27-8b2b-4749-add3-9e9b16edb693-proxy-tls\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:50:04.909466 master-0 kubenswrapper[29936]: I1205 12:50:04.909411 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-kubelet\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.909466 master-0 kubenswrapper[29936]: I1205 12:50:04.909434 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3792522-fec6-4022-90ac-0b8467fcd625-config\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:50:04.909579 master-0 kubenswrapper[29936]: I1205 12:50:04.909479 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-serving-cert\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:04.909579 master-0 kubenswrapper[29936]: I1205 12:50:04.909512 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-run\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:04.909579 master-0 kubenswrapper[29936]: I1205 12:50:04.909539 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-k8s-cni-cncf-io\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.909579 master-0 kubenswrapper[29936]: I1205 12:50:04.909569 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-audit-policies\") 
pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:04.909698 master-0 kubenswrapper[29936]: I1205 12:50:04.909593 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjp62\" (UniqueName: \"kubernetes.io/projected/d72b2b71-27b2-4aff-bf69-7054a9556318-kube-api-access-wjp62\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:04.909698 master-0 kubenswrapper[29936]: I1205 12:50:04.909639 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3792522-fec6-4022-90ac-0b8467fcd625-config\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:50:04.909754 master-0 kubenswrapper[29936]: I1205 12:50:04.909717 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/153fec1f-a10b-4c6c-a997-60fa80c13a86-etc-docker\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:50:04.909754 master-0 kubenswrapper[29936]: I1205 12:50:04.909747 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjkjz\" (UniqueName: \"kubernetes.io/projected/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-kube-api-access-tjkjz\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:50:04.909815 master-0 kubenswrapper[29936]: I1205 12:50:04.909769 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-netd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.909815 master-0 kubenswrapper[29936]: I1205 12:50:04.909793 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-trusted-ca-bundle\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:04.909869 master-0 kubenswrapper[29936]: I1205 12:50:04.909817 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-ovnkube-identity-cm\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:50:04.909869 master-0 kubenswrapper[29936]: I1205 12:50:04.909832 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/153fec1f-a10b-4c6c-a997-60fa80c13a86-etc-docker\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:50:04.909869 master-0 kubenswrapper[29936]: I1205 12:50:04.909840 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-script-lib\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.909955 master-0 kubenswrapper[29936]: I1205 12:50:04.909901 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph9w6\" (UniqueName: \"kubernetes.io/projected/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-kube-api-access-ph9w6\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:50:04.910213 master-0 kubenswrapper[29936]: I1205 12:50:04.910042 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/b8233dad-bd19-4842-a4d5-cfa84f1feb83-ovnkube-identity-cm\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:50:04.910213 master-0 kubenswrapper[29936]: I1205 12:50:04.910080 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-var-lib-kubelet\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:04.910213 master-0 kubenswrapper[29936]: I1205 12:50:04.910101 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/dbe144b5-3b78-4946-bbf9-b825b0e47b07-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:50:04.910213 master-0 kubenswrapper[29936]: I1205 12:50:04.910120 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/594aaded-5615-4bed-87ee-6173059a73be-config\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:50:04.910213 master-0 kubenswrapper[29936]: I1205 12:50:04.910161 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/dbe144b5-3b78-4946-bbf9-b825b0e47b07-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:50:04.910213 master-0 kubenswrapper[29936]: I1205 12:50:04.910197 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-tls\") pod 
\"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:50:04.910403 master-0 kubenswrapper[29936]: I1205 12:50:04.910223 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-script-lib\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.910403 master-0 kubenswrapper[29936]: I1205 12:50:04.910233 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-bound-sa-token\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:50:04.910403 master-0 kubenswrapper[29936]: I1205 12:50:04.910252 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:04.910403 master-0 kubenswrapper[29936]: I1205 12:50:04.910327 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/594aaded-5615-4bed-87ee-6173059a73be-config\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:50:04.910403 master-0 kubenswrapper[29936]: I1205 12:50:04.910346 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vx2z\" (UniqueName: \"kubernetes.io/projected/708bf629-9949-4b79-a88a-c73ba033475b-kube-api-access-6vx2z\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:04.910403 master-0 kubenswrapper[29936]: I1205 12:50:04.910393 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/708bf629-9949-4b79-a88a-c73ba033475b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:04.910582 master-0 kubenswrapper[29936]: I1205 12:50:04.910413 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-image-import-ca\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:04.910582 master-0 kubenswrapper[29936]: I1205 12:50:04.910441 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-auth-proxy-config\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " 
pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:50:04.910582 master-0 kubenswrapper[29936]: I1205 12:50:04.910471 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/531b8927-92db-4e9d-9a0a-12ff948cdaad-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-7df95c79b5-ldg5j\" (UID: \"531b8927-92db-4e9d-9a0a-12ff948cdaad\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" Dec 05 12:50:04.910582 master-0 kubenswrapper[29936]: I1205 12:50:04.910465 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-trusted-ca-bundle\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:04.910582 master-0 kubenswrapper[29936]: I1205 12:50:04.910550 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-trusted-ca\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:50:04.910730 master-0 kubenswrapper[29936]: I1205 12:50:04.910621 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/38941513-e968-45f1-9cb2-b63d40338f36-image-registry-operator-tls\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:50:04.910730 master-0 kubenswrapper[29936]: I1205 12:50:04.910630 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-node-log\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.910730 master-0 kubenswrapper[29936]: I1205 12:50:04.910659 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysctl-d\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:04.910730 master-0 kubenswrapper[29936]: I1205 12:50:04.910679 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:50:04.910730 master-0 kubenswrapper[29936]: I1205 12:50:04.910706 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:04.910730 master-0 kubenswrapper[29936]: I1205 12:50:04.910726 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-bin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.910888 master-0 kubenswrapper[29936]: I1205 12:50:04.910807 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:50:04.910888 master-0 kubenswrapper[29936]: I1205 12:50:04.910876 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:04.910946 master-0 kubenswrapper[29936]: I1205 12:50:04.910884 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e6babfe-724a-4eab-bb3b-bc318bf57b70-marketplace-trusted-ca\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:50:04.910946 master-0 kubenswrapper[29936]: I1205 12:50:04.910900 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-kubelet\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.910946 master-0 kubenswrapper[29936]: I1205 12:50:04.910928 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:04.911032 master-0 kubenswrapper[29936]: I1205 12:50:04.910958 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:50:04.911032 master-0 kubenswrapper[29936]: I1205 12:50:04.910992 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 
05 12:50:04.911032 master-0 kubenswrapper[29936]: I1205 12:50:04.910986 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8defe125-1529-4091-adff-e9d17a2b298f-catalog-content\") pod \"certified-operators-4p8p6\" (UID: \"8defe125-1529-4091-adff-e9d17a2b298f\") " pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:50:04.911114 master-0 kubenswrapper[29936]: I1205 12:50:04.911044 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjbwh\" (UniqueName: \"kubernetes.io/projected/d3e283fe-a474-4f83-ad66-62971945060a-kube-api-access-pjbwh\") pod \"multus-admission-controller-8dbbb5754-j7x5j\" (UID: \"d3e283fe-a474-4f83-ad66-62971945060a\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" Dec 05 12:50:04.911114 master-0 kubenswrapper[29936]: I1205 12:50:04.911073 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-certs\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:50:04.911114 master-0 kubenswrapper[29936]: I1205 12:50:04.911081 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8defe125-1529-4091-adff-e9d17a2b298f-catalog-content\") pod \"certified-operators-4p8p6\" (UID: \"8defe125-1529-4091-adff-e9d17a2b298f\") " pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:50:04.911114 master-0 kubenswrapper[29936]: I1205 12:50:04.911090 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-auth-proxy-config\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:50:04.911114 master-0 kubenswrapper[29936]: I1205 12:50:04.911107 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-var-lib-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.911287 master-0 kubenswrapper[29936]: I1205 12:50:04.911136 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/5efad170-c154-42ec-a7c0-b36a98d2bfcc-host-etc-kube\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:50:04.911287 master-0 kubenswrapper[29936]: I1205 12:50:04.911149 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2acba71-b9dc-4b85-be35-c995b8be2f19-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:50:04.911287 master-0 kubenswrapper[29936]: I1205 12:50:04.911156 29936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b8233dad-bd19-4842-a4d5-cfa84f1feb83-webhook-cert\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:50:04.911287 master-0 kubenswrapper[29936]: I1205 12:50:04.911234 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/5efad170-c154-42ec-a7c0-b36a98d2bfcc-host-etc-kube\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:50:04.911287 master-0 kubenswrapper[29936]: I1205 12:50:04.911266 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-config\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:04.911287 master-0 kubenswrapper[29936]: I1205 12:50:04.911282 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2635f9f-219b-4d03-b5b3-496c0c836fae-tmp\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:04.911447 master-0 kubenswrapper[29936]: I1205 12:50:04.911299 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-hostroot\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.911447 master-0 kubenswrapper[29936]: I1205 12:50:04.911322 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-cnibin\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:04.911447 master-0 kubenswrapper[29936]: I1205 12:50:04.911342 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmq98\" (UniqueName: \"kubernetes.io/projected/4492c55f-701b-4ec8-ada1-0a5dc126d405-kube-api-access-dmq98\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.911447 master-0 kubenswrapper[29936]: I1205 12:50:04.911358 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:04.911447 master-0 kubenswrapper[29936]: I1205 12:50:04.911374 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " 
pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:50:04.911447 master-0 kubenswrapper[29936]: I1205 12:50:04.911389 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-multus\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.911447 master-0 kubenswrapper[29936]: I1205 12:50:04.911406 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-trusted-ca\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:50:04.911447 master-0 kubenswrapper[29936]: I1205 12:50:04.911422 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-trusted-ca-bundle\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:04.911447 master-0 kubenswrapper[29936]: I1205 12:50:04.911445 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh58c\" (UniqueName: \"kubernetes.io/projected/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-kube-api-access-dh58c\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:04.911447 master-0 kubenswrapper[29936]: I1205 12:50:04.911452 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b8233dad-bd19-4842-a4d5-cfa84f1feb83-webhook-cert\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:50:04.911708 master-0 kubenswrapper[29936]: I1205 12:50:04.911471 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-lib-modules\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:04.911708 master-0 kubenswrapper[29936]: I1205 12:50:04.911497 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqjgb\" (UniqueName: \"kubernetes.io/projected/365bf663-fd5b-44df-a327-0438995c015d-kube-api-access-lqjgb\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:50:04.911708 master-0 kubenswrapper[29936]: I1205 12:50:04.911523 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-config\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:50:04.911708 master-0 kubenswrapper[29936]: I1205 12:50:04.911539 29936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-socket-dir-parent\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:04.911708 master-0 kubenswrapper[29936]: I1205 12:50:04.911556 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-audit\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:04.911708 master-0 kubenswrapper[29936]: I1205 12:50:04.911578 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-metrics-client-ca\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:50:04.911708 master-0 kubenswrapper[29936]: I1205 12:50:04.911601 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-machine-approver-tls\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:50:04.911708 master-0 kubenswrapper[29936]: I1205 12:50:04.911619 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtjln\" (UniqueName: \"kubernetes.io/projected/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-kube-api-access-xtjln\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:50:04.911708 master-0 kubenswrapper[29936]: I1205 12:50:04.911635 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5hdg\" (UniqueName: \"kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-kube-api-access-t5hdg\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:50:04.911953 master-0 kubenswrapper[29936]: I1205 12:50:04.911821 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2635f9f-219b-4d03-b5b3-496c0c836fae-tmp\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:04.912094 master-0 kubenswrapper[29936]: I1205 12:50:04.912071 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a906debd0c35952850935aee2d607cce\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:04.912138 master-0 kubenswrapper[29936]: I1205 12:50:04.912109 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:50:04.912625 master-0 kubenswrapper[29936]: I1205 12:50:04.912200 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-node-bootstrap-token\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:50:04.912625 master-0 kubenswrapper[29936]: I1205 12:50:04.912279 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-etcd-serving-ca\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:04.912625 master-0 kubenswrapper[29936]: I1205 12:50:04.912417 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26x2z\" (UniqueName: \"kubernetes.io/projected/49760d62-02e5-4882-b47f-663102b04946-kube-api-access-26x2z\") pod \"csi-snapshot-controller-operator-6bc8656fdc-zn7hv\" (UID: \"49760d62-02e5-4882-b47f-663102b04946\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv" Dec 05 12:50:04.912625 master-0 kubenswrapper[29936]: I1205 12:50:04.912445 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-trusted-ca\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:50:04.912625 master-0 kubenswrapper[29936]: I1205 12:50:04.912486 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-textfile\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:04.912625 master-0 kubenswrapper[29936]: I1205 12:50:04.912524 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3792522-fec6-4022-90ac-0b8467fcd625-serving-cert\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:50:04.912625 master-0 kubenswrapper[29936]: I1205 12:50:04.912559 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/807d9093-aa67-4840-b5be-7f3abcc1beed-kube-api-access\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:50:04.912625 master-0 kubenswrapper[29936]: I1205 12:50:04.912593 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-textfile\") pod 
\"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:04.912874 master-0 kubenswrapper[29936]: I1205 12:50:04.912660 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/db2e54b6-4879-40f4-9359-a8b0c31e76c2-srv-cert\") pod \"catalog-operator-fbc6455c4-jmn7x\" (UID: \"db2e54b6-4879-40f4-9359-a8b0c31e76c2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:50:04.912874 master-0 kubenswrapper[29936]: I1205 12:50:04.912755 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebfbe878-1796-4a20-b3f0-76165038252e-catalog-content\") pod \"redhat-marketplace-dmnvq\" (UID: \"ebfbe878-1796-4a20-b3f0-76165038252e\") " pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:50:04.912874 master-0 kubenswrapper[29936]: I1205 12:50:04.912792 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfknz\" (UniqueName: \"kubernetes.io/projected/e943438b-1de8-435c-8a19-accd6a6292a4-kube-api-access-lfknz\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:50:04.912874 master-0 kubenswrapper[29936]: I1205 12:50:04.912806 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3792522-fec6-4022-90ac-0b8467fcd625-serving-cert\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:50:04.912874 master-0 kubenswrapper[29936]: I1205 12:50:04.912820 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebfbe878-1796-4a20-b3f0-76165038252e-catalog-content\") pod \"redhat-marketplace-dmnvq\" (UID: \"ebfbe878-1796-4a20-b3f0-76165038252e\") " pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:50:04.912874 master-0 kubenswrapper[29936]: I1205 12:50:04.912820 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:04.912874 master-0 kubenswrapper[29936]: I1205 12:50:04.912858 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:50:04.912874 master-0 kubenswrapper[29936]: I1205 12:50:04.912878 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-metrics-client-ca\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " 
pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:50:04.913093 master-0 kubenswrapper[29936]: I1205 12:50:04.912897 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99996137-2621-458b-980d-584b3640d4ad-metrics-client-ca\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:50:04.913093 master-0 kubenswrapper[29936]: I1205 12:50:04.912955 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nbxt\" (UniqueName: \"kubernetes.io/projected/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-kube-api-access-2nbxt\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:50:04.913093 master-0 kubenswrapper[29936]: I1205 12:50:04.912977 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysconfig\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:04.913093 master-0 kubenswrapper[29936]: I1205 12:50:04.913034 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-slash\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.913222 master-0 kubenswrapper[29936]: I1205 12:50:04.913105 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-etc-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.913222 master-0 kubenswrapper[29936]: I1205 12:50:04.913141 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-config\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.913222 master-0 kubenswrapper[29936]: I1205 12:50:04.913216 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-cert-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:04.913300 master-0 kubenswrapper[29936]: I1205 12:50:04.913237 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-data-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:04.913300 master-0 kubenswrapper[29936]: I1205 12:50:04.913260 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtvzs\" (UniqueName: 
\"kubernetes.io/projected/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-kube-api-access-dtvzs\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:50:04.913300 master-0 kubenswrapper[29936]: I1205 12:50:04.913279 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-etcd-serving-ca\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:04.913384 master-0 kubenswrapper[29936]: I1205 12:50:04.913300 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-serving-cert\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:50:04.913384 master-0 kubenswrapper[29936]: I1205 12:50:04.913339 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4492c55f-701b-4ec8-ada1-0a5dc126d405-ovnkube-config\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.913384 master-0 kubenswrapper[29936]: I1205 12:50:04.913378 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:50:04.913467 master-0 kubenswrapper[29936]: I1205 12:50:04.913410 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-serving-cert\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:50:04.913467 master-0 kubenswrapper[29936]: I1205 12:50:04.913435 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c69rc\" (UniqueName: \"kubernetes.io/projected/99996137-2621-458b-980d-584b3640d4ad-kube-api-access-c69rc\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:50:04.913467 master-0 kubenswrapper[29936]: I1205 12:50:04.913457 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e943438b-1de8-435c-8a19-accd6a6292a4-serving-cert\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:50:04.913559 master-0 kubenswrapper[29936]: I1205 12:50:04.913491 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: 
\"kubernetes.io/empty-dir/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-operand-assets\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:50:04.913559 master-0 kubenswrapper[29936]: I1205 12:50:04.913555 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-operand-assets\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:50:04.913621 master-0 kubenswrapper[29936]: I1205 12:50:04.913567 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxxw7\" (UniqueName: \"kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-kube-api-access-fxxw7\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:50:04.913621 master-0 kubenswrapper[29936]: I1205 12:50:04.913595 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-serving-cert\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:50:04.913621 master-0 kubenswrapper[29936]: I1205 12:50:04.913605 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-host-slash\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:50:04.913621 master-0 kubenswrapper[29936]: I1205 12:50:04.913597 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-serving-cert\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:50:04.913727 master-0 kubenswrapper[29936]: I1205 12:50:04.913646 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-images\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:50:04.913727 master-0 kubenswrapper[29936]: I1205 12:50:04.913668 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-host-slash\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:50:04.913727 master-0 kubenswrapper[29936]: I1205 12:50:04.913672 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-systemd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.913817 master-0 kubenswrapper[29936]: I1205 12:50:04.913760 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-log-socket\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:04.913817 master-0 kubenswrapper[29936]: I1205 12:50:04.913796 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpxqg\" (UniqueName: \"kubernetes.io/projected/8defe125-1529-4091-adff-e9d17a2b298f-kube-api-access-jpxqg\") pod \"certified-operators-4p8p6\" (UID: \"8defe125-1529-4091-adff-e9d17a2b298f\") " pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:50:04.913877 master-0 kubenswrapper[29936]: I1205 12:50:04.913824 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvbfq\" (UniqueName: \"kubernetes.io/projected/b8233dad-bd19-4842-a4d5-cfa84f1feb83-kube-api-access-mvbfq\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:50:04.913877 master-0 kubenswrapper[29936]: I1205 12:50:04.913851 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flxbg\" (UniqueName: \"kubernetes.io/projected/f3792522-fec6-4022-90ac-0b8467fcd625-kube-api-access-flxbg\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:50:04.914082 master-0 kubenswrapper[29936]: I1205 12:50:04.914035 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 12:50:04.915406 master-0 kubenswrapper[29936]: I1205 12:50:04.915364 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-etcd-client\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:04.935808 master-0 kubenswrapper[29936]: I1205 12:50:04.935762 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 12:50:04.954748 master-0 kubenswrapper[29936]: I1205 12:50:04.954601 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 12:50:04.966303 master-0 kubenswrapper[29936]: I1205 12:50:04.964190 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-serving-cert\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:04.976463 master-0 kubenswrapper[29936]: I1205 12:50:04.974507 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 12:50:04.984675 master-0 kubenswrapper[29936]: I1205 12:50:04.984563 29936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-encryption-config\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:04.997811 master-0 kubenswrapper[29936]: I1205 12:50:04.997576 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 12:50:05.018219 master-0 kubenswrapper[29936]: I1205 12:50:05.018122 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cnibin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.018551 master-0 kubenswrapper[29936]: I1205 12:50:05.018234 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:50:05.018551 master-0 kubenswrapper[29936]: I1205 12:50:05.018290 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-kubelet\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.018551 master-0 kubenswrapper[29936]: I1205 12:50:05.018318 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d72b2b71-27b2-4aff-bf69-7054a9556318-audit-dir\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:05.018551 master-0 kubenswrapper[29936]: I1205 12:50:05.018378 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-run\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.018551 master-0 kubenswrapper[29936]: I1205 12:50:05.018412 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-k8s-cni-cncf-io\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.018551 master-0 kubenswrapper[29936]: I1205 12:50:05.018484 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:50:05.018998 master-0 kubenswrapper[29936]: I1205 12:50:05.018588 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-run\") pod \"tuned-dcvtr\" (UID: 
\"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.018998 master-0 kubenswrapper[29936]: I1205 12:50:05.018509 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-kubelet\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.019078 master-0 kubenswrapper[29936]: I1205 12:50:05.018999 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-k8s-cni-cncf-io\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.019117 master-0 kubenswrapper[29936]: I1205 12:50:05.019086 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-netd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.019166 master-0 kubenswrapper[29936]: I1205 12:50:05.019149 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-var-lib-kubelet\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.019383 master-0 kubenswrapper[29936]: I1205 12:50:05.019339 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 12:50:05.019383 master-0 kubenswrapper[29936]: I1205 12:50:05.019365 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d72b2b71-27b2-4aff-bf69-7054a9556318-audit-dir\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:05.019383 master-0 kubenswrapper[29936]: I1205 12:50:05.019377 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-node-log\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.019514 master-0 kubenswrapper[29936]: I1205 12:50:05.019299 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-node-log\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.019514 master-0 kubenswrapper[29936]: I1205 12:50:05.019448 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-var-lib-kubelet\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.019514 master-0 kubenswrapper[29936]: I1205 12:50:05.019477 29936 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysctl-d\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.019514 master-0 kubenswrapper[29936]: I1205 12:50:05.019490 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-cnibin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.019514 master-0 kubenswrapper[29936]: I1205 12:50:05.019503 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:50:05.019705 master-0 kubenswrapper[29936]: I1205 12:50:05.019562 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-bin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.019705 master-0 kubenswrapper[29936]: I1205 12:50:05.019592 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-kubelet\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.019705 master-0 kubenswrapper[29936]: I1205 12:50:05.019607 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3169f44496ed8a28c6d6a15511ab0eec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"3169f44496ed8a28c6d6a15511ab0eec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 05 12:50:05.019705 master-0 kubenswrapper[29936]: I1205 12:50:05.019629 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysctl-d\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.019705 master-0 kubenswrapper[29936]: I1205 12:50:05.019567 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-netd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.019705 master-0 kubenswrapper[29936]: I1205 12:50:05.019667 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-bin\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.019705 master-0 kubenswrapper[29936]: I1205 12:50:05.019673 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-kubelet\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.019965 master-0 kubenswrapper[29936]: I1205 12:50:05.019726 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-var-lib-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.019965 master-0 kubenswrapper[29936]: I1205 12:50:05.019789 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:50:05.019965 master-0 kubenswrapper[29936]: I1205 12:50:05.019813 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-hostroot\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.019965 master-0 kubenswrapper[29936]: I1205 12:50:05.019841 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-cnibin\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:05.019965 master-0 kubenswrapper[29936]: I1205 12:50:05.019886 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-lib-modules\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.019965 master-0 kubenswrapper[29936]: I1205 12:50:05.019934 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-multus\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.020293 master-0 kubenswrapper[29936]: I1205 12:50:05.019990 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-socket-dir-parent\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.020293 master-0 kubenswrapper[29936]: I1205 12:50:05.020030 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:50:05.020366 master-0 kubenswrapper[29936]: I1205 12:50:05.020324 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-cert-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:05.020366 master-0 kubenswrapper[29936]: I1205 12:50:05.020351 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-data-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:05.020366 master-0 kubenswrapper[29936]: I1205 12:50:05.020374 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysconfig\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.020521 master-0 kubenswrapper[29936]: I1205 12:50:05.020400 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-slash\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.020521 master-0 kubenswrapper[29936]: I1205 12:50:05.020424 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-etc-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.020521 master-0 kubenswrapper[29936]: I1205 12:50:05.020453 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:50:05.020641 master-0 kubenswrapper[29936]: I1205 12:50:05.020545 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-systemd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.020641 master-0 kubenswrapper[29936]: I1205 12:50:05.020572 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-log-socket\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.020712 master-0 kubenswrapper[29936]: I1205 12:50:05.020645 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-netns\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.020712 master-0 kubenswrapper[29936]: I1205 12:50:05.020693 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.020809 master-0 kubenswrapper[29936]: I1205 12:50:05.020720 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-root\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:05.020809 master-0 kubenswrapper[29936]: I1205 12:50:05.020767 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-static-pod-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:05.020809 master-0 kubenswrapper[29936]: I1205 12:50:05.020798 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-sys\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:05.020928 master-0 kubenswrapper[29936]: I1205 12:50:05.020837 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-system-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.021059 master-0 kubenswrapper[29936]: I1205 12:50:05.020939 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-kubernetes\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.021116 master-0 kubenswrapper[29936]: I1205 12:50:05.021072 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:50:05.021116 master-0 kubenswrapper[29936]: I1205 12:50:05.021103 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-slash\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.021217 master-0 kubenswrapper[29936]: I1205 12:50:05.021109 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-ovn\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.021217 master-0 kubenswrapper[29936]: I1205 12:50:05.021154 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-etc-openvswitch\") pod \"ovnkube-node-9vqtb\" 
(UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.021303 master-0 kubenswrapper[29936]: I1205 12:50:05.021216 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:50:05.021303 master-0 kubenswrapper[29936]: I1205 12:50:05.021218 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-systemd\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.021303 master-0 kubenswrapper[29936]: I1205 12:50:05.021250 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-systemd\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.021303 master-0 kubenswrapper[29936]: I1205 12:50:05.021274 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/95d8fb27-8b2b-4749-add3-9e9b16edb693-rootfs\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:50:05.021559 master-0 kubenswrapper[29936]: I1205 12:50:05.021341 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:50:05.021559 master-0 kubenswrapper[29936]: I1205 12:50:05.021385 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-hostroot\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.021559 master-0 kubenswrapper[29936]: I1205 12:50:05.021426 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-cnibin\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:05.021671 master-0 kubenswrapper[29936]: I1205 12:50:05.021595 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-lib-modules\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.021671 master-0 kubenswrapper[29936]: I1205 12:50:05.021637 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-var-lib-cni-multus\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " 
pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.021747 master-0 kubenswrapper[29936]: I1205 12:50:05.021690 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-socket-dir-parent\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.021747 master-0 kubenswrapper[29936]: I1205 12:50:05.021728 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:50:05.021816 master-0 kubenswrapper[29936]: I1205 12:50:05.021769 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-cert-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:05.021816 master-0 kubenswrapper[29936]: I1205 12:50:05.021804 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-data-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:05.021893 master-0 kubenswrapper[29936]: I1205 12:50:05.021850 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysconfig\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.021933 master-0 kubenswrapper[29936]: I1205 12:50:05.021891 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-systemd\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.021970 master-0 kubenswrapper[29936]: I1205 12:50:05.021938 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-kubernetes\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.022013 master-0 kubenswrapper[29936]: I1205 12:50:05.021976 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-netns\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.022057 master-0 kubenswrapper[29936]: I1205 12:50:05.022036 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.022095 master-0 kubenswrapper[29936]: I1205 12:50:05.022072 29936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-root\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:05.022169 master-0 kubenswrapper[29936]: I1205 12:50:05.022111 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-static-pod-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:05.022222 master-0 kubenswrapper[29936]: I1205 12:50:05.022196 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-system-cni-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.022301 master-0 kubenswrapper[29936]: I1205 12:50:05.020985 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-image-import-ca\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:05.022301 master-0 kubenswrapper[29936]: I1205 12:50:05.022231 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-system-cni-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:05.022301 master-0 kubenswrapper[29936]: I1205 12:50:05.022250 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-var-lib-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.022301 master-0 kubenswrapper[29936]: I1205 12:50:05.022197 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-sys\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:05.022301 master-0 kubenswrapper[29936]: I1205 12:50:05.022297 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-sys\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.022563 master-0 kubenswrapper[29936]: I1205 12:50:05.022327 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:05.022563 master-0 kubenswrapper[29936]: I1205 12:50:05.022381 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-sys\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.022563 master-0 kubenswrapper[29936]: I1205 12:50:05.022385 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:05.022563 master-0 kubenswrapper[29936]: I1205 12:50:05.022484 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/95d8fb27-8b2b-4749-add3-9e9b16edb693-rootfs\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:50:05.022563 master-0 kubenswrapper[29936]: I1205 12:50:05.022523 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:50:05.022738 master-0 kubenswrapper[29936]: I1205 12:50:05.022569 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-ovn\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.022738 master-0 kubenswrapper[29936]: I1205 12:50:05.022603 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-system-cni-dir\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:05.022850 master-0 kubenswrapper[29936]: I1205 12:50:05.022792 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-resource-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:05.022850 master-0 kubenswrapper[29936]: I1205 12:50:05.022831 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-etc-kubernetes\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.022930 master-0 kubenswrapper[29936]: I1205 12:50:05.022863 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-modprobe-d\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.022930 master-0 kubenswrapper[29936]: I1205 12:50:05.022910 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-netns\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.023003 master-0 kubenswrapper[29936]: I1205 12:50:05.022934 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:50:05.023003 master-0 kubenswrapper[29936]: I1205 12:50:05.022933 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-resource-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:05.023070 master-0 kubenswrapper[29936]: I1205 12:50:05.023001 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-etc-kubernetes\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.023219 master-0 kubenswrapper[29936]: I1205 12:50:05.023119 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.023283 master-0 kubenswrapper[29936]: I1205 12:50:05.023239 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7d0792bf-e2da-4ee7-91fe-032299cea42f-etc-ssl-certs\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:50:05.023283 master-0 kubenswrapper[29936]: I1205 12:50:05.023272 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-modprobe-d\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.023367 master-0 kubenswrapper[29936]: I1205 12:50:05.023290 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-log-socket\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.023367 master-0 kubenswrapper[29936]: I1205 12:50:05.023314 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7d0792bf-e2da-4ee7-91fe-032299cea42f-etc-ssl-certs\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:50:05.023367 master-0 kubenswrapper[29936]: I1205 12:50:05.023339 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.023367 master-0 kubenswrapper[29936]: I1205 12:50:05.023341 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-multus-certs\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.023367 master-0 kubenswrapper[29936]: I1205 12:50:05.023364 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-multus-certs\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.028367 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.028456 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-host-run-netns\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.028506 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.028604 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-wtmp\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.028655 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-conf-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.028691 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-log-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.028781 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-node-pullsecrets\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.028835 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysctl-conf\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.028866 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7d0792bf-e2da-4ee7-91fe-032299cea42f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.028919 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.028950 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-host\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029011 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-os-release\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029038 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-os-release\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029118 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-systemd-units\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029233 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-audit-dir\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029296 
29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-bin\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029333 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-usr-local-bin\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029367 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029439 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-var-lock\") pod \"installer-3-master-0\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029555 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-var-lock\") pod \"installer-3-master-0\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029599 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-run-ovn-kubernetes\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029661 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-wtmp\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029691 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-multus-conf-dir\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029719 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-log-dir\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029762 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-node-pullsecrets\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029916 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-etc-sysctl-conf\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029947 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7d0792bf-e2da-4ee7-91fe-032299cea42f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.029976 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-run-openvswitch\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.030012 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2635f9f-219b-4d03-b5b3-496c0c836fae-host\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.030070 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-os-release\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.030113 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/708bf629-9949-4b79-a88a-c73ba033475b-os-release\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.030142 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-systemd-units\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.030198 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-audit-dir\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.030234 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4492c55f-701b-4ec8-ada1-0a5dc126d405-host-cni-bin\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.030267 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/58d12e893528ad53a994f10901a644ea-usr-local-bin\") pod \"etcd-master-0\" (UID: \"58d12e893528ad53a994f10901a644ea\") " pod="openshift-etcd/etcd-master-0" Dec 05 12:50:05.033831 master-0 kubenswrapper[29936]: I1205 12:50:05.030299 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:50:05.037939 master-0 kubenswrapper[29936]: I1205 12:50:05.035894 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 12:50:05.042023 master-0 kubenswrapper[29936]: I1205 12:50:05.041967 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-config\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:05.056539 master-0 kubenswrapper[29936]: I1205 12:50:05.056300 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 12:50:05.060014 master-0 kubenswrapper[29936]: I1205 12:50:05.059957 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-audit-policies\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:05.076670 master-0 kubenswrapper[29936]: I1205 12:50:05.076269 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 12:50:05.079952 master-0 kubenswrapper[29936]: I1205 12:50:05.079902 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-encryption-config\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:05.096145 master-0 kubenswrapper[29936]: I1205 12:50:05.096084 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 12:50:05.104611 master-0 kubenswrapper[29936]: I1205 12:50:05.103656 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-etcd-serving-ca\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:05.118450 master-0 kubenswrapper[29936]: I1205 12:50:05.118387 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 
12:50:05.123436 master-0 kubenswrapper[29936]: I1205 12:50:05.123390 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-audit\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:05.136998 master-0 kubenswrapper[29936]: I1205 12:50:05.136935 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 12:50:05.145967 master-0 kubenswrapper[29936]: I1205 12:50:05.145929 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-etcd-client\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:05.155073 master-0 kubenswrapper[29936]: I1205 12:50:05.155005 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 12:50:05.175447 master-0 kubenswrapper[29936]: I1205 12:50:05.175392 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 12:50:05.182772 master-0 kubenswrapper[29936]: I1205 12:50:05.182713 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-etcd-serving-ca\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:05.195861 master-0 kubenswrapper[29936]: I1205 12:50:05.194819 29936 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Dec 05 12:50:05.196097 master-0 kubenswrapper[29936]: I1205 12:50:05.195974 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 12:50:05.216006 master-0 kubenswrapper[29936]: I1205 12:50:05.215364 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 12:50:05.225294 master-0 kubenswrapper[29936]: I1205 12:50:05.225194 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d72b2b71-27b2-4aff-bf69-7054a9556318-serving-cert\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:05.235163 master-0 kubenswrapper[29936]: I1205 12:50:05.235105 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 12:50:05.243422 master-0 kubenswrapper[29936]: I1205 12:50:05.243310 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d72b2b71-27b2-4aff-bf69-7054a9556318-trusted-ca-bundle\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:05.255106 master-0 kubenswrapper[29936]: I1205 12:50:05.255043 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 
12:50:05.256403 master-0 kubenswrapper[29936]: I1205 12:50:05.256363 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cf8247a1-703a-46b3-9a33-25a73b27ab99-signing-key\") pod \"service-ca-77c99c46b8-44qrw\" (UID: \"cf8247a1-703a-46b3-9a33-25a73b27ab99\") " pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:50:05.276805 master-0 kubenswrapper[29936]: I1205 12:50:05.276736 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 12:50:05.295534 master-0 kubenswrapper[29936]: I1205 12:50:05.295459 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 12:50:05.320033 master-0 kubenswrapper[29936]: I1205 12:50:05.319966 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 12:50:05.332049 master-0 kubenswrapper[29936]: I1205 12:50:05.331974 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cf8247a1-703a-46b3-9a33-25a73b27ab99-signing-cabundle\") pod \"service-ca-77c99c46b8-44qrw\" (UID: \"cf8247a1-703a-46b3-9a33-25a73b27ab99\") " pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:50:05.335056 master-0 kubenswrapper[29936]: I1205 12:50:05.334999 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Dec 05 12:50:05.339929 master-0 kubenswrapper[29936]: I1205 12:50:05.339859 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a14df948-1ec4-4785-ad33-28d1e7063959-serving-cert\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:50:05.355521 master-0 kubenswrapper[29936]: I1205 12:50:05.355472 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-8mxmd" Dec 05 12:50:05.376193 master-0 kubenswrapper[29936]: I1205 12:50:05.376093 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Dec 05 12:50:05.395029 master-0 kubenswrapper[29936]: I1205 12:50:05.394960 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Dec 05 12:50:05.418979 master-0 kubenswrapper[29936]: I1205 12:50:05.418467 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Dec 05 12:50:05.426955 master-0 kubenswrapper[29936]: I1205 12:50:05.426887 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14df948-1ec4-4785-ad33-28d1e7063959-trusted-ca-bundle\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:50:05.445952 master-0 kubenswrapper[29936]: I1205 12:50:05.445643 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Dec 05 12:50:05.448408 master-0 kubenswrapper[29936]: I1205 12:50:05.448355 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a14df948-1ec4-4785-ad33-28d1e7063959-service-ca-bundle\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:50:05.468165 master-0 kubenswrapper[29936]: I1205 12:50:05.465708 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 12:50:05.474968 master-0 kubenswrapper[29936]: I1205 12:50:05.474870 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f0c6889-0739-48a3-99cd-6db9d1f83242-metrics-tls\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:50:05.476114 master-0 kubenswrapper[29936]: I1205 12:50:05.476045 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 12:50:05.496761 master-0 kubenswrapper[29936]: I1205 12:50:05.496657 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 12:50:05.515657 master-0 kubenswrapper[29936]: I1205 12:50:05.515583 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"tuned-dockercfg-5khzw" Dec 05 12:50:05.535208 master-0 kubenswrapper[29936]: I1205 12:50:05.535105 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 12:50:05.535692 master-0 kubenswrapper[29936]: I1205 12:50:05.535648 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/db2e54b6-4879-40f4-9359-a8b0c31e76c2-profile-collector-cert\") pod \"catalog-operator-fbc6455c4-jmn7x\" (UID: \"db2e54b6-4879-40f4-9359-a8b0c31e76c2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:50:05.536526 master-0 kubenswrapper[29936]: I1205 12:50:05.536417 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-profile-collector-cert\") pod \"olm-operator-7cd7dbb44c-xdbtz\" (UID: \"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:50:05.560119 master-0 kubenswrapper[29936]: I1205 12:50:05.560037 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-285qn" Dec 05 12:50:05.574562 master-0 kubenswrapper[29936]: I1205 12:50:05.574507 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 12:50:05.583243 master-0 kubenswrapper[29936]: I1205 12:50:05.583164 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/db2e54b6-4879-40f4-9359-a8b0c31e76c2-srv-cert\") pod \"catalog-operator-fbc6455c4-jmn7x\" (UID: \"db2e54b6-4879-40f4-9359-a8b0c31e76c2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:50:05.595515 master-0 kubenswrapper[29936]: I1205 12:50:05.595454 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" 
Dec 05 12:50:05.616152 master-0 kubenswrapper[29936]: I1205 12:50:05.615587 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 12:50:05.625342 master-0 kubenswrapper[29936]: I1205 12:50:05.625272 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee7a76b-cf1d-4513-b314-5aa314da818d-config\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:50:05.635607 master-0 kubenswrapper[29936]: I1205 12:50:05.635548 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 12:50:05.654662 master-0 kubenswrapper[29936]: I1205 12:50:05.654588 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 12:50:05.664716 master-0 kubenswrapper[29936]: I1205 12:50:05.664650 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ee7a76b-cf1d-4513-b314-5aa314da818d-machine-api-operator-tls\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:50:05.675676 master-0 kubenswrapper[29936]: I1205 12:50:05.675629 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 12:50:05.677968 master-0 kubenswrapper[29936]: I1205 12:50:05.677928 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ee7a76b-cf1d-4513-b314-5aa314da818d-images\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:50:05.695431 master-0 kubenswrapper[29936]: I1205 12:50:05.695372 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-wrm9q" Dec 05 12:50:05.716741 master-0 kubenswrapper[29936]: I1205 12:50:05.715602 29936 request.go:700] Waited for 1.004561847s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/secrets?fieldSelector=metadata.name%3Dprometheus-operator-admission-webhook-tls&limit=500&resourceVersion=0 Dec 05 12:50:05.717728 master-0 kubenswrapper[29936]: I1205 12:50:05.717683 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 05 12:50:05.728605 master-0 kubenswrapper[29936]: I1205 12:50:05.728439 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/909ed395-8ad3-4350-95e3-b4b19c682f92-tls-certificates\") pod \"prometheus-operator-admission-webhook-7c85c4dffd-sbvlr\" (UID: \"909ed395-8ad3-4350-95e3-b4b19c682f92\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr" Dec 05 12:50:05.735322 master-0 kubenswrapper[29936]: I1205 12:50:05.735263 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-jdkkl" Dec 05 12:50:05.757375 master-0 
kubenswrapper[29936]: I1205 12:50:05.755500 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Dec 05 12:50:05.762938 master-0 kubenswrapper[29936]: I1205 12:50:05.762876 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e5dfcb1e-1231-4f07-8c21-748965718729-auth-proxy-config\") pod \"cluster-autoscaler-operator-5f49d774cd-vdb8r\" (UID: \"e5dfcb1e-1231-4f07-8c21-748965718729\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:50:05.782872 master-0 kubenswrapper[29936]: I1205 12:50:05.782797 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Dec 05 12:50:05.789846 master-0 kubenswrapper[29936]: E1205 12:50:05.789774 29936 secret.go:189] Couldn't get secret openshift-cloud-credential-operator/cloud-credential-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.790113 master-0 kubenswrapper[29936]: I1205 12:50:05.789961 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5dfcb1e-1231-4f07-8c21-748965718729-cert\") pod \"cluster-autoscaler-operator-5f49d774cd-vdb8r\" (UID: \"e5dfcb1e-1231-4f07-8c21-748965718729\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:50:05.790113 master-0 kubenswrapper[29936]: E1205 12:50:05.789972 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/665c4362-e2e5-4f96-92c0-1746c63c7422-cloud-credential-operator-serving-cert podName:665c4362-e2e5-4f96-92c0-1746c63c7422 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.289897973 +0000 UTC m=+3.421977654 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-credential-operator-serving-cert" (UniqueName: "kubernetes.io/secret/665c4362-e2e5-4f96-92c0-1746c63c7422-cloud-credential-operator-serving-cert") pod "cloud-credential-operator-698c598cfc-jzrqj" (UID: "665c4362-e2e5-4f96-92c0-1746c63c7422") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.793645 master-0 kubenswrapper[29936]: E1205 12:50:05.793594 29936 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.793844 master-0 kubenswrapper[29936]: E1205 12:50:05.793678 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-server-tls podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.293661976 +0000 UTC m=+3.425741657 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-metrics-server-tls" (UniqueName: "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-server-tls") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.794874 master-0 kubenswrapper[29936]: E1205 12:50:05.794829 29936 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.794942 master-0 kubenswrapper[29936]: E1205 12:50:05.794922 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cluster-baremetal-operator-tls podName:a280c582-685e-47ac-bf6b-248aa0c129a9 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.294898771 +0000 UTC m=+3.426978452 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-78f758c7b9-5xg2k" (UID: "a280c582-685e-47ac-bf6b-248aa0c129a9") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.794998 master-0 kubenswrapper[29936]: E1205 12:50:05.794981 29936 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-server-audit-profiles: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.795039 master-0 kubenswrapper[29936]: E1205 12:50:05.795021 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-metrics-server-audit-profiles podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.295013574 +0000 UTC m=+3.427093255 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-server-audit-profiles" (UniqueName: "kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-metrics-server-audit-profiles") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.795071 master-0 kubenswrapper[29936]: E1205 12:50:05.795044 29936 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.795108 master-0 kubenswrapper[29936]: E1205 12:50:05.795072 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-client-ca podName:c39d2089-d3bf-4556-b6ef-c362a08c21a2 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.295063685 +0000 UTC m=+3.427143356 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-client-ca") pod "controller-manager-b59c5b9bc-vh8fw" (UID: "c39d2089-d3bf-4556-b6ef-c362a08c21a2") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.795143 master-0 kubenswrapper[29936]: E1205 12:50:05.795111 29936 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.795143 master-0 kubenswrapper[29936]: E1205 12:50:05.795134 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-config podName:c39d2089-d3bf-4556-b6ef-c362a08c21a2 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.295128067 +0000 UTC m=+3.427207748 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-config") pod "controller-manager-b59c5b9bc-vh8fw" (UID: "c39d2089-d3bf-4556-b6ef-c362a08c21a2") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.795223 master-0 kubenswrapper[29936]: I1205 12:50:05.795167 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-6t5rm" Dec 05 12:50:05.795291 master-0 kubenswrapper[29936]: E1205 12:50:05.795266 29936 secret.go:189] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.795323 master-0 kubenswrapper[29936]: E1205 12:50:05.795301 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-stats-auth podName:20a72c8b-0f12-446b-8a42-53d98864c8f8 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.295294721 +0000 UTC m=+3.427374402 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-stats-auth") pod "router-default-5465c8b4db-dzlmb" (UID: "20a72c8b-0f12-446b-8a42-53d98864c8f8") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.795442 master-0 kubenswrapper[29936]: E1205 12:50:05.795406 29936 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.796795 master-0 kubenswrapper[29936]: E1205 12:50:05.796758 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-auth-proxy-config podName:dbe144b5-3b78-4946-bbf9-b825b0e47b07 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.296746171 +0000 UTC m=+3.428825852 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" (UID: "dbe144b5-3b78-4946-bbf9-b825b0e47b07") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.797026 master-0 kubenswrapper[29936]: E1205 12:50:05.796806 29936 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.797026 master-0 kubenswrapper[29936]: E1205 12:50:05.796805 29936 secret.go:189] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.797026 master-0 kubenswrapper[29936]: E1205 12:50:05.796907 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-default-certificate podName:20a72c8b-0f12-446b-8a42-53d98864c8f8 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.296862045 +0000 UTC m=+3.428941726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-default-certificate") pod "router-default-5465c8b4db-dzlmb" (UID: "20a72c8b-0f12-446b-8a42-53d98864c8f8") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.797026 master-0 kubenswrapper[29936]: E1205 12:50:05.796919 29936 configmap.go:193] Couldn't get configMap openshift-machine-api/baremetal-kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.797026 master-0 kubenswrapper[29936]: E1205 12:50:05.796936 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c39d2089-d3bf-4556-b6ef-c362a08c21a2-serving-cert podName:c39d2089-d3bf-4556-b6ef-c362a08c21a2 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.296920657 +0000 UTC m=+3.429000568 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c39d2089-d3bf-4556-b6ef-c362a08c21a2-serving-cert") pod "controller-manager-b59c5b9bc-vh8fw" (UID: "c39d2089-d3bf-4556-b6ef-c362a08c21a2") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.797026 master-0 kubenswrapper[29936]: E1205 12:50:05.796978 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-config podName:a280c582-685e-47ac-bf6b-248aa0c129a9 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.296967168 +0000 UTC m=+3.429047059 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-config") pod "cluster-baremetal-operator-78f758c7b9-5xg2k" (UID: "a280c582-685e-47ac-bf6b-248aa0c129a9") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.797026 master-0 kubenswrapper[29936]: E1205 12:50:05.796981 29936 secret.go:189] Couldn't get secret openshift-monitoring/metrics-client-certs: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.797026 master-0 kubenswrapper[29936]: E1205 12:50:05.797025 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-client-certs podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.29701735 +0000 UTC m=+3.429097031 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-metrics-client-certs" (UniqueName: "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-client-certs") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.797440 master-0 kubenswrapper[29936]: E1205 12:50:05.797060 29936 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.797440 master-0 kubenswrapper[29936]: E1205 12:50:05.797128 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-srv-cert podName:f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.297086552 +0000 UTC m=+3.429166233 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-srv-cert") pod "olm-operator-7cd7dbb44c-xdbtz" (UID: "f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.797440 master-0 kubenswrapper[29936]: E1205 12:50:05.797395 29936 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.797577 master-0 kubenswrapper[29936]: E1205 12:50:05.797502 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-config-volume podName:ce9e2a6b-8ce7-477c-8bc7-24033243eabe nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.297466422 +0000 UTC m=+3.429546293 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-config-volume") pod "dns-default-rzl84" (UID: "ce9e2a6b-8ce7-477c-8bc7-24033243eabe") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.799758 master-0 kubenswrapper[29936]: E1205 12:50:05.799725 29936 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.799836 master-0 kubenswrapper[29936]: E1205 12:50:05.799775 29936 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.799913 master-0 kubenswrapper[29936]: E1205 12:50:05.799880 29936 configmap.go:193] Couldn't get configMap openshift-cloud-credential-operator/cco-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.799973 master-0 kubenswrapper[29936]: E1205 12:50:05.799893 29936 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.799973 master-0 kubenswrapper[29936]: E1205 12:50:05.799785 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-metrics-certs podName:20a72c8b-0f12-446b-8a42-53d98864c8f8 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.299771086 +0000 UTC m=+3.431850947 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-metrics-certs") pod "router-default-5465c8b4db-dzlmb" (UID: "20a72c8b-0f12-446b-8a42-53d98864c8f8") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.800064 master-0 kubenswrapper[29936]: E1205 12:50:05.799993 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cert podName:a280c582-685e-47ac-bf6b-248aa0c129a9 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.299980502 +0000 UTC m=+3.432060393 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cert") pod "cluster-baremetal-operator-78f758c7b9-5xg2k" (UID: "a280c582-685e-47ac-bf6b-248aa0c129a9") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.800064 master-0 kubenswrapper[29936]: E1205 12:50:05.800013 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/665c4362-e2e5-4f96-92c0-1746c63c7422-cco-trusted-ca podName:665c4362-e2e5-4f96-92c0-1746c63c7422 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.300005182 +0000 UTC m=+3.432085093 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cco-trusted-ca" (UniqueName: "kubernetes.io/configmap/665c4362-e2e5-4f96-92c0-1746c63c7422-cco-trusted-ca") pod "cloud-credential-operator-698c598cfc-jzrqj" (UID: "665c4362-e2e5-4f96-92c0-1746c63c7422") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.800064 master-0 kubenswrapper[29936]: E1205 12:50:05.800028 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-configmap-kubelet-serving-ca-bundle podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.300021163 +0000 UTC m=+3.432101074 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-configmap-kubelet-serving-ca-bundle") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.807576 master-0 kubenswrapper[29936]: E1205 12:50:05.807469 29936 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.807823 master-0 kubenswrapper[29936]: E1205 12:50:05.807618 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/20a72c8b-0f12-446b-8a42-53d98864c8f8-service-ca-bundle podName:20a72c8b-0f12-446b-8a42-53d98864c8f8 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.307591262 +0000 UTC m=+3.439671143 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/20a72c8b-0f12-446b-8a42-53d98864c8f8-service-ca-bundle") pod "router-default-5465c8b4db-dzlmb" (UID: "20a72c8b-0f12-446b-8a42-53d98864c8f8") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.807823 master-0 kubenswrapper[29936]: E1205 12:50:05.807667 29936 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-e63soeg91on8p: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.807823 master-0 kubenswrapper[29936]: E1205 12:50:05.807702 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.307692575 +0000 UTC m=+3.439772466 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.807823 master-0 kubenswrapper[29936]: E1205 12:50:05.807732 29936 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.807823 master-0 kubenswrapper[29936]: E1205 12:50:05.807761 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-proxy-ca-bundles podName:c39d2089-d3bf-4556-b6ef-c362a08c21a2 nodeName:}" failed. 
No retries permitted until 2025-12-05 12:50:06.307752986 +0000 UTC m=+3.439832897 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-proxy-ca-bundles") pod "controller-manager-b59c5b9bc-vh8fw" (UID: "c39d2089-d3bf-4556-b6ef-c362a08c21a2") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.807823 master-0 kubenswrapper[29936]: E1205 12:50:05.807790 29936 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.807823 master-0 kubenswrapper[29936]: E1205 12:50:05.807816 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-images podName:dbe144b5-3b78-4946-bbf9-b825b0e47b07 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.307808488 +0000 UTC m=+3.439888169 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-images") pod "cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" (UID: "dbe144b5-3b78-4946-bbf9-b825b0e47b07") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.808166 master-0 kubenswrapper[29936]: E1205 12:50:05.807904 29936 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.808166 master-0 kubenswrapper[29936]: E1205 12:50:05.807938 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-webhook-cert podName:b13885ef-d2b5-4591-825d-446cf8729bc1 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.307926991 +0000 UTC m=+3.440006892 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-webhook-cert") pod "packageserver-58c5755b49-6dx4c" (UID: "b13885ef-d2b5-4591-825d-446cf8729bc1") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.808166 master-0 kubenswrapper[29936]: E1205 12:50:05.807956 29936 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.808166 master-0 kubenswrapper[29936]: E1205 12:50:05.807987 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbe144b5-3b78-4946-bbf9-b825b0e47b07-cloud-controller-manager-operator-tls podName:dbe144b5-3b78-4946-bbf9-b825b0e47b07 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.307977162 +0000 UTC m=+3.440057053 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/dbe144b5-3b78-4946-bbf9-b825b0e47b07-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" (UID: "dbe144b5-3b78-4946-bbf9-b825b0e47b07") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.808166 master-0 kubenswrapper[29936]: E1205 12:50:05.808017 29936 configmap.go:193] Couldn't get configMap openshift-machine-api/cluster-baremetal-operator-images: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.808166 master-0 kubenswrapper[29936]: E1205 12:50:05.808047 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-images podName:a280c582-685e-47ac-bf6b-248aa0c129a9 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.308039924 +0000 UTC m=+3.440119885 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-images") pod "cluster-baremetal-operator-78f758c7b9-5xg2k" (UID: "a280c582-685e-47ac-bf6b-248aa0c129a9") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.808166 master-0 kubenswrapper[29936]: E1205 12:50:05.808063 29936 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.808166 master-0 kubenswrapper[29936]: E1205 12:50:05.808090 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-metrics-tls podName:ce9e2a6b-8ce7-477c-8bc7-24033243eabe nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.308083346 +0000 UTC m=+3.440163247 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-metrics-tls") pod "dns-default-rzl84" (UID: "ce9e2a6b-8ce7-477c-8bc7-24033243eabe") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.808166 master-0 kubenswrapper[29936]: E1205 12:50:05.808105 29936 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.808166 master-0 kubenswrapper[29936]: E1205 12:50:05.808133 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-apiservice-cert podName:b13885ef-d2b5-4591-825d-446cf8729bc1 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.308126287 +0000 UTC m=+3.440206188 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-apiservice-cert") pod "packageserver-58c5755b49-6dx4c" (UID: "b13885ef-d2b5-4591-825d-446cf8729bc1") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.815243 master-0 kubenswrapper[29936]: I1205 12:50:05.815202 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-2vcf7" Dec 05 12:50:05.835493 master-0 kubenswrapper[29936]: I1205 12:50:05.835431 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 12:50:05.849421 master-0 kubenswrapper[29936]: I1205 12:50:05.849364 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f-samples-operator-tls\") pod \"cluster-samples-operator-797cfd8b47-6v84m\" (UID: \"7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" Dec 05 12:50:05.858394 master-0 kubenswrapper[29936]: I1205 12:50:05.858344 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-j8gcn" Dec 05 12:50:05.875252 master-0 kubenswrapper[29936]: I1205 12:50:05.875211 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 12:50:05.899908 master-0 kubenswrapper[29936]: I1205 12:50:05.898552 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 12:50:05.905412 master-0 kubenswrapper[29936]: E1205 12:50:05.905347 29936 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.905740 master-0 kubenswrapper[29936]: E1205 12:50:05.905712 29936 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.905805 master-0 kubenswrapper[29936]: E1205 12:50:05.905725 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d0792bf-e2da-4ee7-91fe-032299cea42f-serving-cert podName:7d0792bf-e2da-4ee7-91fe-032299cea42f nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.405701286 +0000 UTC m=+3.537780967 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/7d0792bf-e2da-4ee7-91fe-032299cea42f-serving-cert") pod "cluster-version-operator-6d5d5dcc89-gktn5" (UID: "7d0792bf-e2da-4ee7-91fe-032299cea42f") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.905805 master-0 kubenswrapper[29936]: E1205 12:50:05.905766 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-tls podName:5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.405757458 +0000 UTC m=+3.537837139 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-tls") pod "openshift-state-metrics-5974b6b869-w9l2z" (UID: "5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.905904 master-0 kubenswrapper[29936]: E1205 12:50:05.905806 29936 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.905904 master-0 kubenswrapper[29936]: E1205 12:50:05.905831 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1478a21e-b6ac-46fb-ad01-805ac71f0a79-mcc-auth-proxy-config podName:1478a21e-b6ac-46fb-ad01-805ac71f0a79 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.40582542 +0000 UTC m=+3.537905101 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcc-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/1478a21e-b6ac-46fb-ad01-805ac71f0a79-mcc-auth-proxy-config") pod "machine-config-controller-7c6d64c4cd-8qxz6" (UID: "1478a21e-b6ac-46fb-ad01-805ac71f0a79") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.905904 master-0 kubenswrapper[29936]: E1205 12:50:05.905842 29936 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.905904 master-0 kubenswrapper[29936]: E1205 12:50:05.905865 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-tls podName:60327040-f782-4cda-a32d-52a4f183073c nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.405859741 +0000 UTC m=+3.537939412 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-tls") pod "node-exporter-z2nmc" (UID: "60327040-f782-4cda-a32d-52a4f183073c") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.905904 master-0 kubenswrapper[29936]: E1205 12:50:05.905883 29936 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.905904 master-0 kubenswrapper[29936]: E1205 12:50:05.905901 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/60327040-f782-4cda-a32d-52a4f183073c-metrics-client-ca podName:60327040-f782-4cda-a32d-52a4f183073c nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.405896812 +0000 UTC m=+3.537976493 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/60327040-f782-4cda-a32d-52a4f183073c-metrics-client-ca") pod "node-exporter-z2nmc" (UID: "60327040-f782-4cda-a32d-52a4f183073c") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.906132 master-0 kubenswrapper[29936]: E1205 12:50:05.905912 29936 secret.go:189] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.906132 master-0 kubenswrapper[29936]: E1205 12:50:05.905931 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1478a21e-b6ac-46fb-ad01-805ac71f0a79-proxy-tls podName:1478a21e-b6ac-46fb-ad01-805ac71f0a79 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.405926962 +0000 UTC m=+3.538006633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1478a21e-b6ac-46fb-ad01-805ac71f0a79-proxy-tls") pod "machine-config-controller-7c6d64c4cd-8qxz6" (UID: "1478a21e-b6ac-46fb-ad01-805ac71f0a79") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.906132 master-0 kubenswrapper[29936]: E1205 12:50:05.905981 29936 secret.go:189] Couldn't get secret openshift-config-operator/config-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.906132 master-0 kubenswrapper[29936]: E1205 12:50:05.906001 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a45f340c-0eca-4460-8961-4ca360467eeb-serving-cert podName:a45f340c-0eca-4460-8961-4ca360467eeb nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.405995894 +0000 UTC m=+3.538075575 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/a45f340c-0eca-4460-8961-4ca360467eeb-serving-cert") pod "openshift-config-operator-68758cbcdb-tmzpj" (UID: "a45f340c-0eca-4460-8961-4ca360467eeb") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.906132 master-0 kubenswrapper[29936]: E1205 12:50:05.905620 29936 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.906132 master-0 kubenswrapper[29936]: E1205 12:50:05.906025 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-tls podName:4e9ba71a-d1b5-4986-babe-2c15c19f9cc2 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.406020445 +0000 UTC m=+3.538100126 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-tls") pod "kube-state-metrics-5857974f64-8p9n7" (UID: "4e9ba71a-d1b5-4986-babe-2c15c19f9cc2") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.906466 master-0 kubenswrapper[29936]: E1205 12:50:05.906448 29936 secret.go:189] Couldn't get secret openshift-cluster-storage-operator/cluster-storage-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.906572 master-0 kubenswrapper[29936]: E1205 12:50:05.906559 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d96c85a-fc88-46af-83d5-6c71ec6e2c23-cluster-storage-operator-serving-cert podName:3d96c85a-fc88-46af-83d5-6c71ec6e2c23 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.40654743 +0000 UTC m=+3.538627111 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-storage-operator-serving-cert" (UniqueName: "kubernetes.io/secret/3d96c85a-fc88-46af-83d5-6c71ec6e2c23-cluster-storage-operator-serving-cert") pod "cluster-storage-operator-dcf7fc84b-vxmv7" (UID: "3d96c85a-fc88-46af-83d5-6c71ec6e2c23") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.906654 master-0 kubenswrapper[29936]: E1205 12:50:05.906445 29936 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.906836 master-0 kubenswrapper[29936]: E1205 12:50:05.906812 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3e283fe-a474-4f83-ad66-62971945060a-webhook-certs podName:d3e283fe-a474-4f83-ad66-62971945060a nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.406787927 +0000 UTC m=+3.538867608 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d3e283fe-a474-4f83-ad66-62971945060a-webhook-certs") pod "multus-admission-controller-8dbbb5754-j7x5j" (UID: "d3e283fe-a474-4f83-ad66-62971945060a") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.907024 master-0 kubenswrapper[29936]: E1205 12:50:05.906477 29936 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.907138 master-0 kubenswrapper[29936]: E1205 12:50:05.907124 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-config podName:db27bee9-3d33-4c4a-b38b-72f7cec77c7a nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.407113416 +0000 UTC m=+3.539193097 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-config") pod "machine-approver-74d9cbffbc-r7kbd" (UID: "db27bee9-3d33-4c4a-b38b-72f7cec77c7a") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.912819 master-0 kubenswrapper[29936]: E1205 12:50:05.912750 29936 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.912819 master-0 kubenswrapper[29936]: E1205 12:50:05.912796 29936 secret.go:189] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.912914 master-0 kubenswrapper[29936]: E1205 12:50:05.906498 29936 configmap.go:193] Couldn't get configMap openshift-monitoring/kube-state-metrics-custom-resource-state-configmap: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.912914 master-0 kubenswrapper[29936]: E1205 12:50:05.912874 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-metrics-client-ca podName:5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.412845714 +0000 UTC m=+3.544925485 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-metrics-client-ca") pod "openshift-state-metrics-5974b6b869-w9l2z" (UID: "5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.912914 master-0 kubenswrapper[29936]: E1205 12:50:05.906877 29936 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.912914 master-0 kubenswrapper[29936]: E1205 12:50:05.912903 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-machine-approver-tls podName:db27bee9-3d33-4c4a-b38b-72f7cec77c7a nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.412893495 +0000 UTC m=+3.544973316 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-machine-approver-tls") pod "machine-approver-74d9cbffbc-r7kbd" (UID: "db27bee9-3d33-4c4a-b38b-72f7cec77c7a") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913056 master-0 kubenswrapper[29936]: E1205 12:50:05.912938 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/365bf663-fd5b-44df-a327-0438995c015d-proxy-tls podName:365bf663-fd5b-44df-a327-0438995c015d nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.412917476 +0000 UTC m=+3.544997157 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/365bf663-fd5b-44df-a327-0438995c015d-proxy-tls") pod "machine-config-operator-dc5d7666f-dqtsx" (UID: "365bf663-fd5b-44df-a327-0438995c015d") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913056 master-0 kubenswrapper[29936]: E1205 12:50:05.912961 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-custom-resource-state-configmap podName:4e9ba71a-d1b5-4986-babe-2c15c19f9cc2 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.412952187 +0000 UTC m=+3.545031958 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-custom-resource-state-configmap" (UniqueName: "kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-custom-resource-state-configmap") pod "kube-state-metrics-5857974f64-8p9n7" (UID: "4e9ba71a-d1b5-4986-babe-2c15c19f9cc2") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.913056 master-0 kubenswrapper[29936]: E1205 12:50:05.906899 29936 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.913056 master-0 kubenswrapper[29936]: E1205 12:50:05.908626 29936 configmap.go:193] Couldn't get configMap openshift-cluster-version/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.913056 master-0 kubenswrapper[29936]: E1205 12:50:05.913005 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/95d8fb27-8b2b-4749-add3-9e9b16edb693-mcd-auth-proxy-config podName:95d8fb27-8b2b-4749-add3-9e9b16edb693 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.412995458 +0000 UTC m=+3.545075139 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcd-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/95d8fb27-8b2b-4749-add3-9e9b16edb693-mcd-auth-proxy-config") pod "machine-config-daemon-45nwc" (UID: "95d8fb27-8b2b-4749-add3-9e9b16edb693") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.913056 master-0 kubenswrapper[29936]: E1205 12:50:05.913041 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7d0792bf-e2da-4ee7-91fe-032299cea42f-service-ca podName:7d0792bf-e2da-4ee7-91fe-032299cea42f nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.413029279 +0000 UTC m=+3.545109060 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca" (UniqueName: "kubernetes.io/configmap/7d0792bf-e2da-4ee7-91fe-032299cea42f-service-ca") pod "cluster-version-operator-6d5d5dcc89-gktn5" (UID: "7d0792bf-e2da-4ee7-91fe-032299cea42f") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.913056 master-0 kubenswrapper[29936]: E1205 12:50:05.913051 29936 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.913056 master-0 kubenswrapper[29936]: E1205 12:50:05.909790 29936 secret.go:189] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913367 master-0 kubenswrapper[29936]: E1205 12:50:05.909991 29936 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913367 master-0 kubenswrapper[29936]: E1205 12:50:05.910012 29936 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.913367 master-0 kubenswrapper[29936]: E1205 12:50:05.911234 29936 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913367 master-0 kubenswrapper[29936]: E1205 12:50:05.911324 29936 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.913367 master-0 kubenswrapper[29936]: E1205 12:50:05.911369 29936 secret.go:189] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913367 master-0 kubenswrapper[29936]: E1205 12:50:05.911384 29936 secret.go:189] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913367 master-0 kubenswrapper[29936]: E1205 12:50:05.911397 29936 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913367 master-0 kubenswrapper[29936]: E1205 12:50:05.911600 29936 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.913367 master-0 kubenswrapper[29936]: E1205 12:50:05.912766 29936 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.913367 master-0 kubenswrapper[29936]: E1205 12:50:05.913089 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99996137-2621-458b-980d-584b3640d4ad-metrics-client-ca podName:99996137-2621-458b-980d-584b3640d4ad nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.41307784 +0000 UTC m=+3.545157631 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/99996137-2621-458b-980d-584b3640d4ad-metrics-client-ca") pod "prometheus-operator-6c74d9cb9f-2nwvk" (UID: "99996137-2621-458b-980d-584b3640d4ad") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.913636 master-0 kubenswrapper[29936]: E1205 12:50:05.913389 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95d8fb27-8b2b-4749-add3-9e9b16edb693-proxy-tls podName:95d8fb27-8b2b-4749-add3-9e9b16edb693 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.413377899 +0000 UTC m=+3.545457580 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/95d8fb27-8b2b-4749-add3-9e9b16edb693-proxy-tls") pod "machine-config-daemon-45nwc" (UID: "95d8fb27-8b2b-4749-add3-9e9b16edb693") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913636 master-0 kubenswrapper[29936]: E1205 12:50:05.913193 29936 secret.go:189] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913636 master-0 kubenswrapper[29936]: E1205 12:50:05.913448 29936 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913636 master-0 kubenswrapper[29936]: E1205 12:50:05.913419 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-kube-rbac-proxy-config podName:5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.41339936 +0000 UTC m=+3.545479041 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-kube-rbac-proxy-config") pod "openshift-state-metrics-5974b6b869-w9l2z" (UID: "5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913636 master-0 kubenswrapper[29936]: E1205 12:50:05.913482 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-client-ca podName:e943438b-1de8-435c-8a19-accd6a6292a4 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.413469972 +0000 UTC m=+3.545549723 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-client-ca") pod "route-controller-manager-554555dbc9-szqjx" (UID: "e943438b-1de8-435c-8a19-accd6a6292a4") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.913636 master-0 kubenswrapper[29936]: E1205 12:50:05.913501 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-tls podName:99996137-2621-458b-980d-584b3640d4ad nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.413493032 +0000 UTC m=+3.545572843 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-tls") pod "prometheus-operator-6c74d9cb9f-2nwvk" (UID: "99996137-2621-458b-980d-584b3640d4ad") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913636 master-0 kubenswrapper[29936]: E1205 12:50:05.913516 29936 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.913636 master-0 kubenswrapper[29936]: E1205 12:50:05.913539 29936 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913636 master-0 kubenswrapper[29936]: E1205 12:50:05.913518 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-auth-proxy-config podName:db27bee9-3d33-4c4a-b38b-72f7cec77c7a nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.413511213 +0000 UTC m=+3.545590994 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-auth-proxy-config") pod "machine-approver-74d9cbffbc-r7kbd" (UID: "db27bee9-3d33-4c4a-b38b-72f7cec77c7a") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.913636 master-0 kubenswrapper[29936]: E1205 12:50:05.913572 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-certs podName:dc5db54b-094f-4c36-a0ad-042e9fc2b61d nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.413559154 +0000 UTC m=+3.545638925 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-certs") pod "machine-config-server-4s89l" (UID: "dc5db54b-094f-4c36-a0ad-042e9fc2b61d") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913636 master-0 kubenswrapper[29936]: E1205 12:50:05.913590 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/531b8927-92db-4e9d-9a0a-12ff948cdaad-control-plane-machine-set-operator-tls podName:531b8927-92db-4e9d-9a0a-12ff948cdaad nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.413582235 +0000 UTC m=+3.545662026 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/531b8927-92db-4e9d-9a0a-12ff948cdaad-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-7df95c79b5-ldg5j" (UID: "531b8927-92db-4e9d-9a0a-12ff948cdaad") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913636 master-0 kubenswrapper[29936]: E1205 12:50:05.913607 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-kube-rbac-proxy-config podName:99996137-2621-458b-980d-584b3640d4ad nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.413598615 +0000 UTC m=+3.545678396 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-kube-rbac-proxy-config") pod "prometheus-operator-6c74d9cb9f-2nwvk" (UID: "99996137-2621-458b-980d-584b3640d4ad") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.913636 master-0 kubenswrapper[29936]: E1205 12:50:05.913623 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-auth-proxy-config podName:365bf663-fd5b-44df-a327-0438995c015d nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.413616676 +0000 UTC m=+3.545696467 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-auth-proxy-config") pod "machine-config-operator-dc5d7666f-dqtsx" (UID: "365bf663-fd5b-44df-a327-0438995c015d") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.913636 master-0 kubenswrapper[29936]: E1205 12:50:05.913635 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-config podName:e943438b-1de8-435c-8a19-accd6a6292a4 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.413629706 +0000 UTC m=+3.545709507 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-config") pod "route-controller-manager-554555dbc9-szqjx" (UID: "e943438b-1de8-435c-8a19-accd6a6292a4") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.913636 master-0 kubenswrapper[29936]: E1205 12:50:05.913652 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-node-bootstrap-token podName:dc5db54b-094f-4c36-a0ad-042e9fc2b61d nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.413644667 +0000 UTC m=+3.545724468 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-node-bootstrap-token") pod "machine-config-server-4s89l" (UID: "dc5db54b-094f-4c36-a0ad-042e9fc2b61d") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.914096 master-0 kubenswrapper[29936]: E1205 12:50:05.913746 29936 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.914096 master-0 kubenswrapper[29936]: E1205 12:50:05.913792 29936 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.914944 master-0 kubenswrapper[29936]: E1205 12:50:05.914898 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-kube-rbac-proxy-config podName:60327040-f782-4cda-a32d-52a4f183073c nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.413664277 +0000 UTC m=+3.545744068 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-kube-rbac-proxy-config") pod "node-exporter-z2nmc" (UID: "60327040-f782-4cda-a32d-52a4f183073c") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.915008 master-0 kubenswrapper[29936]: E1205 12:50:05.914949 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-metrics-client-ca podName:4e9ba71a-d1b5-4986-babe-2c15c19f9cc2 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.414934962 +0000 UTC m=+3.547014723 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-metrics-client-ca") pod "kube-state-metrics-5857974f64-8p9n7" (UID: "4e9ba71a-d1b5-4986-babe-2c15c19f9cc2") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.915008 master-0 kubenswrapper[29936]: E1205 12:50:05.914969 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-kube-rbac-proxy-config podName:4e9ba71a-d1b5-4986-babe-2c15c19f9cc2 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.414961633 +0000 UTC m=+3.547041434 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-kube-rbac-proxy-config") pod "kube-state-metrics-5857974f64-8p9n7" (UID: "4e9ba71a-d1b5-4986-babe-2c15c19f9cc2") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.915008 master-0 kubenswrapper[29936]: E1205 12:50:05.914985 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e943438b-1de8-435c-8a19-accd6a6292a4-serving-cert podName:e943438b-1de8-435c-8a19-accd6a6292a4 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.414979344 +0000 UTC m=+3.547059135 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e943438b-1de8-435c-8a19-accd6a6292a4-serving-cert") pod "route-controller-manager-554555dbc9-szqjx" (UID: "e943438b-1de8-435c-8a19-accd6a6292a4") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:05.915008 master-0 kubenswrapper[29936]: E1205 12:50:05.915003 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-images podName:365bf663-fd5b-44df-a327-0438995c015d nodeName:}" failed. No retries permitted until 2025-12-05 12:50:06.414996424 +0000 UTC m=+3.547076265 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-images") pod "machine-config-operator-dc5d7666f-dqtsx" (UID: "365bf663-fd5b-44df-a327-0438995c015d") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:05.915860 master-0 kubenswrapper[29936]: I1205 12:50:05.915825 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 12:50:05.935066 master-0 kubenswrapper[29936]: I1205 12:50:05.934960 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 12:50:05.956073 master-0 kubenswrapper[29936]: I1205 12:50:05.954808 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-s9ftm" Dec 05 12:50:05.975871 master-0 kubenswrapper[29936]: I1205 12:50:05.975786 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Dec 05 12:50:05.995746 master-0 kubenswrapper[29936]: I1205 12:50:05.995705 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 12:50:06.015363 master-0 kubenswrapper[29936]: I1205 12:50:06.014938 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Dec 05 12:50:06.042087 master-0 kubenswrapper[29936]: I1205 12:50:06.041611 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Dec 05 12:50:06.054225 master-0 kubenswrapper[29936]: I1205 12:50:06.054189 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:50:06.054806 master-0 kubenswrapper[29936]: I1205 12:50:06.054787 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Dec 05 12:50:06.075728 master-0 kubenswrapper[29936]: I1205 12:50:06.075644 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-zknmp" Dec 05 12:50:06.094362 master-0 kubenswrapper[29936]: I1205 12:50:06.094298 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 12:50:06.115365 master-0 kubenswrapper[29936]: I1205 12:50:06.115299 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 12:50:06.135356 master-0 kubenswrapper[29936]: I1205 12:50:06.135124 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-jbzfz" Dec 05 12:50:06.156338 master-0 kubenswrapper[29936]: I1205 12:50:06.156233 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Dec 05 12:50:06.161614 master-0 kubenswrapper[29936]: I1205 12:50:06.161554 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-var-lock\") pod \"4d215811-6210-4ec2-8356-f1533dc43f65\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " Dec 05 12:50:06.161829 master-0 kubenswrapper[29936]: I1205 12:50:06.161712 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-kubelet-dir\") pod \"4d215811-6210-4ec2-8356-f1533dc43f65\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " Dec 05 12:50:06.161829 master-0 kubenswrapper[29936]: I1205 12:50:06.161731 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-var-lock" (OuterVolumeSpecName: "var-lock") pod "4d215811-6210-4ec2-8356-f1533dc43f65" (UID: "4d215811-6210-4ec2-8356-f1533dc43f65"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:50:06.161931 master-0 kubenswrapper[29936]: I1205 12:50:06.161834 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4d215811-6210-4ec2-8356-f1533dc43f65" (UID: "4d215811-6210-4ec2-8356-f1533dc43f65"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:50:06.163468 master-0 kubenswrapper[29936]: I1205 12:50:06.163429 29936 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:50:06.163468 master-0 kubenswrapper[29936]: I1205 12:50:06.163459 29936 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4d215811-6210-4ec2-8356-f1533dc43f65-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:50:06.177251 master-0 kubenswrapper[29936]: I1205 12:50:06.175978 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 12:50:06.196241 master-0 kubenswrapper[29936]: I1205 12:50:06.196110 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 12:50:06.216017 master-0 kubenswrapper[29936]: I1205 12:50:06.215940 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-xvwgq" Dec 05 12:50:06.235672 master-0 kubenswrapper[29936]: I1205 12:50:06.235582 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 12:50:06.254814 master-0 kubenswrapper[29936]: I1205 12:50:06.254682 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 12:50:06.275800 master-0 kubenswrapper[29936]: I1205 12:50:06.275739 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 12:50:06.295317 master-0 kubenswrapper[29936]: I1205 12:50:06.295098 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 12:50:06.324349 master-0 kubenswrapper[29936]: I1205 12:50:06.321395 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 12:50:06.335049 master-0 kubenswrapper[29936]: I1205 12:50:06.334980 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-tjfgr" Dec 05 12:50:06.356648 master-0 kubenswrapper[29936]: I1205 12:50:06.356587 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.367842 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20a72c8b-0f12-446b-8a42-53d98864c8f8-service-ca-bundle\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.367916 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-default-certificate\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:06.370203 master-0 
kubenswrapper[29936]: I1205 12:50:06.367991 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.368044 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-srv-cert\") pod \"olm-operator-7cd7dbb44c-xdbtz\" (UID: \"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.368082 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-config-volume\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.368125 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-metrics-certs\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.368163 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-apiservice-cert\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.368206 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-webhook-cert\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.368244 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-images\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.368295 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/dbe144b5-3b78-4946-bbf9-b825b0e47b07-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:50:06.370203 master-0 
kubenswrapper[29936]: I1205 12:50:06.368321 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-metrics-tls\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.368345 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-proxy-ca-bundles\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.368459 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/665c4362-e2e5-4f96-92c0-1746c63c7422-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" (UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.368565 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-server-tls\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.368611 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-client-ca\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.368766 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-config\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.368859 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c39d2089-d3bf-4556-b6ef-c362a08c21a2-serving-cert\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.368918 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cert\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.368942 29936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-client-certs\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.369031 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.369110 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-images\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.369359 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.369416 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-metrics-server-audit-profiles\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.369480 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-stats-auth\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.369507 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.369531 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-config\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.369622 29936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/665c4362-e2e5-4f96-92c0-1746c63c7422-cco-trusted-ca\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" (UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:50:06.370203 master-0 kubenswrapper[29936]: I1205 12:50:06.370013 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/665c4362-e2e5-4f96-92c0-1746c63c7422-cco-trusted-ca\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" (UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:50:06.371304 master-0 kubenswrapper[29936]: I1205 12:50:06.370638 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-config-volume\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:50:06.371304 master-0 kubenswrapper[29936]: I1205 12:50:06.371019 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-metrics-tls\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:50:06.371304 master-0 kubenswrapper[29936]: I1205 12:50:06.371227 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/665c4362-e2e5-4f96-92c0-1746c63c7422-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" (UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:50:06.371656 master-0 kubenswrapper[29936]: I1205 12:50:06.371611 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-srv-cert\") pod \"olm-operator-7cd7dbb44c-xdbtz\" (UID: \"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:50:06.375155 master-0 kubenswrapper[29936]: I1205 12:50:06.375114 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 12:50:06.381826 master-0 kubenswrapper[29936]: I1205 12:50:06.381736 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-apiservice-cert\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:50:06.382097 master-0 kubenswrapper[29936]: I1205 12:50:06.381937 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b13885ef-d2b5-4591-825d-446cf8729bc1-webhook-cert\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:50:06.394975 master-0 kubenswrapper[29936]: 
I1205 12:50:06.394930 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 12:50:06.416874 master-0 kubenswrapper[29936]: I1205 12:50:06.416713 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-5tl2j" Dec 05 12:50:06.435539 master-0 kubenswrapper[29936]: I1205 12:50:06.435455 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Dec 05 12:50:06.442373 master-0 kubenswrapper[29936]: I1205 12:50:06.442309 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-images\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:50:06.456745 master-0 kubenswrapper[29936]: I1205 12:50:06.456673 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-kj2kk" Dec 05 12:50:06.471325 master-0 kubenswrapper[29936]: I1205 12:50:06.471241 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:50:06.471616 master-0 kubenswrapper[29936]: I1205 12:50:06.471347 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-certs\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:50:06.471616 master-0 kubenswrapper[29936]: I1205 12:50:06.471416 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-auth-proxy-config\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:50:06.471616 master-0 kubenswrapper[29936]: I1205 12:50:06.471536 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-config\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:50:06.471616 master-0 kubenswrapper[29936]: I1205 12:50:06.471602 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-node-bootstrap-token\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:50:06.471783 master-0 kubenswrapper[29936]: I1205 12:50:06.471632 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-metrics-client-ca\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:50:06.471783 master-0 kubenswrapper[29936]: I1205 12:50:06.471660 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-machine-approver-tls\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:50:06.471783 master-0 kubenswrapper[29936]: I1205 12:50:06.471722 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:06.471783 master-0 kubenswrapper[29936]: I1205 12:50:06.471750 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:50:06.471783 master-0 kubenswrapper[29936]: I1205 12:50:06.471778 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-metrics-client-ca\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:50:06.471958 master-0 kubenswrapper[29936]: I1205 12:50:06.471806 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99996137-2621-458b-980d-584b3640d4ad-metrics-client-ca\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:50:06.471958 master-0 kubenswrapper[29936]: I1205 12:50:06.471893 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e943438b-1de8-435c-8a19-accd6a6292a4-serving-cert\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:50:06.471958 master-0 kubenswrapper[29936]: I1205 12:50:06.471916 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-images\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:50:06.472062 master-0 kubenswrapper[29936]: I1205 12:50:06.472011 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/1478a21e-b6ac-46fb-ad01-805ac71f0a79-proxy-tls\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:50:06.472062 master-0 kubenswrapper[29936]: I1205 12:50:06.472042 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d0792bf-e2da-4ee7-91fe-032299cea42f-serving-cert\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:50:06.472137 master-0 kubenswrapper[29936]: I1205 12:50:06.472071 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3e283fe-a474-4f83-ad66-62971945060a-webhook-certs\") pod \"multus-admission-controller-8dbbb5754-j7x5j\" (UID: \"d3e283fe-a474-4f83-ad66-62971945060a\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" Dec 05 12:50:06.472137 master-0 kubenswrapper[29936]: I1205 12:50:06.472105 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a45f340c-0eca-4460-8961-4ca360467eeb-serving-cert\") pod \"openshift-config-operator-68758cbcdb-tmzpj\" (UID: \"a45f340c-0eca-4460-8961-4ca360467eeb\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:50:06.472137 master-0 kubenswrapper[29936]: I1205 12:50:06.472133 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1478a21e-b6ac-46fb-ad01-805ac71f0a79-mcc-auth-proxy-config\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:50:06.472325 master-0 kubenswrapper[29936]: I1205 12:50:06.472223 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:50:06.472325 master-0 kubenswrapper[29936]: I1205 12:50:06.472260 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60327040-f782-4cda-a32d-52a4f183073c-metrics-client-ca\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:06.472970 master-0 kubenswrapper[29936]: I1205 12:50:06.472902 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d0792bf-e2da-4ee7-91fe-032299cea42f-serving-cert\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:50:06.473086 master-0 kubenswrapper[29936]: I1205 12:50:06.473054 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-auth-proxy-config\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:50:06.473252 master-0 kubenswrapper[29936]: I1205 12:50:06.473232 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/365bf663-fd5b-44df-a327-0438995c015d-images\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:50:06.473414 master-0 kubenswrapper[29936]: I1205 12:50:06.473345 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:50:06.473414 master-0 kubenswrapper[29936]: I1205 12:50:06.473407 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-tls\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:50:06.473575 master-0 kubenswrapper[29936]: I1205 12:50:06.473542 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1478a21e-b6ac-46fb-ad01-805ac71f0a79-mcc-auth-proxy-config\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:50:06.473772 master-0 kubenswrapper[29936]: I1205 12:50:06.473728 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-config\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:50:06.473831 master-0 kubenswrapper[29936]: I1205 12:50:06.473817 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d96c85a-fc88-46af-83d5-6c71ec6e2c23-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-dcf7fc84b-vxmv7\" (UID: \"3d96c85a-fc88-46af-83d5-6c71ec6e2c23\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" Dec 05 12:50:06.473947 master-0 kubenswrapper[29936]: I1205 12:50:06.473884 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-tls\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:06.474012 master-0 kubenswrapper[29936]: I1205 12:50:06.473990 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/365bf663-fd5b-44df-a327-0438995c015d-proxy-tls\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:50:06.474058 master-0 kubenswrapper[29936]: I1205 12:50:06.474034 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95d8fb27-8b2b-4749-add3-9e9b16edb693-mcd-auth-proxy-config\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:50:06.474239 master-0 kubenswrapper[29936]: I1205 12:50:06.474216 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d0792bf-e2da-4ee7-91fe-032299cea42f-service-ca\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:50:06.474294 master-0 kubenswrapper[29936]: I1205 12:50:06.474265 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/95d8fb27-8b2b-4749-add3-9e9b16edb693-mcd-auth-proxy-config\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:50:06.474357 master-0 kubenswrapper[29936]: I1205 12:50:06.474333 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-client-ca\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:50:06.474400 master-0 kubenswrapper[29936]: I1205 12:50:06.474363 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:50:06.474439 master-0 kubenswrapper[29936]: I1205 12:50:06.474422 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95d8fb27-8b2b-4749-add3-9e9b16edb693-proxy-tls\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:50:06.474495 master-0 kubenswrapper[29936]: I1205 12:50:06.474457 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/365bf663-fd5b-44df-a327-0438995c015d-proxy-tls\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:50:06.474630 master-0 kubenswrapper[29936]: I1205 12:50:06.474572 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/7d0792bf-e2da-4ee7-91fe-032299cea42f-service-ca\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:50:06.474672 master-0 kubenswrapper[29936]: I1205 12:50:06.474637 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-tls\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:50:06.474797 master-0 kubenswrapper[29936]: I1205 12:50:06.474775 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-auth-proxy-config\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:50:06.474840 master-0 kubenswrapper[29936]: I1205 12:50:06.474811 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/531b8927-92db-4e9d-9a0a-12ff948cdaad-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-7df95c79b5-ldg5j\" (UID: \"531b8927-92db-4e9d-9a0a-12ff948cdaad\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" Dec 05 12:50:06.474909 master-0 kubenswrapper[29936]: I1205 12:50:06.474860 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d96c85a-fc88-46af-83d5-6c71ec6e2c23-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-dcf7fc84b-vxmv7\" (UID: \"3d96c85a-fc88-46af-83d5-6c71ec6e2c23\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" Dec 05 12:50:06.475283 master-0 kubenswrapper[29936]: I1205 12:50:06.475244 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/531b8927-92db-4e9d-9a0a-12ff948cdaad-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-7df95c79b5-ldg5j\" (UID: \"531b8927-92db-4e9d-9a0a-12ff948cdaad\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" Dec 05 12:50:06.475332 master-0 kubenswrapper[29936]: I1205 12:50:06.475315 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-rbhdx" Dec 05 12:50:06.495844 master-0 kubenswrapper[29936]: I1205 12:50:06.495745 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-wqbtd" Dec 05 12:50:06.516416 master-0 kubenswrapper[29936]: I1205 12:50:06.516264 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Dec 05 12:50:06.522125 master-0 kubenswrapper[29936]: I1205 12:50:06.522081 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cluster-baremetal-operator-tls\") pod 
\"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:50:06.535987 master-0 kubenswrapper[29936]: I1205 12:50:06.535924 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Dec 05 12:50:06.543202 master-0 kubenswrapper[29936]: I1205 12:50:06.543121 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a280c582-685e-47ac-bf6b-248aa0c129a9-cert\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:50:06.555199 master-0 kubenswrapper[29936]: I1205 12:50:06.555103 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 12:50:06.561915 master-0 kubenswrapper[29936]: I1205 12:50:06.561863 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-stats-auth\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:06.575696 master-0 kubenswrapper[29936]: I1205 12:50:06.575610 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-hcp7n" Dec 05 12:50:06.596203 master-0 kubenswrapper[29936]: I1205 12:50:06.596130 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 12:50:06.601383 master-0 kubenswrapper[29936]: I1205 12:50:06.601335 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-default-certificate\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:06.615124 master-0 kubenswrapper[29936]: I1205 12:50:06.615033 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 12:50:06.624044 master-0 kubenswrapper[29936]: I1205 12:50:06.623988 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a45f340c-0eca-4460-8961-4ca360467eeb-serving-cert\") pod \"openshift-config-operator-68758cbcdb-tmzpj\" (UID: \"a45f340c-0eca-4460-8961-4ca360467eeb\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:50:06.640294 master-0 kubenswrapper[29936]: I1205 12:50:06.640226 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 12:50:06.656155 master-0 kubenswrapper[29936]: I1205 12:50:06.656072 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 12:50:06.675175 master-0 kubenswrapper[29936]: I1205 12:50:06.675093 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 12:50:06.695789 master-0 kubenswrapper[29936]: I1205 12:50:06.695721 29936 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 12:50:06.706888 master-0 kubenswrapper[29936]: I1205 12:50:06.706797 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 05 12:50:06.712077 master-0 kubenswrapper[29936]: I1205 12:50:06.712005 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/20a72c8b-0f12-446b-8a42-53d98864c8f8-metrics-certs\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:06.719652 master-0 kubenswrapper[29936]: I1205 12:50:06.719583 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 12:50:06.733573 master-0 kubenswrapper[29936]: I1205 12:50:06.733507 29936 request.go:700] Waited for 1.997882312s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/configmaps?fieldSelector=metadata.name%3Dbaremetal-kube-rbac-proxy&limit=500&resourceVersion=0 Dec 05 12:50:06.735704 master-0 kubenswrapper[29936]: I1205 12:50:06.735630 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Dec 05 12:50:06.741970 master-0 kubenswrapper[29936]: I1205 12:50:06.741905 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a280c582-685e-47ac-bf6b-248aa0c129a9-config\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:50:06.755641 master-0 kubenswrapper[29936]: I1205 12:50:06.755569 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-zpvbv" Dec 05 12:50:06.775050 master-0 kubenswrapper[29936]: I1205 12:50:06.774903 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 12:50:06.775298 master-0 kubenswrapper[29936]: I1205 12:50:06.775254 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/95d8fb27-8b2b-4749-add3-9e9b16edb693-proxy-tls\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:50:06.795323 master-0 kubenswrapper[29936]: I1205 12:50:06.795217 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-bnqtr" Dec 05 12:50:06.816052 master-0 kubenswrapper[29936]: I1205 12:50:06.815966 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 12:50:06.816629 master-0 kubenswrapper[29936]: I1205 12:50:06.816583 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-auth-proxy-config\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:50:06.835412 master-0 
kubenswrapper[29936]: I1205 12:50:06.835174 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-cz7x2" Dec 05 12:50:06.855568 master-0 kubenswrapper[29936]: I1205 12:50:06.855486 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 12:50:06.863428 master-0 kubenswrapper[29936]: I1205 12:50:06.863336 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-machine-approver-tls\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:50:06.876095 master-0 kubenswrapper[29936]: I1205 12:50:06.876028 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 12:50:06.884711 master-0 kubenswrapper[29936]: I1205 12:50:06.884639 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-config\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:50:06.895242 master-0 kubenswrapper[29936]: I1205 12:50:06.895151 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 12:50:06.916307 master-0 kubenswrapper[29936]: I1205 12:50:06.916226 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-fb2xd" Dec 05 12:50:06.935629 master-0 kubenswrapper[29936]: I1205 12:50:06.935557 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 12:50:06.955421 master-0 kubenswrapper[29936]: I1205 12:50:06.955358 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-94n4t" Dec 05 12:50:06.976724 master-0 kubenswrapper[29936]: I1205 12:50:06.976650 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 12:50:06.980565 master-0 kubenswrapper[29936]: I1205 12:50:06.980514 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20a72c8b-0f12-446b-8a42-53d98864c8f8-service-ca-bundle\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:06.996037 master-0 kubenswrapper[29936]: I1205 12:50:06.995884 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-m6vhr" Dec 05 12:50:07.020702 master-0 kubenswrapper[29936]: I1205 12:50:07.020635 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-s6pqp" Dec 05 12:50:07.034623 master-0 kubenswrapper[29936]: I1205 12:50:07.034489 29936 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Dec 05 12:50:07.042090 master-0 kubenswrapper[29936]: I1205 12:50:07.042041 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:50:07.054801 master-0 kubenswrapper[29936]: I1205 12:50:07.054758 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Dec 05 12:50:07.061598 master-0 kubenswrapper[29936]: I1205 12:50:07.061561 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/dbe144b5-3b78-4946-bbf9-b825b0e47b07-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:50:07.076408 master-0 kubenswrapper[29936]: I1205 12:50:07.076337 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Dec 05 12:50:07.081360 master-0 kubenswrapper[29936]: I1205 12:50:07.081315 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dbe144b5-3b78-4946-bbf9-b825b0e47b07-images\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:50:07.095372 master-0 kubenswrapper[29936]: I1205 12:50:07.095330 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 05 12:50:07.115315 master-0 kubenswrapper[29936]: I1205 12:50:07.115223 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 12:50:07.135563 master-0 kubenswrapper[29936]: I1205 12:50:07.135500 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Dec 05 12:50:07.143612 master-0 kubenswrapper[29936]: I1205 12:50:07.143557 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60327040-f782-4cda-a32d-52a4f183073c-metrics-client-ca\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:07.143673 master-0 kubenswrapper[29936]: I1205 12:50:07.143638 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-metrics-client-ca\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:50:07.143705 master-0 kubenswrapper[29936]: I1205 12:50:07.143672 29936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/99996137-2621-458b-980d-584b3640d4ad-metrics-client-ca\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:50:07.143781 master-0 kubenswrapper[29936]: I1205 12:50:07.143677 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-metrics-client-ca\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:50:07.156131 master-0 kubenswrapper[29936]: I1205 12:50:07.156072 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Dec 05 12:50:07.164392 master-0 kubenswrapper[29936]: I1205 12:50:07.164339 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:50:07.175640 master-0 kubenswrapper[29936]: I1205 12:50:07.175582 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Dec 05 12:50:07.185951 master-0 kubenswrapper[29936]: I1205 12:50:07.185908 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/99996137-2621-458b-980d-584b3640d4ad-prometheus-operator-tls\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:50:07.194956 master-0 kubenswrapper[29936]: I1205 12:50:07.194906 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-ljblm" Dec 05 12:50:07.214854 master-0 kubenswrapper[29936]: I1205 12:50:07.214786 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 12:50:07.225087 master-0 kubenswrapper[29936]: I1205 12:50:07.225027 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1478a21e-b6ac-46fb-ad01-805ac71f0a79-proxy-tls\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:50:07.234681 master-0 kubenswrapper[29936]: I1205 12:50:07.234609 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-bcwx4" Dec 05 12:50:07.256946 master-0 kubenswrapper[29936]: I1205 12:50:07.256838 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 12:50:07.265845 master-0 kubenswrapper[29936]: I1205 12:50:07.265762 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-node-bootstrap-token\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:50:07.276696 master-0 kubenswrapper[29936]: I1205 12:50:07.276634 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 12:50:07.283361 master-0 kubenswrapper[29936]: I1205 12:50:07.283308 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-certs\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:50:07.295867 master-0 kubenswrapper[29936]: I1205 12:50:07.295493 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 05 12:50:07.305058 master-0 kubenswrapper[29936]: I1205 12:50:07.304973 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-tls\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:50:07.315091 master-0 kubenswrapper[29936]: I1205 12:50:07.315014 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 05 12:50:07.324944 master-0 kubenswrapper[29936]: I1205 12:50:07.324889 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:50:07.334283 master-0 kubenswrapper[29936]: I1205 12:50:07.334217 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-4gpnc" Dec 05 12:50:07.355225 master-0 kubenswrapper[29936]: I1205 12:50:07.355140 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 05 12:50:07.363719 master-0 kubenswrapper[29936]: I1205 12:50:07.363647 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:50:07.370835 master-0 kubenswrapper[29936]: E1205 12:50:07.370789 29936 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:07.371081 master-0 kubenswrapper[29936]: E1205 12:50:07.370870 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-configmap-kubelet-serving-ca-bundle 
podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:50:08.370850588 +0000 UTC m=+5.502930269 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-configmap-kubelet-serving-ca-bundle") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:07.371487 master-0 kubenswrapper[29936]: E1205 12:50:07.371453 29936 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:07.371544 master-0 kubenswrapper[29936]: E1205 12:50:07.371476 29936 secret.go:189] Couldn't get secret openshift-monitoring/metrics-client-certs: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:07.371544 master-0 kubenswrapper[29936]: E1205 12:50:07.371521 29936 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:07.371648 master-0 kubenswrapper[29936]: E1205 12:50:07.371542 29936 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-server-audit-profiles: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:07.371696 master-0 kubenswrapper[29936]: E1205 12:50:07.371502 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-client-ca podName:c39d2089-d3bf-4556-b6ef-c362a08c21a2 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:08.371490216 +0000 UTC m=+5.503569907 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-client-ca") pod "controller-manager-b59c5b9bc-vh8fw" (UID: "c39d2089-d3bf-4556-b6ef-c362a08c21a2") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:07.371696 master-0 kubenswrapper[29936]: E1205 12:50:07.371501 29936 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:07.371696 master-0 kubenswrapper[29936]: E1205 12:50:07.371681 29936 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:07.371696 master-0 kubenswrapper[29936]: E1205 12:50:07.371498 29936 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:07.371854 master-0 kubenswrapper[29936]: E1205 12:50:07.371706 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c39d2089-d3bf-4556-b6ef-c362a08c21a2-serving-cert podName:c39d2089-d3bf-4556-b6ef-c362a08c21a2 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:08.371671781 +0000 UTC m=+5.503751512 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c39d2089-d3bf-4556-b6ef-c362a08c21a2-serving-cert") pod "controller-manager-b59c5b9bc-vh8fw" (UID: "c39d2089-d3bf-4556-b6ef-c362a08c21a2") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:07.371854 master-0 kubenswrapper[29936]: E1205 12:50:07.371530 29936 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-e63soeg91on8p: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:07.371854 master-0 kubenswrapper[29936]: E1205 12:50:07.371746 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-config podName:c39d2089-d3bf-4556-b6ef-c362a08c21a2 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:08.371714962 +0000 UTC m=+5.503794653 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-config") pod "controller-manager-b59c5b9bc-vh8fw" (UID: "c39d2089-d3bf-4556-b6ef-c362a08c21a2") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:07.371854 master-0 kubenswrapper[29936]: E1205 12:50:07.371786 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-client-certs podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:50:08.371775613 +0000 UTC m=+5.503855304 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-metrics-client-certs" (UniqueName: "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-client-certs") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:07.371854 master-0 kubenswrapper[29936]: E1205 12:50:07.371812 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-metrics-server-audit-profiles podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:50:08.371803214 +0000 UTC m=+5.503882905 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-server-audit-profiles" (UniqueName: "kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-metrics-server-audit-profiles") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:07.371854 master-0 kubenswrapper[29936]: E1205 12:50:07.371849 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-proxy-ca-bundles podName:c39d2089-d3bf-4556-b6ef-c362a08c21a2 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:08.371835955 +0000 UTC m=+5.503915646 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-proxy-ca-bundles") pod "controller-manager-b59c5b9bc-vh8fw" (UID: "c39d2089-d3bf-4556-b6ef-c362a08c21a2") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:07.372076 master-0 kubenswrapper[29936]: E1205 12:50:07.371884 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-server-tls podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:50:08.371865176 +0000 UTC m=+5.503944867 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-metrics-server-tls" (UniqueName: "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-server-tls") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:07.372076 master-0 kubenswrapper[29936]: E1205 12:50:07.371921 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:50:08.371901597 +0000 UTC m=+5.503981478 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:07.374696 master-0 kubenswrapper[29936]: I1205 12:50:07.374664 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 05 12:50:07.384698 master-0 kubenswrapper[29936]: I1205 12:50:07.384653 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:07.396156 master-0 kubenswrapper[29936]: I1205 12:50:07.396099 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 12:50:07.415966 master-0 kubenswrapper[29936]: I1205 12:50:07.415906 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 05 12:50:07.424814 master-0 kubenswrapper[29936]: I1205 12:50:07.424761 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:50:07.435441 master-0 kubenswrapper[29936]: I1205 12:50:07.435380 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 12:50:07.455643 master-0 kubenswrapper[29936]: I1205 12:50:07.455577 
29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 05 12:50:07.473674 master-0 kubenswrapper[29936]: E1205 12:50:07.473567 29936 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:07.473925 master-0 kubenswrapper[29936]: E1205 12:50:07.473752 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-config podName:e943438b-1de8-435c-8a19-accd6a6292a4 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:08.473719614 +0000 UTC m=+5.605799295 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-config") pod "route-controller-manager-554555dbc9-szqjx" (UID: "e943438b-1de8-435c-8a19-accd6a6292a4") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:07.473925 master-0 kubenswrapper[29936]: E1205 12:50:07.473578 29936 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:07.473925 master-0 kubenswrapper[29936]: E1205 12:50:07.473812 29936 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:07.473925 master-0 kubenswrapper[29936]: E1205 12:50:07.473611 29936 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:07.473925 master-0 kubenswrapper[29936]: E1205 12:50:07.473873 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e943438b-1de8-435c-8a19-accd6a6292a4-serving-cert podName:e943438b-1de8-435c-8a19-accd6a6292a4 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:08.473838948 +0000 UTC m=+5.605918639 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e943438b-1de8-435c-8a19-accd6a6292a4-serving-cert") pod "route-controller-manager-554555dbc9-szqjx" (UID: "e943438b-1de8-435c-8a19-accd6a6292a4") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:07.473925 master-0 kubenswrapper[29936]: E1205 12:50:07.473901 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-tls podName:5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:08.473892309 +0000 UTC m=+5.605972000 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-tls") pod "openshift-state-metrics-5974b6b869-w9l2z" (UID: "5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:07.473925 master-0 kubenswrapper[29936]: E1205 12:50:07.473925 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3e283fe-a474-4f83-ad66-62971945060a-webhook-certs podName:d3e283fe-a474-4f83-ad66-62971945060a nodeName:}" failed. No retries permitted until 2025-12-05 12:50:08.47391398 +0000 UTC m=+5.605993671 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d3e283fe-a474-4f83-ad66-62971945060a-webhook-certs") pod "multus-admission-controller-8dbbb5754-j7x5j" (UID: "d3e283fe-a474-4f83-ad66-62971945060a") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:07.474843 master-0 kubenswrapper[29936]: E1205 12:50:07.474748 29936 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:07.474843 master-0 kubenswrapper[29936]: E1205 12:50:07.474843 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-client-ca podName:e943438b-1de8-435c-8a19-accd6a6292a4 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:08.474822925 +0000 UTC m=+5.606902606 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-client-ca") pod "route-controller-manager-554555dbc9-szqjx" (UID: "e943438b-1de8-435c-8a19-accd6a6292a4") : failed to sync configmap cache: timed out waiting for the condition Dec 05 12:50:07.474974 master-0 kubenswrapper[29936]: E1205 12:50:07.474756 29936 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:07.474974 master-0 kubenswrapper[29936]: E1205 12:50:07.474936 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-tls podName:60327040-f782-4cda-a32d-52a4f183073c nodeName:}" failed. No retries permitted until 2025-12-05 12:50:08.474919308 +0000 UTC m=+5.606998999 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-tls") pod "node-exporter-z2nmc" (UID: "60327040-f782-4cda-a32d-52a4f183073c") : failed to sync secret cache: timed out waiting for the condition Dec 05 12:50:07.476397 master-0 kubenswrapper[29936]: I1205 12:50:07.476358 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-4cwgg" Dec 05 12:50:07.496707 master-0 kubenswrapper[29936]: I1205 12:50:07.496630 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 05 12:50:07.517324 master-0 kubenswrapper[29936]: I1205 12:50:07.517205 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-bd2pn" Dec 05 12:50:08.168390 master-0 kubenswrapper[29936]: I1205 12:50:08.168305 29936 request.go:700] Waited for 3.379424569s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-network-operator/serviceaccounts/cluster-network-operator/token Dec 05 12:50:08.180364 master-0 kubenswrapper[29936]: I1205 12:50:08.180240 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 12:50:08.180570 master-0 kubenswrapper[29936]: I1205 12:50:08.180362 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 05 12:50:08.180908 master-0 kubenswrapper[29936]: I1205 12:50:08.180881 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 12:50:08.181563 master-0 kubenswrapper[29936]: I1205 12:50:08.181442 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-fqrhd" Dec 05 12:50:08.181974 master-0 kubenswrapper[29936]: I1205 12:50:08.181910 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 05 12:50:08.182576 master-0 kubenswrapper[29936]: I1205 12:50:08.182497 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-z9sgn" Dec 05 12:50:08.182955 master-0 kubenswrapper[29936]: I1205 12:50:08.182722 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-e63soeg91on8p" Dec 05 12:50:08.183120 master-0 kubenswrapper[29936]: I1205 12:50:08.183073 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-sxj7j" Dec 05 12:50:08.183365 master-0 kubenswrapper[29936]: I1205 12:50:08.183302 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 12:50:08.193171 master-0 kubenswrapper[29936]: I1205 12:50:08.184511 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 05 12:50:08.193171 master-0 kubenswrapper[29936]: I1205 12:50:08.185638 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 12:50:08.193171 master-0 kubenswrapper[29936]: I1205 12:50:08.188366 29936 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 12:50:08.193171 master-0 kubenswrapper[29936]: I1205 12:50:08.188800 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 12:50:08.221207 master-0 kubenswrapper[29936]: I1205 12:50:08.221031 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 12:50:08.232400 master-0 kubenswrapper[29936]: I1205 12:50:08.226846 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 12:50:08.232400 master-0 kubenswrapper[29936]: I1205 12:50:08.226932 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 05 12:50:08.232400 master-0 kubenswrapper[29936]: I1205 12:50:08.226984 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 12:50:08.232400 master-0 kubenswrapper[29936]: I1205 12:50:08.229081 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g7mj\" (UniqueName: \"kubernetes.io/projected/3b741029-0eb5-409b-b7f1-95e8385dc400-kube-api-access-5g7mj\") pod \"catalogd-controller-manager-7cc89f4c4c-n28z2\" (UID: \"3b741029-0eb5-409b-b7f1-95e8385dc400\") " pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:50:08.232400 master-0 kubenswrapper[29936]: I1205 12:50:08.231216 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqdxl\" (UniqueName: \"kubernetes.io/projected/cf8247a1-703a-46b3-9a33-25a73b27ab99-kube-api-access-fqdxl\") pod \"service-ca-77c99c46b8-44qrw\" (UID: \"cf8247a1-703a-46b3-9a33-25a73b27ab99\") " pod="openshift-service-ca/service-ca-77c99c46b8-44qrw" Dec 05 12:50:08.232400 master-0 kubenswrapper[29936]: I1205 12:50:08.231852 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-996h9\" (UniqueName: \"kubernetes.io/projected/5efad170-c154-42ec-a7c0-b36a98d2bfcc-kube-api-access-996h9\") pod \"network-operator-79767b7ff9-h8qkj\" (UID: \"5efad170-c154-42ec-a7c0-b36a98d2bfcc\") " pod="openshift-network-operator/network-operator-79767b7ff9-h8qkj" Dec 05 12:50:08.232400 master-0 kubenswrapper[29936]: I1205 12:50:08.231846 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czcmr\" (UniqueName: \"kubernetes.io/projected/9c31f89c-b01b-4853-a901-bccc25441a46-kube-api-access-czcmr\") pod \"redhat-operators-wfk7f\" (UID: \"9c31f89c-b01b-4853-a901-bccc25441a46\") " pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:50:08.232871 master-0 kubenswrapper[29936]: I1205 12:50:08.232448 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-nz5rx" Dec 05 12:50:08.239792 master-0 kubenswrapper[29936]: I1205 12:50:08.239583 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkdtr\" (UniqueName: \"kubernetes.io/projected/1ee7a76b-cf1d-4513-b314-5aa314da818d-kube-api-access-lkdtr\") pod \"machine-api-operator-88d48b57d-x947v\" (UID: \"1ee7a76b-cf1d-4513-b314-5aa314da818d\") " pod="openshift-machine-api/machine-api-operator-88d48b57d-x947v" Dec 05 12:50:08.248106 master-0 kubenswrapper[29936]: I1205 12:50:08.246988 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hfl8f\" (UniqueName: \"kubernetes.io/projected/b9623eb8-55d2-4c5c-aa8d-74b6a27274d8-kube-api-access-hfl8f\") pod \"csi-snapshot-controller-6b958b6f94-7r5wv\" (UID: \"b9623eb8-55d2-4c5c-aa8d-74b6a27274d8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-6b958b6f94-7r5wv" Dec 05 12:50:08.248106 master-0 kubenswrapper[29936]: I1205 12:50:08.247054 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72wst\" (UniqueName: \"kubernetes.io/projected/b74e0607-6ed0-4119-8870-895b7d336830-kube-api-access-72wst\") pod \"community-operators-2pp25\" (UID: \"b74e0607-6ed0-4119-8870-895b7d336830\") " pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:50:08.248106 master-0 kubenswrapper[29936]: I1205 12:50:08.248010 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb46q\" (UniqueName: \"kubernetes.io/projected/e5dfcb1e-1231-4f07-8c21-748965718729-kube-api-access-pb46q\") pod \"cluster-autoscaler-operator-5f49d774cd-vdb8r\" (UID: \"e5dfcb1e-1231-4f07-8c21-748965718729\") " pod="openshift-machine-api/cluster-autoscaler-operator-5f49d774cd-vdb8r" Dec 05 12:50:08.248106 master-0 kubenswrapper[29936]: I1205 12:50:08.248035 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkqq7\" (UniqueName: \"kubernetes.io/projected/a280c582-685e-47ac-bf6b-248aa0c129a9-kube-api-access-xkqq7\") pod \"cluster-baremetal-operator-78f758c7b9-5xg2k\" (UID: \"a280c582-685e-47ac-bf6b-248aa0c129a9\") " pod="openshift-machine-api/cluster-baremetal-operator-78f758c7b9-5xg2k" Dec 05 12:50:08.255111 master-0 kubenswrapper[29936]: I1205 12:50:08.248492 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lckv7\" (UniqueName: \"kubernetes.io/projected/665c4362-e2e5-4f96-92c0-1746c63c7422-kube-api-access-lckv7\") pod \"cloud-credential-operator-698c598cfc-jzrqj\" (UID: \"665c4362-e2e5-4f96-92c0-1746c63c7422\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-698c598cfc-jzrqj" Dec 05 12:50:08.255111 master-0 kubenswrapper[29936]: I1205 12:50:08.249173 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g7n7\" (UniqueName: \"kubernetes.io/projected/a14df948-1ec4-4785-ad33-28d1e7063959-kube-api-access-2g7n7\") pod \"insights-operator-55965856b6-q9qdg\" (UID: \"a14df948-1ec4-4785-ad33-28d1e7063959\") " pod="openshift-insights/insights-operator-55965856b6-q9qdg" Dec 05 12:50:08.255111 master-0 kubenswrapper[29936]: I1205 12:50:08.250545 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dwm5\" (UniqueName: \"kubernetes.io/projected/20a72c8b-0f12-446b-8a42-53d98864c8f8-kube-api-access-6dwm5\") pod \"router-default-5465c8b4db-dzlmb\" (UID: \"20a72c8b-0f12-446b-8a42-53d98864c8f8\") " pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:08.255111 master-0 kubenswrapper[29936]: I1205 12:50:08.251000 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tngh\" (UniqueName: \"kubernetes.io/projected/d53a4886-db25-43a1-825a-66a9a9a58590-kube-api-access-2tngh\") pod \"openshift-controller-manager-operator-6c8676f99d-546vz\" (UID: \"d53a4886-db25-43a1-825a-66a9a9a58590\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-6c8676f99d-546vz" Dec 05 12:50:08.255111 master-0 kubenswrapper[29936]: I1205 12:50:08.252994 29936 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 12:50:08.255111 master-0 kubenswrapper[29936]: I1205 12:50:08.254210 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-422c9\" (UniqueName: \"kubernetes.io/projected/ce9e2a6b-8ce7-477c-8bc7-24033243eabe-kube-api-access-422c9\") pod \"dns-default-rzl84\" (UID: \"ce9e2a6b-8ce7-477c-8bc7-24033243eabe\") " pod="openshift-dns/dns-default-rzl84" Dec 05 12:50:08.255457 master-0 kubenswrapper[29936]: I1205 12:50:08.255217 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht5kr\" (UniqueName: \"kubernetes.io/projected/f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb-kube-api-access-ht5kr\") pod \"olm-operator-7cd7dbb44c-xdbtz\" (UID: \"f7a85ed8-5cb1-44f3-a06d-9f8a6ab78ecb\") " pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:50:08.266334 master-0 kubenswrapper[29936]: I1205 12:50:08.265283 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55qpg\" (UniqueName: \"kubernetes.io/projected/ba095394-1873-4793-969d-3be979fa0771-kube-api-access-55qpg\") pod \"authentication-operator-6c968fdfdf-xxmfp\" (UID: \"ba095394-1873-4793-969d-3be979fa0771\") " pod="openshift-authentication-operator/authentication-operator-6c968fdfdf-xxmfp" Dec 05 12:50:08.287201 master-0 kubenswrapper[29936]: I1205 12:50:08.282482 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z8h9\" (UniqueName: \"kubernetes.io/projected/a757f807-e1bf-4f1e-9787-6b4acc8d09cf-kube-api-access-9z8h9\") pod \"ovnkube-control-plane-5df5548d54-7tvfb\" (UID: \"a757f807-e1bf-4f1e-9787-6b4acc8d09cf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5df5548d54-7tvfb" Dec 05 12:50:08.287201 master-0 kubenswrapper[29936]: I1205 12:50:08.284336 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1871a9d6-6369-4d08-816f-9c6310b61ddf-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f85974995-4vsjv\" (UID: \"1871a9d6-6369-4d08-816f-9c6310b61ddf\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f85974995-4vsjv" Dec 05 12:50:08.299985 master-0 kubenswrapper[29936]: I1205 12:50:08.297546 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps4ws\" (UniqueName: \"kubernetes.io/projected/58187662-b502-4d90-95ce-2aa91a81d256-kube-api-access-ps4ws\") pod \"cluster-monitoring-operator-7ff994598c-lgc7z\" (UID: \"58187662-b502-4d90-95ce-2aa91a81d256\") " pod="openshift-monitoring/cluster-monitoring-operator-7ff994598c-lgc7z" Dec 05 12:50:08.318351 master-0 kubenswrapper[29936]: I1205 12:50:08.317519 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr2r9\" (UniqueName: \"kubernetes.io/projected/153fec1f-a10b-4c6c-a997-60fa80c13a86-kube-api-access-dr2r9\") pod \"operator-controller-controller-manager-7cbd59c7f8-d9g7k\" (UID: \"153fec1f-a10b-4c6c-a997-60fa80c13a86\") " pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:50:08.336993 master-0 kubenswrapper[29936]: I1205 12:50:08.336010 29936 scope.go:117] "RemoveContainer" containerID="b7483a678d691fbf8a3207dd7d6ed1c739a3647a4a630049897502326cc17230" Dec 05 12:50:08.344446 master-0 kubenswrapper[29936]: I1205 12:50:08.343373 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wwcr9\" (UniqueName: \"kubernetes.io/projected/f119ffe4-16bd-49eb-916d-b18ba0d79b54-kube-api-access-wwcr9\") pod \"etcd-operator-5bf4d88c6f-dxd24\" (UID: \"f119ffe4-16bd-49eb-916d-b18ba0d79b54\") " pod="openshift-etcd-operator/etcd-operator-5bf4d88c6f-dxd24" Dec 05 12:50:08.349652 master-0 kubenswrapper[29936]: I1205 12:50:08.349609 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmjkp\" (UniqueName: \"kubernetes.io/projected/b13885ef-d2b5-4591-825d-446cf8729bc1-kube-api-access-xmjkp\") pod \"packageserver-58c5755b49-6dx4c\" (UID: \"b13885ef-d2b5-4591-825d-446cf8729bc1\") " pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:50:08.385830 master-0 kubenswrapper[29936]: I1205 12:50:08.385769 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnwdh\" (UniqueName: \"kubernetes.io/projected/a5338041-f213-46ef-9d81-248567ba958d-kube-api-access-bnwdh\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:08.391966 master-0 kubenswrapper[29936]: I1205 12:50:08.391918 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:08.391966 master-0 kubenswrapper[29936]: I1205 12:50:08.391994 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-proxy-ca-bundles\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:08.392227 master-0 kubenswrapper[29936]: I1205 12:50:08.392069 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-server-tls\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:08.392227 master-0 kubenswrapper[29936]: I1205 12:50:08.392104 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-client-ca\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:08.392308 master-0 kubenswrapper[29936]: I1205 12:50:08.392228 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c39d2089-d3bf-4556-b6ef-c362a08c21a2-serving-cert\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:08.392308 master-0 kubenswrapper[29936]: I1205 12:50:08.392283 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-client-certs\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:08.392308 master-0 kubenswrapper[29936]: I1205 12:50:08.392306 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:08.392787 master-0 kubenswrapper[29936]: I1205 12:50:08.392746 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:08.393260 master-0 kubenswrapper[29936]: I1205 12:50:08.393192 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-server-tls\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:08.393318 master-0 kubenswrapper[29936]: I1205 12:50:08.393205 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:08.393350 master-0 kubenswrapper[29936]: I1205 12:50:08.393302 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-client-ca\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:08.393485 master-0 kubenswrapper[29936]: I1205 12:50:08.393451 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c39d2089-d3bf-4556-b6ef-c362a08c21a2-serving-cert\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:08.393604 master-0 kubenswrapper[29936]: I1205 12:50:08.393569 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-client-certs\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:08.393810 master-0 kubenswrapper[29936]: I1205 12:50:08.393779 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-metrics-server-audit-profiles\") pod 
\"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:08.393849 master-0 kubenswrapper[29936]: I1205 12:50:08.393836 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-config\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:08.393907 master-0 kubenswrapper[29936]: I1205 12:50:08.393883 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-proxy-ca-bundles\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:08.394156 master-0 kubenswrapper[29936]: I1205 12:50:08.394128 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-config\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:08.394248 master-0 kubenswrapper[29936]: I1205 12:50:08.394215 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-metrics-server-audit-profiles\") pod \"metrics-server-54c5748c8c-kqs7s\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:08.398707 master-0 kubenswrapper[29936]: I1205 12:50:08.398651 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69n5s\" (UniqueName: \"kubernetes.io/projected/fb7003a6-4341-49eb-bec3-76ba8610fa12-kube-api-access-69n5s\") pod \"network-metrics-daemon-99djw\" (UID: \"fb7003a6-4341-49eb-bec3-76ba8610fa12\") " pod="openshift-multus/network-metrics-daemon-99djw" Dec 05 12:50:08.408034 master-0 kubenswrapper[29936]: I1205 12:50:08.407924 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmjn7\" (UniqueName: \"kubernetes.io/projected/bc18a83a-998e-458e-87f0-d5368da52e1b-kube-api-access-bmjn7\") pod \"node-resolver-f6j7m\" (UID: \"bc18a83a-998e-458e-87f0-d5368da52e1b\") " pod="openshift-dns/node-resolver-f6j7m" Dec 05 12:50:08.427079 master-0 kubenswrapper[29936]: I1205 12:50:08.426948 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxwwh\" (UniqueName: \"kubernetes.io/projected/e2e2d968-9946-4711-aaf0-3e3a03bff415-kube-api-access-pxwwh\") pod \"network-check-source-85d8db45d4-kkllg\" (UID: \"e2e2d968-9946-4711-aaf0-3e3a03bff415\") " pod="openshift-network-diagnostics/network-check-source-85d8db45d4-kkllg" Dec 05 12:50:08.446994 master-0 kubenswrapper[29936]: I1205 12:50:08.446947 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62nqj\" (UniqueName: \"kubernetes.io/projected/0dda6d9b-cb3a-413a-85af-ef08f15ea42e-kube-api-access-62nqj\") pod \"package-server-manager-67477646d4-9vfxw\" (UID: \"0dda6d9b-cb3a-413a-85af-ef08f15ea42e\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:50:08.465235 master-0 kubenswrapper[29936]: I1205 12:50:08.465157 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbg7w\" (UniqueName: \"kubernetes.io/projected/dbe144b5-3b78-4946-bbf9-b825b0e47b07-kube-api-access-mbg7w\") pod \"cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm\" (UID: \"dbe144b5-3b78-4946-bbf9-b825b0e47b07\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-758cf9d97b-fdtfm" Dec 05 12:50:08.486010 master-0 kubenswrapper[29936]: I1205 12:50:08.485955 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlnqb\" (UniqueName: \"kubernetes.io/projected/c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6-kube-api-access-mlnqb\") pod \"iptables-alerter-nwplt\" (UID: \"c60d8ba4-83ed-4b90-9359-0ea9e6ea3ef6\") " pod="openshift-network-operator/iptables-alerter-nwplt" Dec 05 12:50:08.495244 master-0 kubenswrapper[29936]: I1205 12:50:08.495198 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-tls\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:08.495512 master-0 kubenswrapper[29936]: I1205 12:50:08.495476 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-client-ca\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:50:08.495688 master-0 kubenswrapper[29936]: I1205 12:50:08.495639 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/60327040-f782-4cda-a32d-52a4f183073c-node-exporter-tls\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:08.495779 master-0 kubenswrapper[29936]: I1205 12:50:08.495742 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-config\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:50:08.495813 master-0 kubenswrapper[29936]: I1205 12:50:08.495794 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-client-ca\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:50:08.495927 master-0 kubenswrapper[29936]: I1205 12:50:08.495895 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e943438b-1de8-435c-8a19-accd6a6292a4-serving-cert\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" 
Dec 05 12:50:08.495986 master-0 kubenswrapper[29936]: I1205 12:50:08.495961 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3e283fe-a474-4f83-ad66-62971945060a-webhook-certs\") pod \"multus-admission-controller-8dbbb5754-j7x5j\" (UID: \"d3e283fe-a474-4f83-ad66-62971945060a\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" Dec 05 12:50:08.496059 master-0 kubenswrapper[29936]: I1205 12:50:08.496035 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:50:08.496139 master-0 kubenswrapper[29936]: I1205 12:50:08.496062 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-config\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:50:08.496250 master-0 kubenswrapper[29936]: I1205 12:50:08.496222 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e943438b-1de8-435c-8a19-accd6a6292a4-serving-cert\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:50:08.496409 master-0 kubenswrapper[29936]: I1205 12:50:08.496376 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:50:08.496409 master-0 kubenswrapper[29936]: I1205 12:50:08.496393 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3e283fe-a474-4f83-ad66-62971945060a-webhook-certs\") pod \"multus-admission-controller-8dbbb5754-j7x5j\" (UID: \"d3e283fe-a474-4f83-ad66-62971945060a\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" Dec 05 12:50:08.511920 master-0 kubenswrapper[29936]: I1205 12:50:08.511859 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr9jd\" (UniqueName: \"kubernetes.io/projected/c39d2089-d3bf-4556-b6ef-c362a08c21a2-kube-api-access-mr9jd\") pod \"controller-manager-b59c5b9bc-vh8fw\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:08.526835 master-0 kubenswrapper[29936]: I1205 12:50:08.526782 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69z2l\" (UniqueName: \"kubernetes.io/projected/f4a70855-80b5-4d6a-bed1-b42364940de0-kube-api-access-69z2l\") pod \"network-check-target-qsggt\" (UID: \"f4a70855-80b5-4d6a-bed1-b42364940de0\") " pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:50:08.566664 master-0 
kubenswrapper[29936]: I1205 12:50:08.566596 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqblj\" (UniqueName: \"kubernetes.io/projected/531b8927-92db-4e9d-9a0a-12ff948cdaad-kube-api-access-xqblj\") pod \"control-plane-machine-set-operator-7df95c79b5-ldg5j\" (UID: \"531b8927-92db-4e9d-9a0a-12ff948cdaad\") " pod="openshift-machine-api/control-plane-machine-set-operator-7df95c79b5-ldg5j" Dec 05 12:50:08.589517 master-0 kubenswrapper[29936]: I1205 12:50:08.589431 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/594aaded-5615-4bed-87ee-6173059a73be-kube-api-access\") pod \"kube-controller-manager-operator-848f645654-g6nj5\" (UID: \"594aaded-5615-4bed-87ee-6173059a73be\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-848f645654-g6nj5" Dec 05 12:50:08.607467 master-0 kubenswrapper[29936]: I1205 12:50:08.607409 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tncxt\" (UniqueName: \"kubernetes.io/projected/ebfbe878-1796-4a20-b3f0-76165038252e-kube-api-access-tncxt\") pod \"redhat-marketplace-dmnvq\" (UID: \"ebfbe878-1796-4a20-b3f0-76165038252e\") " pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:50:08.635878 master-0 kubenswrapper[29936]: E1205 12:50:08.635813 29936 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 05 12:50:08.635878 master-0 kubenswrapper[29936]: E1205 12:50:08.635869 29936 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-3-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 05 12:50:08.636102 master-0 kubenswrapper[29936]: E1205 12:50:08.635971 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d215811-6210-4ec2-8356-f1533dc43f65-kube-api-access podName:4d215811-6210-4ec2-8356-f1533dc43f65 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:09.135931563 +0000 UTC m=+6.268011244 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/4d215811-6210-4ec2-8356-f1533dc43f65-kube-api-access") pod "installer-3-master-0" (UID: "4d215811-6210-4ec2-8356-f1533dc43f65") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 05 12:50:08.651332 master-0 kubenswrapper[29936]: I1205 12:50:08.651269 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2gd8\" (UniqueName: \"kubernetes.io/projected/1e6babfe-724a-4eab-bb3b-bc318bf57b70-kube-api-access-c2gd8\") pod \"marketplace-operator-f797b99b6-vwhxt\" (UID: \"1e6babfe-724a-4eab-bb3b-bc318bf57b70\") " pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:50:08.674979 master-0 kubenswrapper[29936]: I1205 12:50:08.674239 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwrwm\" (UniqueName: \"kubernetes.io/projected/f2635f9f-219b-4d03-b5b3-496c0c836fae-kube-api-access-fwrwm\") pod \"tuned-dcvtr\" (UID: \"f2635f9f-219b-4d03-b5b3-496c0c836fae\") " pod="openshift-cluster-node-tuning-operator/tuned-dcvtr" Dec 05 12:50:08.687821 master-0 kubenswrapper[29936]: I1205 12:50:08.687728 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7ftf\" (UniqueName: \"kubernetes.io/projected/a45f340c-0eca-4460-8961-4ca360467eeb-kube-api-access-r7ftf\") pod \"openshift-config-operator-68758cbcdb-tmzpj\" (UID: \"a45f340c-0eca-4460-8961-4ca360467eeb\") " pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:50:08.698832 master-0 kubenswrapper[29936]: I1205 12:50:08.698799 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d215811-6210-4ec2-8356-f1533dc43f65-kube-api-access\") pod \"4d215811-6210-4ec2-8356-f1533dc43f65\" (UID: \"4d215811-6210-4ec2-8356-f1533dc43f65\") " Dec 05 12:50:08.702228 master-0 kubenswrapper[29936]: I1205 12:50:08.701891 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d215811-6210-4ec2-8356-f1533dc43f65-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4d215811-6210-4ec2-8356-f1533dc43f65" (UID: "4d215811-6210-4ec2-8356-f1533dc43f65"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:50:08.713440 master-0 kubenswrapper[29936]: I1205 12:50:08.713372 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz4q6\" (UniqueName: \"kubernetes.io/projected/1478a21e-b6ac-46fb-ad01-805ac71f0a79-kube-api-access-fz4q6\") pod \"machine-config-controller-7c6d64c4cd-8qxz6\" (UID: \"1478a21e-b6ac-46fb-ad01-805ac71f0a79\") " pod="openshift-machine-config-operator/machine-config-controller-7c6d64c4cd-8qxz6" Dec 05 12:50:08.730116 master-0 kubenswrapper[29936]: I1205 12:50:08.730054 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwz29\" (UniqueName: \"kubernetes.io/projected/db2e54b6-4879-40f4-9359-a8b0c31e76c2-kube-api-access-nwz29\") pod \"catalog-operator-fbc6455c4-jmn7x\" (UID: \"db2e54b6-4879-40f4-9359-a8b0c31e76c2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:50:08.745965 master-0 kubenswrapper[29936]: I1205 12:50:08.745919 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb42t\" (UniqueName: \"kubernetes.io/projected/95d8fb27-8b2b-4749-add3-9e9b16edb693-kube-api-access-fb42t\") pod \"machine-config-daemon-45nwc\" (UID: \"95d8fb27-8b2b-4749-add3-9e9b16edb693\") " pod="openshift-machine-config-operator/machine-config-daemon-45nwc" Dec 05 12:50:08.773749 master-0 kubenswrapper[29936]: I1205 12:50:08.773701 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bjs8\" (UniqueName: \"kubernetes.io/projected/4e9ba71a-d1b5-4986-babe-2c15c19f9cc2-kube-api-access-4bjs8\") pod \"kube-state-metrics-5857974f64-8p9n7\" (UID: \"4e9ba71a-d1b5-4986-babe-2c15c19f9cc2\") " pod="openshift-monitoring/kube-state-metrics-5857974f64-8p9n7" Dec 05 12:50:08.793696 master-0 kubenswrapper[29936]: I1205 12:50:08.793648 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6wsq\" (UniqueName: \"kubernetes.io/projected/7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f-kube-api-access-b6wsq\") pod \"cluster-samples-operator-797cfd8b47-6v84m\" (UID: \"7c7a79c9-cf12-4bd6-a1fe-cf36e11eab9f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-797cfd8b47-6v84m" Dec 05 12:50:08.801301 master-0 kubenswrapper[29936]: I1205 12:50:08.801252 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4d215811-6210-4ec2-8356-f1533dc43f65-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:50:08.808578 master-0 kubenswrapper[29936]: I1205 12:50:08.808535 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d0792bf-e2da-4ee7-91fe-032299cea42f-kube-api-access\") pod \"cluster-version-operator-6d5d5dcc89-gktn5\" (UID: \"7d0792bf-e2da-4ee7-91fe-032299cea42f\") " pod="openshift-cluster-version/cluster-version-operator-6d5d5dcc89-gktn5" Dec 05 12:50:08.827065 master-0 kubenswrapper[29936]: I1205 12:50:08.827028 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp957\" (UniqueName: \"kubernetes.io/projected/60327040-f782-4cda-a32d-52a4f183073c-kube-api-access-zp957\") pod \"node-exporter-z2nmc\" (UID: \"60327040-f782-4cda-a32d-52a4f183073c\") " pod="openshift-monitoring/node-exporter-z2nmc" Dec 05 12:50:08.849125 master-0 kubenswrapper[29936]: I1205 12:50:08.849070 29936 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-bound-sa-token\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:50:08.871311 master-0 kubenswrapper[29936]: I1205 12:50:08.871238 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nml2g\" (UniqueName: \"kubernetes.io/projected/a2acba71-b9dc-4b85-be35-c995b8be2f19-kube-api-access-nml2g\") pod \"cluster-node-tuning-operator-85cff47f46-p9xtc\" (UID: \"a2acba71-b9dc-4b85-be35-c995b8be2f19\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-85cff47f46-p9xtc" Dec 05 12:50:08.891512 master-0 kubenswrapper[29936]: I1205 12:50:08.891434 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48ns8\" (UniqueName: \"kubernetes.io/projected/480c1f6e-0e13-49f9-bc4e-07350842f16c-kube-api-access-48ns8\") pod \"migrator-74b7b57c65-fp4s6\" (UID: \"480c1f6e-0e13-49f9-bc4e-07350842f16c\") " pod="openshift-kube-storage-version-migrator/migrator-74b7b57c65-fp4s6" Dec 05 12:50:08.907387 master-0 kubenswrapper[29936]: I1205 12:50:08.907324 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5p5d\" (UniqueName: \"kubernetes.io/projected/5f0c6889-0739-48a3-99cd-6db9d1f83242-kube-api-access-p5p5d\") pod \"dns-operator-7c56cf9b74-z9g7c\" (UID: \"5f0c6889-0739-48a3-99cd-6db9d1f83242\") " pod="openshift-dns-operator/dns-operator-7c56cf9b74-z9g7c" Dec 05 12:50:08.927658 master-0 kubenswrapper[29936]: I1205 12:50:08.927616 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss5kh\" (UniqueName: \"kubernetes.io/projected/3d96c85a-fc88-46af-83d5-6c71ec6e2c23-kube-api-access-ss5kh\") pod \"cluster-storage-operator-dcf7fc84b-vxmv7\" (UID: \"3d96c85a-fc88-46af-83d5-6c71ec6e2c23\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-dcf7fc84b-vxmv7" Dec 05 12:50:08.947926 master-0 kubenswrapper[29936]: I1205 12:50:08.947878 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqvfm\" (UniqueName: \"kubernetes.io/projected/5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76-kube-api-access-nqvfm\") pod \"openshift-state-metrics-5974b6b869-w9l2z\" (UID: \"5a5feb84-8f02-4c13-b7e3-82a7ac8f7f76\") " pod="openshift-monitoring/openshift-state-metrics-5974b6b869-w9l2z" Dec 05 12:50:08.967715 master-0 kubenswrapper[29936]: I1205 12:50:08.967665 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x59kd\" (UniqueName: \"kubernetes.io/projected/f725fa37-ef11-479a-8cf9-f4b90fe5e7a1-kube-api-access-x59kd\") pod \"multus-5nqhk\" (UID: \"f725fa37-ef11-479a-8cf9-f4b90fe5e7a1\") " pod="openshift-multus/multus-5nqhk" Dec 05 12:50:08.989174 master-0 kubenswrapper[29936]: I1205 12:50:08.989089 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjp62\" (UniqueName: \"kubernetes.io/projected/d72b2b71-27b2-4aff-bf69-7054a9556318-kube-api-access-wjp62\") pod \"apiserver-5bdfbf6949-2bhqv\" (UID: \"d72b2b71-27b2-4aff-bf69-7054a9556318\") " pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:09.016120 master-0 kubenswrapper[29936]: I1205 12:50:09.016065 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjkjz\" (UniqueName: 
\"kubernetes.io/projected/dc5db54b-094f-4c36-a0ad-042e9fc2b61d-kube-api-access-tjkjz\") pod \"machine-config-server-4s89l\" (UID: \"dc5db54b-094f-4c36-a0ad-042e9fc2b61d\") " pod="openshift-machine-config-operator/machine-config-server-4s89l" Dec 05 12:50:09.030125 master-0 kubenswrapper[29936]: I1205 12:50:09.030073 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph9w6\" (UniqueName: \"kubernetes.io/projected/ce3d73c1-f4bd-4c91-936a-086dfa5e3460-kube-api-access-ph9w6\") pod \"cluster-olm-operator-56fcb6cc5f-q9njf\" (UID: \"ce3d73c1-f4bd-4c91-936a-086dfa5e3460\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-56fcb6cc5f-q9njf" Dec 05 12:50:09.047100 master-0 kubenswrapper[29936]: I1205 12:50:09.046437 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-bound-sa-token\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:50:09.067712 master-0 kubenswrapper[29936]: I1205 12:50:09.067646 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vx2z\" (UniqueName: \"kubernetes.io/projected/708bf629-9949-4b79-a88a-c73ba033475b-kube-api-access-6vx2z\") pod \"multus-additional-cni-plugins-prt97\" (UID: \"708bf629-9949-4b79-a88a-c73ba033475b\") " pod="openshift-multus/multus-additional-cni-plugins-prt97" Dec 05 12:50:09.087874 master-0 kubenswrapper[29936]: I1205 12:50:09.087806 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjbwh\" (UniqueName: \"kubernetes.io/projected/d3e283fe-a474-4f83-ad66-62971945060a-kube-api-access-pjbwh\") pod \"multus-admission-controller-8dbbb5754-j7x5j\" (UID: \"d3e283fe-a474-4f83-ad66-62971945060a\") " pod="openshift-multus/multus-admission-controller-8dbbb5754-j7x5j" Dec 05 12:50:09.106005 master-0 kubenswrapper[29936]: I1205 12:50:09.105944 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh58c\" (UniqueName: \"kubernetes.io/projected/f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb-kube-api-access-dh58c\") pod \"apiserver-845d4454f8-kcq9s\" (UID: \"f118b6b2-40f8-4e1b-bf14-5e5cfcd155eb\") " pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:09.126523 master-0 kubenswrapper[29936]: I1205 12:50:09.126474 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5hdg\" (UniqueName: \"kubernetes.io/projected/38941513-e968-45f1-9cb2-b63d40338f36-kube-api-access-t5hdg\") pod \"cluster-image-registry-operator-6fb9f88b7-sxxpq\" (UID: \"38941513-e968-45f1-9cb2-b63d40338f36\") " pod="openshift-image-registry/cluster-image-registry-operator-6fb9f88b7-sxxpq" Dec 05 12:50:09.156575 master-0 kubenswrapper[29936]: I1205 12:50:09.156512 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqjgb\" (UniqueName: \"kubernetes.io/projected/365bf663-fd5b-44df-a327-0438995c015d-kube-api-access-lqjgb\") pod \"machine-config-operator-dc5d7666f-dqtsx\" (UID: \"365bf663-fd5b-44df-a327-0438995c015d\") " pod="openshift-machine-config-operator/machine-config-operator-dc5d7666f-dqtsx" Dec 05 12:50:09.166516 master-0 kubenswrapper[29936]: I1205 12:50:09.166449 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtjln\" (UniqueName: 
\"kubernetes.io/projected/4b7f0d8d-a2bf-4550-b6e6-1c56adae827e-kube-api-access-xtjln\") pod \"openshift-apiserver-operator-7bf7f6b755-b2pxs\" (UID: \"4b7f0d8d-a2bf-4550-b6e6-1c56adae827e\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-7bf7f6b755-b2pxs" Dec 05 12:50:09.173091 master-0 kubenswrapper[29936]: I1205 12:50:09.173010 29936 request.go:700] Waited for 4.260686285s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ovn-kubernetes/serviceaccounts/ovn-kubernetes-node/token Dec 05 12:50:09.193926 master-0 kubenswrapper[29936]: I1205 12:50:09.193854 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmq98\" (UniqueName: \"kubernetes.io/projected/4492c55f-701b-4ec8-ada1-0a5dc126d405-kube-api-access-dmq98\") pod \"ovnkube-node-9vqtb\" (UID: \"4492c55f-701b-4ec8-ada1-0a5dc126d405\") " pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:09.210158 master-0 kubenswrapper[29936]: I1205 12:50:09.210011 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26x2z\" (UniqueName: \"kubernetes.io/projected/49760d62-02e5-4882-b47f-663102b04946-kube-api-access-26x2z\") pod \"csi-snapshot-controller-operator-6bc8656fdc-zn7hv\" (UID: \"49760d62-02e5-4882-b47f-663102b04946\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6bc8656fdc-zn7hv" Dec 05 12:50:09.226377 master-0 kubenswrapper[29936]: I1205 12:50:09.226318 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/807d9093-aa67-4840-b5be-7f3abcc1beed-kube-api-access\") pod \"kube-apiserver-operator-765d9ff747-rw57t\" (UID: \"807d9093-aa67-4840-b5be-7f3abcc1beed\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-765d9ff747-rw57t" Dec 05 12:50:09.248606 master-0 kubenswrapper[29936]: I1205 12:50:09.248522 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfknz\" (UniqueName: \"kubernetes.io/projected/e943438b-1de8-435c-8a19-accd6a6292a4-kube-api-access-lfknz\") pod \"route-controller-manager-554555dbc9-szqjx\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:50:09.267280 master-0 kubenswrapper[29936]: I1205 12:50:09.267217 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nbxt\" (UniqueName: \"kubernetes.io/projected/db27bee9-3d33-4c4a-b38b-72f7cec77c7a-kube-api-access-2nbxt\") pod \"machine-approver-74d9cbffbc-r7kbd\" (UID: \"db27bee9-3d33-4c4a-b38b-72f7cec77c7a\") " pod="openshift-cluster-machine-approver/machine-approver-74d9cbffbc-r7kbd" Dec 05 12:50:09.286208 master-0 kubenswrapper[29936]: I1205 12:50:09.286122 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtvzs\" (UniqueName: \"kubernetes.io/projected/7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9-kube-api-access-dtvzs\") pod \"kube-storage-version-migrator-operator-b9c5dfc78-2n8gt\" (UID: \"7889e61e-b7ae-4ab6-a7d3-a1c5c83243b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b9c5dfc78-2n8gt" Dec 05 12:50:09.309921 master-0 kubenswrapper[29936]: I1205 12:50:09.309842 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c69rc\" (UniqueName: 
\"kubernetes.io/projected/99996137-2621-458b-980d-584b3640d4ad-kube-api-access-c69rc\") pod \"prometheus-operator-6c74d9cb9f-2nwvk\" (UID: \"99996137-2621-458b-980d-584b3640d4ad\") " pod="openshift-monitoring/prometheus-operator-6c74d9cb9f-2nwvk" Dec 05 12:50:09.326383 master-0 kubenswrapper[29936]: I1205 12:50:09.326315 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxxw7\" (UniqueName: \"kubernetes.io/projected/a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7-kube-api-access-fxxw7\") pod \"ingress-operator-8649c48786-7xrk6\" (UID: \"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7\") " pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" Dec 05 12:50:09.350293 master-0 kubenswrapper[29936]: I1205 12:50:09.349552 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpxqg\" (UniqueName: \"kubernetes.io/projected/8defe125-1529-4091-adff-e9d17a2b298f-kube-api-access-jpxqg\") pod \"certified-operators-4p8p6\" (UID: \"8defe125-1529-4091-adff-e9d17a2b298f\") " pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:50:09.375089 master-0 kubenswrapper[29936]: I1205 12:50:09.375032 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvbfq\" (UniqueName: \"kubernetes.io/projected/b8233dad-bd19-4842-a4d5-cfa84f1feb83-kube-api-access-mvbfq\") pod \"network-node-identity-xwx26\" (UID: \"b8233dad-bd19-4842-a4d5-cfa84f1feb83\") " pod="openshift-network-node-identity/network-node-identity-xwx26" Dec 05 12:50:09.389359 master-0 kubenswrapper[29936]: I1205 12:50:09.389308 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flxbg\" (UniqueName: \"kubernetes.io/projected/f3792522-fec6-4022-90ac-0b8467fcd625-kube-api-access-flxbg\") pod \"service-ca-operator-77758bc754-hfqsp\" (UID: \"f3792522-fec6-4022-90ac-0b8467fcd625\") " pod="openshift-service-ca-operator/service-ca-operator-77758bc754-hfqsp" Dec 05 12:50:09.408081 master-0 kubenswrapper[29936]: E1205 12:50:09.408023 29936 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="4.222s" Dec 05 12:50:09.408212 master-0 kubenswrapper[29936]: I1205 12:50:09.408088 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:50:09.408212 master-0 kubenswrapper[29936]: I1205 12:50:09.408210 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:50:09.408308 master-0 kubenswrapper[29936]: I1205 12:50:09.408224 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Dec 05 12:50:09.408308 master-0 kubenswrapper[29936]: I1205 12:50:09.408236 29936 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="358b1a0e-7600-48d9-9639-b356d354dad2" Dec 05 12:50:09.408308 master-0 kubenswrapper[29936]: I1205 12:50:09.408279 29936 status_manager.go:379] "Container startup changed for unknown container" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" containerID="cri-o://b7483a678d691fbf8a3207dd7d6ed1c739a3647a4a630049897502326cc17230" Dec 05 12:50:09.408308 master-0 kubenswrapper[29936]: I1205 12:50:09.408289 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:09.418953 master-0 kubenswrapper[29936]: I1205 12:50:09.418894 29936 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Dec 05 12:50:09.455923 master-0 kubenswrapper[29936]: I1205 12:50:09.455844 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:50:09.455923 master-0 kubenswrapper[29936]: I1205 12:50:09.455886 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"4d215811-6210-4ec2-8356-f1533dc43f65","Type":"ContainerDied","Data":"a8ddc41afaf0c618d55e894f2ce13b792424c9105a66a883a048089812798f25"} Dec 05 12:50:09.455923 master-0 kubenswrapper[29936]: I1205 12:50:09.455910 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8ddc41afaf0c618d55e894f2ce13b792424c9105a66a883a048089812798f25" Dec 05 12:50:09.455923 master-0 kubenswrapper[29936]: I1205 12:50:09.455922 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" event={"ID":"20a72c8b-0f12-446b-8a42-53d98864c8f8","Type":"ContainerStarted","Data":"c07dce8ebd5bd9971651cc28257fc2d3808e37b131bfe3f1eea1690c62e85ec3"} Dec 05 12:50:09.455923 master-0 kubenswrapper[29936]: I1205 12:50:09.455940 29936 status_manager.go:379] "Container startup changed for unknown container" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" containerID="cri-o://b7483a678d691fbf8a3207dd7d6ed1c739a3647a4a630049897502326cc17230" Dec 05 12:50:09.455923 master-0 kubenswrapper[29936]: I1205 12:50:09.455950 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.455962 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.455976 29936 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="358b1a0e-7600-48d9-9639-b356d354dad2" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456024 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456035 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456045 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456063 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456073 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456082 29936 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456091 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456101 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456153 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456190 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456210 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456226 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-7c85c4dffd-sbvlr" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456235 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456260 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-rzl84" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456275 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rzl84" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456285 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:50:09.456384 master-0 kubenswrapper[29936]: I1205 12:50:09.456313 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:50:09.457232 master-0 kubenswrapper[29936]: I1205 12:50:09.457153 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:50:09.457370 master-0 kubenswrapper[29936]: I1205 12:50:09.457335 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:50:09.457438 master-0 kubenswrapper[29936]: I1205 12:50:09.457402 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:50:09.457481 master-0 kubenswrapper[29936]: I1205 12:50:09.457443 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-58c5755b49-6dx4c" Dec 05 12:50:09.457481 master-0 kubenswrapper[29936]: I1205 12:50:09.457468 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:50:09.457561 master-0 kubenswrapper[29936]: I1205 12:50:09.457485 29936 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-68758cbcdb-tmzpj" Dec 05 12:50:09.457561 master-0 kubenswrapper[29936]: I1205 12:50:09.457496 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:09.457561 master-0 kubenswrapper[29936]: I1205 12:50:09.457506 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:09.457561 master-0 kubenswrapper[29936]: I1205 12:50:09.457524 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:09.457561 master-0 kubenswrapper[29936]: I1205 12:50:09.457534 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Dec 05 12:50:09.457561 master-0 kubenswrapper[29936]: I1205 12:50:09.457555 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:50:09.462163 master-0 kubenswrapper[29936]: I1205 12:50:09.462076 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:09.504886 master-0 kubenswrapper[29936]: I1205 12:50:09.504795 29936 scope.go:117] "RemoveContainer" containerID="a62572546062b2df435bc85f27bda94544b75d65580e59f21beaef134a43b821" Dec 05 12:50:10.237840 master-0 kubenswrapper[29936]: I1205 12:50:10.237714 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-8649c48786-7xrk6_a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7/ingress-operator/4.log" Dec 05 12:50:10.238896 master-0 kubenswrapper[29936]: I1205 12:50:10.238151 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-8649c48786-7xrk6" event={"ID":"a517c3c4-0fbe-4850-a9b3-ff21c42ee5d7","Type":"ContainerStarted","Data":"d0641b4ee24b8f21052f89e374fa80db5fc3dc2e1b6253d50702e501c5fd180c"} Dec 05 12:50:10.610320 master-0 kubenswrapper[29936]: I1205 12:50:10.610077 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pzxlc"] Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: E1205 12:50:10.610442 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4957e218-f580-41a9-866a-fd4f92a3c007" containerName="installer" Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: I1205 12:50:10.610458 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="4957e218-f580-41a9-866a-fd4f92a3c007" containerName="installer" Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: E1205 12:50:10.610476 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="076dafdf-a5d2-4e2d-9c38-6932910f7327" containerName="installer" Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: I1205 12:50:10.610482 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="076dafdf-a5d2-4e2d-9c38-6932910f7327" containerName="installer" Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: E1205 12:50:10.610497 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21de9318-06b4-42ba-8791-6d22055a04f2" containerName="installer" Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: I1205 12:50:10.610502 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="21de9318-06b4-42ba-8791-6d22055a04f2" 
containerName="installer" Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: E1205 12:50:10.610510 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="954c5c79-a96c-4c47-a4bc-024aaf4dc789" containerName="collect-profiles" Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: I1205 12:50:10.610516 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="954c5c79-a96c-4c47-a4bc-024aaf4dc789" containerName="collect-profiles" Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: E1205 12:50:10.610528 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d627fcf3-2a80-4739-add9-e21ad4efc6eb" containerName="installer" Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: I1205 12:50:10.610534 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="d627fcf3-2a80-4739-add9-e21ad4efc6eb" containerName="installer" Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: E1205 12:50:10.610552 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2415969-33ad-418b-9df0-4a6c7bb279db" containerName="installer" Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: I1205 12:50:10.610558 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2415969-33ad-418b-9df0-4a6c7bb279db" containerName="installer" Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: E1205 12:50:10.610576 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96fa3513-5467-4b0f-a03d-9279d36317bd" containerName="installer" Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: I1205 12:50:10.610582 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="96fa3513-5467-4b0f-a03d-9279d36317bd" containerName="installer" Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: E1205 12:50:10.610591 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af196a48-6fcc-47d1-95ac-7c0acd63dd21" containerName="installer" Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: I1205 12:50:10.610599 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="af196a48-6fcc-47d1-95ac-7c0acd63dd21" containerName="installer" Dec 05 12:50:10.610602 master-0 kubenswrapper[29936]: E1205 12:50:10.610609 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d215811-6210-4ec2-8356-f1533dc43f65" containerName="installer" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: I1205 12:50:10.610615 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d215811-6210-4ec2-8356-f1533dc43f65" containerName="installer" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: E1205 12:50:10.610626 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7807b90-1059-4c0d-9224-a0d57a572bfc" containerName="assisted-installer-controller" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: I1205 12:50:10.610632 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7807b90-1059-4c0d-9224-a0d57a572bfc" containerName="assisted-installer-controller" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: E1205 12:50:10.610646 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: I1205 12:50:10.610652 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: E1205 12:50:10.610674 29936 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="565d5ef6-b0e7-4f04-9460-61f1d3903d37" containerName="installer" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: I1205 12:50:10.610680 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="565d5ef6-b0e7-4f04-9460-61f1d3903d37" containerName="installer" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: I1205 12:50:10.610806 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b47694fcc32464ab24d09c23d6efb57" containerName="kube-controller-manager" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: I1205 12:50:10.610822 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="565d5ef6-b0e7-4f04-9460-61f1d3903d37" containerName="installer" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: I1205 12:50:10.610835 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="4957e218-f580-41a9-866a-fd4f92a3c007" containerName="installer" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: I1205 12:50:10.610852 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="96fa3513-5467-4b0f-a03d-9279d36317bd" containerName="installer" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: I1205 12:50:10.610867 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="af196a48-6fcc-47d1-95ac-7c0acd63dd21" containerName="installer" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: I1205 12:50:10.610878 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7807b90-1059-4c0d-9224-a0d57a572bfc" containerName="assisted-installer-controller" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: I1205 12:50:10.610887 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="076dafdf-a5d2-4e2d-9c38-6932910f7327" containerName="installer" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: I1205 12:50:10.610895 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2415969-33ad-418b-9df0-4a6c7bb279db" containerName="installer" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: I1205 12:50:10.610908 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d215811-6210-4ec2-8356-f1533dc43f65" containerName="installer" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: I1205 12:50:10.610916 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="954c5c79-a96c-4c47-a4bc-024aaf4dc789" containerName="collect-profiles" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: I1205 12:50:10.610925 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="21de9318-06b4-42ba-8791-6d22055a04f2" containerName="installer" Dec 05 12:50:10.611146 master-0 kubenswrapper[29936]: I1205 12:50:10.610933 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="d627fcf3-2a80-4739-add9-e21ad4efc6eb" containerName="installer" Dec 05 12:50:10.611782 master-0 kubenswrapper[29936]: I1205 12:50:10.611462 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pzxlc" Dec 05 12:50:10.624098 master-0 kubenswrapper[29936]: I1205 12:50:10.621595 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 12:50:10.633284 master-0 kubenswrapper[29936]: I1205 12:50:10.631219 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pzxlc"] Dec 05 12:50:10.637551 master-0 kubenswrapper[29936]: I1205 12:50:10.635021 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 12:50:10.649292 master-0 kubenswrapper[29936]: I1205 12:50:10.648782 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert\") pod \"ingress-canary-pzxlc\" (UID: \"0936af9a-19c5-4950-b2d9-934c426bdf77\") " pod="openshift-ingress-canary/ingress-canary-pzxlc" Dec 05 12:50:10.649292 master-0 kubenswrapper[29936]: I1205 12:50:10.648895 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mttpm\" (UniqueName: \"kubernetes.io/projected/0936af9a-19c5-4950-b2d9-934c426bdf77-kube-api-access-mttpm\") pod \"ingress-canary-pzxlc\" (UID: \"0936af9a-19c5-4950-b2d9-934c426bdf77\") " pod="openshift-ingress-canary/ingress-canary-pzxlc" Dec 05 12:50:10.655953 master-0 kubenswrapper[29936]: I1205 12:50:10.655562 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 12:50:10.750616 master-0 kubenswrapper[29936]: I1205 12:50:10.750558 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert\") pod \"ingress-canary-pzxlc\" (UID: \"0936af9a-19c5-4950-b2d9-934c426bdf77\") " pod="openshift-ingress-canary/ingress-canary-pzxlc" Dec 05 12:50:10.750616 master-0 kubenswrapper[29936]: I1205 12:50:10.750624 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mttpm\" (UniqueName: \"kubernetes.io/projected/0936af9a-19c5-4950-b2d9-934c426bdf77-kube-api-access-mttpm\") pod \"ingress-canary-pzxlc\" (UID: \"0936af9a-19c5-4950-b2d9-934c426bdf77\") " pod="openshift-ingress-canary/ingress-canary-pzxlc" Dec 05 12:50:10.751696 master-0 kubenswrapper[29936]: E1205 12:50:10.750888 29936 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Dec 05 12:50:10.751696 master-0 kubenswrapper[29936]: E1205 12:50:10.750948 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert podName:0936af9a-19c5-4950-b2d9-934c426bdf77 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:11.250928036 +0000 UTC m=+8.383007717 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert") pod "ingress-canary-pzxlc" (UID: "0936af9a-19c5-4950-b2d9-934c426bdf77") : secret "canary-serving-cert" not found Dec 05 12:50:10.752859 master-0 kubenswrapper[29936]: I1205 12:50:10.752791 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:50:10.753699 master-0 kubenswrapper[29936]: I1205 12:50:10.752971 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:10.761237 master-0 kubenswrapper[29936]: I1205 12:50:10.761154 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-67477646d4-9vfxw" Dec 05 12:50:10.793268 master-0 kubenswrapper[29936]: I1205 12:50:10.792197 29936 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 05 12:50:10.797369 master-0 kubenswrapper[29936]: I1205 12:50:10.797315 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mttpm\" (UniqueName: \"kubernetes.io/projected/0936af9a-19c5-4950-b2d9-934c426bdf77-kube-api-access-mttpm\") pod \"ingress-canary-pzxlc\" (UID: \"0936af9a-19c5-4950-b2d9-934c426bdf77\") " pod="openshift-ingress-canary/ingress-canary-pzxlc" Dec 05 12:50:10.894308 master-0 kubenswrapper[29936]: I1205 12:50:10.894084 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:50:10.898523 master-0 kubenswrapper[29936]: I1205 12:50:10.898468 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:50:11.228721 master-0 kubenswrapper[29936]: I1205 12:50:11.228631 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:50:11.247427 master-0 kubenswrapper[29936]: I1205 12:50:11.247364 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:11.259303 master-0 kubenswrapper[29936]: I1205 12:50:11.258349 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert\") pod \"ingress-canary-pzxlc\" (UID: \"0936af9a-19c5-4950-b2d9-934c426bdf77\") " pod="openshift-ingress-canary/ingress-canary-pzxlc" Dec 05 12:50:11.259303 master-0 kubenswrapper[29936]: E1205 12:50:11.258630 29936 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Dec 05 12:50:11.259303 master-0 kubenswrapper[29936]: E1205 12:50:11.258744 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert podName:0936af9a-19c5-4950-b2d9-934c426bdf77 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:12.258714577 +0000 UTC m=+9.390794258 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert") pod "ingress-canary-pzxlc" (UID: "0936af9a-19c5-4950-b2d9-934c426bdf77") : secret "canary-serving-cert" not found Dec 05 12:50:11.278547 master-0 kubenswrapper[29936]: I1205 12:50:11.278488 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wfk7f" Dec 05 12:50:11.344550 master-0 kubenswrapper[29936]: I1205 12:50:11.344452 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:11.346995 master-0 kubenswrapper[29936]: I1205 12:50:11.346929 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5465c8b4db-dzlmb" Dec 05 12:50:11.446837 master-0 kubenswrapper[29936]: I1205 12:50:11.446705 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=18.446682258 podStartE2EDuration="18.446682258s" podCreationTimestamp="2025-12-05 12:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:50:11.445850835 +0000 UTC m=+8.577930526" watchObservedRunningTime="2025-12-05 12:50:11.446682258 +0000 UTC m=+8.578761939" Dec 05 12:50:11.989771 master-0 kubenswrapper[29936]: I1205 12:50:11.989695 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:50:11.990012 master-0 kubenswrapper[29936]: I1205 12:50:11.989854 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:11.998190 master-0 kubenswrapper[29936]: I1205 12:50:11.998119 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-7cbd59c7f8-d9g7k" Dec 05 12:50:12.202395 master-0 kubenswrapper[29936]: I1205 12:50:12.201916 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:50:12.209830 master-0 kubenswrapper[29936]: I1205 12:50:12.209322 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:50:12.210955 master-0 kubenswrapper[29936]: I1205 12:50:12.210510 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5f469489fd-59qjd"] Dec 05 12:50:12.211872 master-0 kubenswrapper[29936]: I1205 12:50:12.211289 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.218039 master-0 kubenswrapper[29936]: I1205 12:50:12.213723 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 12:50:12.218039 master-0 kubenswrapper[29936]: I1205 12:50:12.215121 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 12:50:12.218039 master-0 kubenswrapper[29936]: I1205 12:50:12.215405 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-xzgbv" Dec 05 12:50:12.218039 master-0 kubenswrapper[29936]: I1205 12:50:12.215610 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 12:50:12.218039 master-0 kubenswrapper[29936]: I1205 12:50:12.215805 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 12:50:12.218039 master-0 kubenswrapper[29936]: I1205 12:50:12.216120 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 12:50:12.220503 master-0 kubenswrapper[29936]: I1205 12:50:12.220073 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 12:50:12.220503 master-0 kubenswrapper[29936]: I1205 12:50:12.220228 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 12:50:12.220503 master-0 kubenswrapper[29936]: I1205 12:50:12.220307 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 12:50:12.220503 master-0 kubenswrapper[29936]: I1205 12:50:12.220469 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 12:50:12.221009 master-0 kubenswrapper[29936]: I1205 12:50:12.220935 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 12:50:12.221155 master-0 kubenswrapper[29936]: I1205 12:50:12.221126 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 12:50:12.223217 master-0 kubenswrapper[29936]: I1205 12:50:12.223165 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5f469489fd-59qjd"] Dec 05 12:50:12.232369 master-0 kubenswrapper[29936]: I1205 12:50:12.232304 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 12:50:12.234737 master-0 kubenswrapper[29936]: I1205 12:50:12.234691 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 12:50:12.288209 master-0 kubenswrapper[29936]: I1205 12:50:12.287296 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-audit-policies\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " 
pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.288209 master-0 kubenswrapper[29936]: I1205 12:50:12.287405 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert\") pod \"ingress-canary-pzxlc\" (UID: \"0936af9a-19c5-4950-b2d9-934c426bdf77\") " pod="openshift-ingress-canary/ingress-canary-pzxlc" Dec 05 12:50:12.288209 master-0 kubenswrapper[29936]: I1205 12:50:12.287439 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-session\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.288209 master-0 kubenswrapper[29936]: I1205 12:50:12.287473 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91de1093-448a-432c-bc02-f4d0492c2e2b-audit-dir\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.288209 master-0 kubenswrapper[29936]: I1205 12:50:12.287503 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.288209 master-0 kubenswrapper[29936]: I1205 12:50:12.287535 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.288209 master-0 kubenswrapper[29936]: I1205 12:50:12.287562 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.288209 master-0 kubenswrapper[29936]: I1205 12:50:12.287588 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.288209 master-0 kubenswrapper[29936]: I1205 12:50:12.287644 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.288209 master-0 kubenswrapper[29936]: I1205 12:50:12.287665 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26ccx\" (UniqueName: \"kubernetes.io/projected/91de1093-448a-432c-bc02-f4d0492c2e2b-kube-api-access-26ccx\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.288209 master-0 kubenswrapper[29936]: I1205 12:50:12.287698 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.288209 master-0 kubenswrapper[29936]: I1205 12:50:12.287720 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.288209 master-0 kubenswrapper[29936]: I1205 12:50:12.287744 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-error\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.288209 master-0 kubenswrapper[29936]: I1205 12:50:12.287772 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-login\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.289331 master-0 kubenswrapper[29936]: E1205 12:50:12.288628 29936 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Dec 05 12:50:12.289331 master-0 kubenswrapper[29936]: E1205 12:50:12.288690 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert podName:0936af9a-19c5-4950-b2d9-934c426bdf77 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:14.288671636 +0000 UTC m=+11.420751317 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert") pod "ingress-canary-pzxlc" (UID: "0936af9a-19c5-4950-b2d9-934c426bdf77") : secret "canary-serving-cert" not found Dec 05 12:50:12.391248 master-0 kubenswrapper[29936]: I1205 12:50:12.389615 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-audit-policies\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.391248 master-0 kubenswrapper[29936]: I1205 12:50:12.389766 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-session\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.391248 master-0 kubenswrapper[29936]: I1205 12:50:12.389813 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91de1093-448a-432c-bc02-f4d0492c2e2b-audit-dir\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.391248 master-0 kubenswrapper[29936]: I1205 12:50:12.389855 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.391248 master-0 kubenswrapper[29936]: I1205 12:50:12.389901 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.391248 master-0 kubenswrapper[29936]: I1205 12:50:12.389927 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.391248 master-0 kubenswrapper[29936]: I1205 12:50:12.389969 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.391248 master-0 kubenswrapper[29936]: I1205 12:50:12.390028 29936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.391248 master-0 kubenswrapper[29936]: I1205 12:50:12.390054 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26ccx\" (UniqueName: \"kubernetes.io/projected/91de1093-448a-432c-bc02-f4d0492c2e2b-kube-api-access-26ccx\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.391248 master-0 kubenswrapper[29936]: I1205 12:50:12.390101 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.391248 master-0 kubenswrapper[29936]: I1205 12:50:12.390134 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.391248 master-0 kubenswrapper[29936]: I1205 12:50:12.390162 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-error\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.391248 master-0 kubenswrapper[29936]: I1205 12:50:12.390218 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-login\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.393277 master-0 kubenswrapper[29936]: I1205 12:50:12.393167 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-audit-policies\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.393370 master-0 kubenswrapper[29936]: I1205 12:50:12.393305 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91de1093-448a-432c-bc02-f4d0492c2e2b-audit-dir\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.396057 master-0 kubenswrapper[29936]: E1205 12:50:12.395887 29936 configmap.go:193] Couldn't get 
configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Dec 05 12:50:12.396057 master-0 kubenswrapper[29936]: E1205 12:50:12.396013 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-cliconfig podName:91de1093-448a-432c-bc02-f4d0492c2e2b nodeName:}" failed. No retries permitted until 2025-12-05 12:50:12.895952185 +0000 UTC m=+10.028032046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-cliconfig") pod "oauth-openshift-5f469489fd-59qjd" (UID: "91de1093-448a-432c-bc02-f4d0492c2e2b") : configmap "v4-0-config-system-cliconfig" not found Dec 05 12:50:12.396997 master-0 kubenswrapper[29936]: I1205 12:50:12.396956 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-service-ca\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.397138 master-0 kubenswrapper[29936]: I1205 12:50:12.397069 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.397661 master-0 kubenswrapper[29936]: I1205 12:50:12.397477 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.397964 master-0 kubenswrapper[29936]: I1205 12:50:12.397919 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.398036 master-0 kubenswrapper[29936]: I1205 12:50:12.397970 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-session\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.398036 master-0 kubenswrapper[29936]: I1205 12:50:12.398008 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-login\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " 
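
The oauth-openshift pod hits the same publish-before-mount race: v4-0-config-system-cliconfig is not in the API when the first MountVolume.SetUp runs (the error above at 12:50:12.395), so a 500 ms retry is scheduled, and it succeeds at 12:50:12.903 below once the authentication operator has created the ConfigMap. Waiting for an object to appear can be expressed directly with client-go; a hedged sketch with a placeholder kubeconfig path:

    package main

    import (
        "context"
        "fmt"
        "time"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/apimachinery/pkg/util/wait"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // placeholder path
        if err != nil {
            panic(err)
        }
        client := kubernetes.NewForConfigOrDie(cfg)

        ns, name := "openshift-authentication", "v4-0-config-system-cliconfig"
        err = wait.PollUntilContextTimeout(context.Background(), 500*time.Millisecond, 2*time.Minute, true,
            func(ctx context.Context) (bool, error) {
                _, getErr := client.CoreV1().ConfigMaps(ns).Get(ctx, name, metav1.GetOptions{})
                if apierrors.IsNotFound(getErr) {
                    return false, nil // not published yet, keep polling
                }
                return getErr == nil, getErr
            })
        if err != nil {
            fmt.Println("configmap never showed up:", err)
            return
        }
        fmt.Println("configmap", name, "is available")
    }
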
pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.401281 master-0 kubenswrapper[29936]: I1205 12:50:12.401215 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-router-certs\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.401822 master-0 kubenswrapper[29936]: I1205 12:50:12.401755 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.402681 master-0 kubenswrapper[29936]: I1205 12:50:12.402636 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-error\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.431999 master-0 kubenswrapper[29936]: I1205 12:50:12.431854 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26ccx\" (UniqueName: \"kubernetes.io/projected/91de1093-448a-432c-bc02-f4d0492c2e2b-kube-api-access-26ccx\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.450709 master-0 kubenswrapper[29936]: I1205 12:50:12.450619 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:50:12.450987 master-0 kubenswrapper[29936]: I1205 12:50:12.450796 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:12.454419 master-0 kubenswrapper[29936]: I1205 12:50:12.454345 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qsggt" Dec 05 12:50:12.490853 master-0 kubenswrapper[29936]: I1205 12:50:12.490282 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=19.490260034 podStartE2EDuration="19.490260034s" podCreationTimestamp="2025-12-05 12:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:50:12.488594888 +0000 UTC m=+9.620674579" watchObservedRunningTime="2025-12-05 12:50:12.490260034 +0000 UTC m=+9.622339715" Dec 05 12:50:12.597512 master-0 kubenswrapper[29936]: I1205 12:50:12.597343 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-54dbc87ccb-twkmk"] Dec 05 12:50:12.598161 master-0 kubenswrapper[29936]: I1205 12:50:12.598136 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:12.600320 master-0 kubenswrapper[29936]: I1205 12:50:12.600277 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 12:50:12.600590 master-0 kubenswrapper[29936]: I1205 12:50:12.600542 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 12:50:12.601802 master-0 kubenswrapper[29936]: I1205 12:50:12.601770 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 12:50:12.602491 master-0 kubenswrapper[29936]: I1205 12:50:12.602461 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 12:50:12.615213 master-0 kubenswrapper[29936]: I1205 12:50:12.615132 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 12:50:12.631224 master-0 kubenswrapper[29936]: I1205 12:50:12.631130 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-54dbc87ccb-twkmk"] Dec 05 12:50:12.697558 master-0 kubenswrapper[29936]: I1205 12:50:12.697471 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3072fe-3e73-4a72-8a0d-b34518af240e-serving-cert\") pod \"console-operator-54dbc87ccb-twkmk\" (UID: \"bd3072fe-3e73-4a72-8a0d-b34518af240e\") " pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:12.697895 master-0 kubenswrapper[29936]: I1205 12:50:12.697577 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd3072fe-3e73-4a72-8a0d-b34518af240e-config\") pod \"console-operator-54dbc87ccb-twkmk\" (UID: \"bd3072fe-3e73-4a72-8a0d-b34518af240e\") " pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:12.697895 master-0 kubenswrapper[29936]: I1205 12:50:12.697604 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s27gj\" (UniqueName: \"kubernetes.io/projected/bd3072fe-3e73-4a72-8a0d-b34518af240e-kube-api-access-s27gj\") pod \"console-operator-54dbc87ccb-twkmk\" (UID: \"bd3072fe-3e73-4a72-8a0d-b34518af240e\") " pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:12.697895 master-0 kubenswrapper[29936]: I1205 12:50:12.697639 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd3072fe-3e73-4a72-8a0d-b34518af240e-trusted-ca\") pod \"console-operator-54dbc87ccb-twkmk\" (UID: \"bd3072fe-3e73-4a72-8a0d-b34518af240e\") " pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:12.708865 master-0 kubenswrapper[29936]: I1205 12:50:12.708648 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:50:12.798911 master-0 kubenswrapper[29936]: I1205 12:50:12.798828 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3072fe-3e73-4a72-8a0d-b34518af240e-serving-cert\") pod \"console-operator-54dbc87ccb-twkmk\" 
(UID: \"bd3072fe-3e73-4a72-8a0d-b34518af240e\") " pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:12.799216 master-0 kubenswrapper[29936]: I1205 12:50:12.799137 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd3072fe-3e73-4a72-8a0d-b34518af240e-config\") pod \"console-operator-54dbc87ccb-twkmk\" (UID: \"bd3072fe-3e73-4a72-8a0d-b34518af240e\") " pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:12.799283 master-0 kubenswrapper[29936]: I1205 12:50:12.799170 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s27gj\" (UniqueName: \"kubernetes.io/projected/bd3072fe-3e73-4a72-8a0d-b34518af240e-kube-api-access-s27gj\") pod \"console-operator-54dbc87ccb-twkmk\" (UID: \"bd3072fe-3e73-4a72-8a0d-b34518af240e\") " pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:12.799283 master-0 kubenswrapper[29936]: I1205 12:50:12.799270 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd3072fe-3e73-4a72-8a0d-b34518af240e-trusted-ca\") pod \"console-operator-54dbc87ccb-twkmk\" (UID: \"bd3072fe-3e73-4a72-8a0d-b34518af240e\") " pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:12.801018 master-0 kubenswrapper[29936]: I1205 12:50:12.800968 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd3072fe-3e73-4a72-8a0d-b34518af240e-config\") pod \"console-operator-54dbc87ccb-twkmk\" (UID: \"bd3072fe-3e73-4a72-8a0d-b34518af240e\") " pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:12.804989 master-0 kubenswrapper[29936]: I1205 12:50:12.804924 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3072fe-3e73-4a72-8a0d-b34518af240e-serving-cert\") pod \"console-operator-54dbc87ccb-twkmk\" (UID: \"bd3072fe-3e73-4a72-8a0d-b34518af240e\") " pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:12.805248 master-0 kubenswrapper[29936]: I1205 12:50:12.805208 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd3072fe-3e73-4a72-8a0d-b34518af240e-trusted-ca\") pod \"console-operator-54dbc87ccb-twkmk\" (UID: \"bd3072fe-3e73-4a72-8a0d-b34518af240e\") " pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:12.852225 master-0 kubenswrapper[29936]: I1205 12:50:12.851835 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s27gj\" (UniqueName: \"kubernetes.io/projected/bd3072fe-3e73-4a72-8a0d-b34518af240e-kube-api-access-s27gj\") pod \"console-operator-54dbc87ccb-twkmk\" (UID: \"bd3072fe-3e73-4a72-8a0d-b34518af240e\") " pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:12.901488 master-0 kubenswrapper[29936]: I1205 12:50:12.901392 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.903113 master-0 
kubenswrapper[29936]: I1205 12:50:12.903055 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5f469489fd-59qjd\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:12.912549 master-0 kubenswrapper[29936]: I1205 12:50:12.912474 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:12.915678 master-0 kubenswrapper[29936]: I1205 12:50:12.914770 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:12.915678 master-0 kubenswrapper[29936]: I1205 12:50:12.914816 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:12.915678 master-0 kubenswrapper[29936]: I1205 12:50:12.914825 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:12.942364 master-0 kubenswrapper[29936]: I1205 12:50:12.942110 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:12.943617 master-0 kubenswrapper[29936]: I1205 12:50:12.943256 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:12.970635 master-0 kubenswrapper[29936]: I1205 12:50:12.966458 29936 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 05 12:50:12.970635 master-0 kubenswrapper[29936]: I1205 12:50:12.966666 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="a906debd0c35952850935aee2d607cce" containerName="startup-monitor" containerID="cri-o://4d7c7fd9f6be698bd81fc9eb6c8b4d1eab76e44ec95ef9874a47a2596768ed58" gracePeriod=5 Dec 05 12:50:13.079265 master-0 kubenswrapper[29936]: I1205 12:50:13.070461 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:13.084732 master-0 kubenswrapper[29936]: I1205 12:50:13.081653 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:13.099636 master-0 kubenswrapper[29936]: I1205 12:50:13.099434 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:50:13.141827 master-0 kubenswrapper[29936]: I1205 12:50:13.141526 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:13.151607 master-0 kubenswrapper[29936]: I1205 12:50:13.151122 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:13.162551 master-0 kubenswrapper[29936]: I1205 12:50:13.162072 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:50:13.183879 master-0 kubenswrapper[29936]: I1205 12:50:13.183521 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:13.300501 master-0 kubenswrapper[29936]: I1205 12:50:13.300437 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:13.478845 master-0 kubenswrapper[29936]: I1205 12:50:13.478686 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-54dbc87ccb-twkmk"] Dec 05 12:50:13.489456 master-0 kubenswrapper[29936]: I1205 12:50:13.489371 29936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 12:50:13.610801 master-0 kubenswrapper[29936]: I1205 12:50:13.610717 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5f469489fd-59qjd"] Dec 05 12:50:13.620575 master-0 kubenswrapper[29936]: W1205 12:50:13.620508 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91de1093_448a_432c_bc02_f4d0492c2e2b.slice/crio-5e4390b51c123622157c2975a08c75c10e159ea23c9d986afab1c424ef161265 WatchSource:0}: Error finding container 5e4390b51c123622157c2975a08c75c10e159ea23c9d986afab1c424ef161265: Status 404 returned error can't find the container with id 5e4390b51c123622157c2975a08c75c10e159ea23c9d986afab1c424ef161265 Dec 05 12:50:13.747069 master-0 kubenswrapper[29936]: I1205 12:50:13.746948 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:50:13.747319 master-0 kubenswrapper[29936]: I1205 12:50:13.747163 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:13.747890 master-0 kubenswrapper[29936]: I1205 12:50:13.747854 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7cc89f4c4c-n28z2" Dec 05 12:50:14.216225 master-0 kubenswrapper[29936]: I1205 12:50:14.216116 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-845d4454f8-kcq9s" Dec 05 12:50:14.216591 master-0 kubenswrapper[29936]: I1205 12:50:14.216317 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-5bdfbf6949-2bhqv" Dec 05 12:50:14.314994 master-0 kubenswrapper[29936]: I1205 12:50:14.314846 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" event={"ID":"bd3072fe-3e73-4a72-8a0d-b34518af240e","Type":"ContainerStarted","Data":"894c0851888d7c3f19ae009c67f3dbd54504ba39abe722deb7728da6c24ad1c6"} Dec 05 12:50:14.318093 master-0 kubenswrapper[29936]: I1205 12:50:14.317998 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" event={"ID":"91de1093-448a-432c-bc02-f4d0492c2e2b","Type":"ContainerStarted","Data":"5e4390b51c123622157c2975a08c75c10e159ea23c9d986afab1c424ef161265"} Dec 05 12:50:14.325514 master-0 kubenswrapper[29936]: I1205 
12:50:14.318636 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:14.355750 master-0 kubenswrapper[29936]: I1205 12:50:14.355705 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert\") pod \"ingress-canary-pzxlc\" (UID: \"0936af9a-19c5-4950-b2d9-934c426bdf77\") " pod="openshift-ingress-canary/ingress-canary-pzxlc" Dec 05 12:50:14.360492 master-0 kubenswrapper[29936]: E1205 12:50:14.357660 29936 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Dec 05 12:50:14.360492 master-0 kubenswrapper[29936]: E1205 12:50:14.357746 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert podName:0936af9a-19c5-4950-b2d9-934c426bdf77 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:18.357720648 +0000 UTC m=+15.489800329 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert") pod "ingress-canary-pzxlc" (UID: "0936af9a-19c5-4950-b2d9-934c426bdf77") : secret "canary-serving-cert" not found Dec 05 12:50:14.678366 master-0 kubenswrapper[29936]: I1205 12:50:14.677211 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:50:14.722668 master-0 kubenswrapper[29936]: I1205 12:50:14.722602 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:50:14.872398 master-0 kubenswrapper[29936]: I1205 12:50:14.872146 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Dec 05 12:50:14.887952 master-0 kubenswrapper[29936]: I1205 12:50:14.887642 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Dec 05 12:50:15.336002 master-0 kubenswrapper[29936]: I1205 12:50:15.335924 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Dec 05 12:50:15.631739 master-0 kubenswrapper[29936]: I1205 12:50:15.631621 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:50:15.631942 master-0 kubenswrapper[29936]: I1205 12:50:15.631891 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:15.676336 master-0 kubenswrapper[29936]: I1205 12:50:15.676270 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dmnvq" Dec 05 12:50:15.834460 master-0 kubenswrapper[29936]: I1205 12:50:15.834367 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:50:15.852273 master-0 kubenswrapper[29936]: I1205 12:50:15.852154 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:15.852539 master-0 kubenswrapper[29936]: I1205 12:50:15.852395 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:15.861955 master-0 kubenswrapper[29936]: I1205 12:50:15.861843 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:50:15.906278 master-0 kubenswrapper[29936]: I1205 12:50:15.903335 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4p8p6" Dec 05 12:50:15.928921 master-0 kubenswrapper[29936]: I1205 12:50:15.927657 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:50:15.928921 master-0 kubenswrapper[29936]: I1205 12:50:15.927856 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:15.939778 master-0 kubenswrapper[29936]: I1205 12:50:15.939292 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-7cd7dbb44c-xdbtz" Dec 05 12:50:16.087218 master-0 kubenswrapper[29936]: I1205 12:50:16.084142 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:16.087218 master-0 kubenswrapper[29936]: I1205 12:50:16.084379 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:16.089085 master-0 kubenswrapper[29936]: I1205 12:50:16.089020 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:50:16.335700 master-0 kubenswrapper[29936]: I1205 12:50:16.335631 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:50:16.335968 master-0 kubenswrapper[29936]: I1205 12:50:16.335745 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:16.341023 master-0 kubenswrapper[29936]: I1205 12:50:16.340957 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-f797b99b6-vwhxt" Dec 05 12:50:16.932408 master-0 kubenswrapper[29936]: I1205 12:50:16.932331 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:50:16.932778 master-0 kubenswrapper[29936]: I1205 12:50:16.932490 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:16.937434 master-0 kubenswrapper[29936]: I1205 12:50:16.937381 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-fbc6455c4-jmn7x" Dec 05 12:50:17.341638 master-0 kubenswrapper[29936]: I1205 12:50:17.341570 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-54dbc87ccb-twkmk_bd3072fe-3e73-4a72-8a0d-b34518af240e/console-operator/0.log" Dec 05 12:50:17.341638 master-0 kubenswrapper[29936]: I1205 12:50:17.341630 29936 generic.go:334] "Generic (PLEG): container finished" podID="bd3072fe-3e73-4a72-8a0d-b34518af240e" containerID="3d9d7269862c2ba75c74b044b7421786c0532458c9b259c916258b45a5f83303" exitCode=255 Dec 05 12:50:17.342078 master-0 kubenswrapper[29936]: I1205 12:50:17.341717 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" event={"ID":"bd3072fe-3e73-4a72-8a0d-b34518af240e","Type":"ContainerDied","Data":"3d9d7269862c2ba75c74b044b7421786c0532458c9b259c916258b45a5f83303"} Dec 05 12:50:17.342165 
master-0 kubenswrapper[29936]: I1205 12:50:17.342125 29936 scope.go:117] "RemoveContainer" containerID="3d9d7269862c2ba75c74b044b7421786c0532458c9b259c916258b45a5f83303" Dec 05 12:50:17.345262 master-0 kubenswrapper[29936]: I1205 12:50:17.345166 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" event={"ID":"91de1093-448a-432c-bc02-f4d0492c2e2b","Type":"ContainerStarted","Data":"f1692e1daee8b4206325c60f2bd4dc179ef50f075c69b7d076873252ca37c2b0"} Dec 05 12:50:17.345853 master-0 kubenswrapper[29936]: I1205 12:50:17.345542 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:17.348617 master-0 kubenswrapper[29936]: I1205 12:50:17.348572 29936 patch_prober.go:28] interesting pod/oauth-openshift-5f469489fd-59qjd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.88:6443/healthz\": dial tcp 10.128.0.88:6443: connect: connection refused" start-of-body= Dec 05 12:50:17.348697 master-0 kubenswrapper[29936]: I1205 12:50:17.348638 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" podUID="91de1093-448a-432c-bc02-f4d0492c2e2b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.88:6443/healthz\": dial tcp 10.128.0.88:6443: connect: connection refused" Dec 05 12:50:18.353577 master-0 kubenswrapper[29936]: I1205 12:50:18.353529 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-54dbc87ccb-twkmk_bd3072fe-3e73-4a72-8a0d-b34518af240e/console-operator/1.log" Dec 05 12:50:18.354303 master-0 kubenswrapper[29936]: I1205 12:50:18.354247 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-54dbc87ccb-twkmk_bd3072fe-3e73-4a72-8a0d-b34518af240e/console-operator/0.log" Dec 05 12:50:18.354377 master-0 kubenswrapper[29936]: I1205 12:50:18.354306 29936 generic.go:334] "Generic (PLEG): container finished" podID="bd3072fe-3e73-4a72-8a0d-b34518af240e" containerID="5c6fd30f937fff5c305893c18a48426c7f3abb37a139287ad25444cd144f17bc" exitCode=255 Dec 05 12:50:18.354433 master-0 kubenswrapper[29936]: I1205 12:50:18.354403 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" event={"ID":"bd3072fe-3e73-4a72-8a0d-b34518af240e","Type":"ContainerDied","Data":"5c6fd30f937fff5c305893c18a48426c7f3abb37a139287ad25444cd144f17bc"} Dec 05 12:50:18.354497 master-0 kubenswrapper[29936]: I1205 12:50:18.354446 29936 scope.go:117] "RemoveContainer" containerID="3d9d7269862c2ba75c74b044b7421786c0532458c9b259c916258b45a5f83303" Dec 05 12:50:18.355073 master-0 kubenswrapper[29936]: I1205 12:50:18.355041 29936 scope.go:117] "RemoveContainer" containerID="5c6fd30f937fff5c305893c18a48426c7f3abb37a139287ad25444cd144f17bc" Dec 05 12:50:18.355387 master-0 kubenswrapper[29936]: E1205 12:50:18.355355 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-54dbc87ccb-twkmk_openshift-console-operator(bd3072fe-3e73-4a72-8a0d-b34518af240e)\"" pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" podUID="bd3072fe-3e73-4a72-8a0d-b34518af240e" Dec 05 12:50:18.357064 master-0 
kubenswrapper[29936]: I1205 12:50:18.357002 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a906debd0c35952850935aee2d607cce/startup-monitor/0.log" Dec 05 12:50:18.357158 master-0 kubenswrapper[29936]: I1205 12:50:18.357073 29936 generic.go:334] "Generic (PLEG): container finished" podID="a906debd0c35952850935aee2d607cce" containerID="4d7c7fd9f6be698bd81fc9eb6c8b4d1eab76e44ec95ef9874a47a2596768ed58" exitCode=137 Dec 05 12:50:18.364482 master-0 kubenswrapper[29936]: I1205 12:50:18.364409 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:50:18.394955 master-0 kubenswrapper[29936]: I1205 12:50:18.394858 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" podStartSLOduration=3.072709901 podStartE2EDuration="6.394832366s" podCreationTimestamp="2025-12-05 12:50:12 +0000 UTC" firstStartedPulling="2025-12-05 12:50:13.62308521 +0000 UTC m=+10.755164891" lastFinishedPulling="2025-12-05 12:50:16.945207675 +0000 UTC m=+14.077287356" observedRunningTime="2025-12-05 12:50:17.44087196 +0000 UTC m=+14.572951651" watchObservedRunningTime="2025-12-05 12:50:18.394832366 +0000 UTC m=+15.526912047" Dec 05 12:50:18.443091 master-0 kubenswrapper[29936]: I1205 12:50:18.443007 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert\") pod \"ingress-canary-pzxlc\" (UID: \"0936af9a-19c5-4950-b2d9-934c426bdf77\") " pod="openshift-ingress-canary/ingress-canary-pzxlc" Dec 05 12:50:18.444983 master-0 kubenswrapper[29936]: E1205 12:50:18.444923 29936 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Dec 05 12:50:18.445075 master-0 kubenswrapper[29936]: E1205 12:50:18.445041 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert podName:0936af9a-19c5-4950-b2d9-934c426bdf77 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:26.445008445 +0000 UTC m=+23.577088296 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert") pod "ingress-canary-pzxlc" (UID: "0936af9a-19c5-4950-b2d9-934c426bdf77") : secret "canary-serving-cert" not found Dec 05 12:50:18.498294 master-0 kubenswrapper[29936]: I1205 12:50:18.498222 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:50:18.514685 master-0 kubenswrapper[29936]: I1205 12:50:18.514425 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:18.514685 master-0 kubenswrapper[29936]: I1205 12:50:18.514666 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:18.544726 master-0 kubenswrapper[29936]: I1205 12:50:18.544668 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2pp25" Dec 05 12:50:18.551154 master-0 kubenswrapper[29936]: I1205 12:50:18.551098 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9vqtb" Dec 05 12:50:18.553879 master-0 kubenswrapper[29936]: I1205 12:50:18.553841 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a906debd0c35952850935aee2d607cce/startup-monitor/0.log" Dec 05 12:50:18.553955 master-0 kubenswrapper[29936]: I1205 12:50:18.553935 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:18.650495 master-0 kubenswrapper[29936]: I1205 12:50:18.650303 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock\") pod \"a906debd0c35952850935aee2d607cce\" (UID: \"a906debd0c35952850935aee2d607cce\") " Dec 05 12:50:18.650886 master-0 kubenswrapper[29936]: I1205 12:50:18.650864 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests\") pod \"a906debd0c35952850935aee2d607cce\" (UID: \"a906debd0c35952850935aee2d607cce\") " Dec 05 12:50:18.651027 master-0 kubenswrapper[29936]: I1205 12:50:18.651007 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log\") pod \"a906debd0c35952850935aee2d607cce\" (UID: \"a906debd0c35952850935aee2d607cce\") " Dec 05 12:50:18.652107 master-0 kubenswrapper[29936]: I1205 12:50:18.651333 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir\") pod \"a906debd0c35952850935aee2d607cce\" (UID: \"a906debd0c35952850935aee2d607cce\") " Dec 05 12:50:18.652107 master-0 kubenswrapper[29936]: I1205 12:50:18.651371 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir\") pod \"a906debd0c35952850935aee2d607cce\" (UID: \"a906debd0c35952850935aee2d607cce\") " Dec 05 12:50:18.652107 master-0 kubenswrapper[29936]: I1205 12:50:18.650479 29936 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock" (OuterVolumeSpecName: "var-lock") pod "a906debd0c35952850935aee2d607cce" (UID: "a906debd0c35952850935aee2d607cce"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:50:18.652107 master-0 kubenswrapper[29936]: I1205 12:50:18.651723 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests" (OuterVolumeSpecName: "manifests") pod "a906debd0c35952850935aee2d607cce" (UID: "a906debd0c35952850935aee2d607cce"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:50:18.652107 master-0 kubenswrapper[29936]: I1205 12:50:18.652026 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "a906debd0c35952850935aee2d607cce" (UID: "a906debd0c35952850935aee2d607cce"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:50:18.652107 master-0 kubenswrapper[29936]: I1205 12:50:18.652023 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log" (OuterVolumeSpecName: "var-log") pod "a906debd0c35952850935aee2d607cce" (UID: "a906debd0c35952850935aee2d607cce"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:50:18.652526 master-0 kubenswrapper[29936]: I1205 12:50:18.652358 29936 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:50:18.652526 master-0 kubenswrapper[29936]: I1205 12:50:18.652390 29936 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-manifests\") on node \"master-0\" DevicePath \"\"" Dec 05 12:50:18.652526 master-0 kubenswrapper[29936]: I1205 12:50:18.652420 29936 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-var-log\") on node \"master-0\" DevicePath \"\"" Dec 05 12:50:18.652526 master-0 kubenswrapper[29936]: I1205 12:50:18.652444 29936 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:50:18.659016 master-0 kubenswrapper[29936]: I1205 12:50:18.658943 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "a906debd0c35952850935aee2d607cce" (UID: "a906debd0c35952850935aee2d607cce"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:50:18.754403 master-0 kubenswrapper[29936]: I1205 12:50:18.754310 29936 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a906debd0c35952850935aee2d607cce-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:50:18.792004 master-0 kubenswrapper[29936]: I1205 12:50:18.790688 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:50:18.792004 master-0 kubenswrapper[29936]: I1205 12:50:18.790926 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 12:50:18.800318 master-0 kubenswrapper[29936]: I1205 12:50:18.799969 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:50:19.203553 master-0 kubenswrapper[29936]: I1205 12:50:19.203460 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a906debd0c35952850935aee2d607cce" path="/var/lib/kubelet/pods/a906debd0c35952850935aee2d607cce/volumes" Dec 05 12:50:19.208201 master-0 kubenswrapper[29936]: I1205 12:50:19.204940 29936 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="" Dec 05 12:50:19.230531 master-0 kubenswrapper[29936]: I1205 12:50:19.229192 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 05 12:50:19.230531 master-0 kubenswrapper[29936]: I1205 12:50:19.229257 29936 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="7b442737-46c3-434b-b6ee-4888cfdff6f5" Dec 05 12:50:19.237292 master-0 kubenswrapper[29936]: I1205 12:50:19.237174 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 05 12:50:19.237469 master-0 kubenswrapper[29936]: I1205 12:50:19.237293 29936 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="7b442737-46c3-434b-b6ee-4888cfdff6f5" Dec 05 12:50:19.367913 master-0 kubenswrapper[29936]: I1205 12:50:19.367703 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-54dbc87ccb-twkmk_bd3072fe-3e73-4a72-8a0d-b34518af240e/console-operator/1.log" Dec 05 12:50:19.368664 master-0 kubenswrapper[29936]: I1205 12:50:19.368332 29936 scope.go:117] "RemoveContainer" containerID="5c6fd30f937fff5c305893c18a48426c7f3abb37a139287ad25444cd144f17bc" Dec 05 12:50:19.368664 master-0 kubenswrapper[29936]: E1205 12:50:19.368589 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-54dbc87ccb-twkmk_openshift-console-operator(bd3072fe-3e73-4a72-8a0d-b34518af240e)\"" pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" podUID="bd3072fe-3e73-4a72-8a0d-b34518af240e" Dec 05 12:50:19.371386 master-0 kubenswrapper[29936]: I1205 12:50:19.371344 29936 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a906debd0c35952850935aee2d607cce/startup-monitor/0.log" Dec 05 12:50:19.371819 master-0 kubenswrapper[29936]: I1205 12:50:19.371656 29936 scope.go:117] "RemoveContainer" containerID="4d7c7fd9f6be698bd81fc9eb6c8b4d1eab76e44ec95ef9874a47a2596768ed58" Dec 05 12:50:19.371895 master-0 kubenswrapper[29936]: I1205 12:50:19.371829 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:50:21.366509 master-0 kubenswrapper[29936]: I1205 12:50:21.366451 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:21.367751 master-0 kubenswrapper[29936]: I1205 12:50:21.367693 29936 scope.go:117] "RemoveContainer" containerID="5c6fd30f937fff5c305893c18a48426c7f3abb37a139287ad25444cd144f17bc" Dec 05 12:50:21.368030 master-0 kubenswrapper[29936]: E1205 12:50:21.367990 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-54dbc87ccb-twkmk_openshift-console-operator(bd3072fe-3e73-4a72-8a0d-b34518af240e)\"" pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" podUID="bd3072fe-3e73-4a72-8a0d-b34518af240e" Dec 05 12:50:25.047419 master-0 kubenswrapper[29936]: I1205 12:50:25.047306 29936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:25.048130 master-0 kubenswrapper[29936]: I1205 12:50:25.048112 29936 scope.go:117] "RemoveContainer" containerID="5c6fd30f937fff5c305893c18a48426c7f3abb37a139287ad25444cd144f17bc" Dec 05 12:50:25.048455 master-0 kubenswrapper[29936]: E1205 12:50:25.048404 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-54dbc87ccb-twkmk_openshift-console-operator(bd3072fe-3e73-4a72-8a0d-b34518af240e)\"" pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" podUID="bd3072fe-3e73-4a72-8a0d-b34518af240e" Dec 05 12:50:26.481917 master-0 kubenswrapper[29936]: I1205 12:50:26.481840 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert\") pod \"ingress-canary-pzxlc\" (UID: \"0936af9a-19c5-4950-b2d9-934c426bdf77\") " pod="openshift-ingress-canary/ingress-canary-pzxlc" Dec 05 12:50:26.482845 master-0 kubenswrapper[29936]: E1205 12:50:26.482205 29936 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Dec 05 12:50:26.482845 master-0 kubenswrapper[29936]: E1205 12:50:26.482362 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert podName:0936af9a-19c5-4950-b2d9-934c426bdf77 nodeName:}" failed. No retries permitted until 2025-12-05 12:50:42.482330231 +0000 UTC m=+39.614409942 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert") pod "ingress-canary-pzxlc" (UID: "0936af9a-19c5-4950-b2d9-934c426bdf77") : secret "canary-serving-cert" not found Dec 05 12:50:31.451362 master-0 kubenswrapper[29936]: I1205 12:50:31.451288 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jfr6d"] Dec 05 12:50:31.452657 master-0 kubenswrapper[29936]: E1205 12:50:31.452621 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a906debd0c35952850935aee2d607cce" containerName="startup-monitor" Dec 05 12:50:31.452657 master-0 kubenswrapper[29936]: I1205 12:50:31.452656 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a906debd0c35952850935aee2d607cce" containerName="startup-monitor" Dec 05 12:50:31.454376 master-0 kubenswrapper[29936]: I1205 12:50:31.454340 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="a906debd0c35952850935aee2d607cce" containerName="startup-monitor" Dec 05 12:50:31.456715 master-0 kubenswrapper[29936]: I1205 12:50:31.456668 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jfr6d" Dec 05 12:50:31.460566 master-0 kubenswrapper[29936]: I1205 12:50:31.460511 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 12:50:31.461094 master-0 kubenswrapper[29936]: I1205 12:50:31.460977 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-br7gx" Dec 05 12:50:31.560530 master-0 kubenswrapper[29936]: I1205 12:50:31.560448 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee331702-41da-4653-ad95-b9fd524851cb-host\") pod \"node-ca-jfr6d\" (UID: \"ee331702-41da-4653-ad95-b9fd524851cb\") " pod="openshift-image-registry/node-ca-jfr6d" Dec 05 12:50:31.560839 master-0 kubenswrapper[29936]: I1205 12:50:31.560659 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ee331702-41da-4653-ad95-b9fd524851cb-serviceca\") pod \"node-ca-jfr6d\" (UID: \"ee331702-41da-4653-ad95-b9fd524851cb\") " pod="openshift-image-registry/node-ca-jfr6d" Dec 05 12:50:31.560839 master-0 kubenswrapper[29936]: I1205 12:50:31.560793 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7lds\" (UniqueName: \"kubernetes.io/projected/ee331702-41da-4653-ad95-b9fd524851cb-kube-api-access-g7lds\") pod \"node-ca-jfr6d\" (UID: \"ee331702-41da-4653-ad95-b9fd524851cb\") " pod="openshift-image-registry/node-ca-jfr6d" Dec 05 12:50:31.663153 master-0 kubenswrapper[29936]: I1205 12:50:31.663089 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee331702-41da-4653-ad95-b9fd524851cb-host\") pod \"node-ca-jfr6d\" (UID: \"ee331702-41da-4653-ad95-b9fd524851cb\") " pod="openshift-image-registry/node-ca-jfr6d" Dec 05 12:50:31.663556 master-0 kubenswrapper[29936]: I1205 12:50:31.663539 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ee331702-41da-4653-ad95-b9fd524851cb-serviceca\") pod \"node-ca-jfr6d\" (UID: \"ee331702-41da-4653-ad95-b9fd524851cb\") " 
pod="openshift-image-registry/node-ca-jfr6d" Dec 05 12:50:31.663670 master-0 kubenswrapper[29936]: I1205 12:50:31.663657 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7lds\" (UniqueName: \"kubernetes.io/projected/ee331702-41da-4653-ad95-b9fd524851cb-kube-api-access-g7lds\") pod \"node-ca-jfr6d\" (UID: \"ee331702-41da-4653-ad95-b9fd524851cb\") " pod="openshift-image-registry/node-ca-jfr6d" Dec 05 12:50:31.664274 master-0 kubenswrapper[29936]: I1205 12:50:31.663304 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee331702-41da-4653-ad95-b9fd524851cb-host\") pod \"node-ca-jfr6d\" (UID: \"ee331702-41da-4653-ad95-b9fd524851cb\") " pod="openshift-image-registry/node-ca-jfr6d" Dec 05 12:50:31.664419 master-0 kubenswrapper[29936]: I1205 12:50:31.664369 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ee331702-41da-4653-ad95-b9fd524851cb-serviceca\") pod \"node-ca-jfr6d\" (UID: \"ee331702-41da-4653-ad95-b9fd524851cb\") " pod="openshift-image-registry/node-ca-jfr6d" Dec 05 12:50:31.821324 master-0 kubenswrapper[29936]: I1205 12:50:31.812871 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7lds\" (UniqueName: \"kubernetes.io/projected/ee331702-41da-4653-ad95-b9fd524851cb-kube-api-access-g7lds\") pod \"node-ca-jfr6d\" (UID: \"ee331702-41da-4653-ad95-b9fd524851cb\") " pod="openshift-image-registry/node-ca-jfr6d" Dec 05 12:50:32.097848 master-0 kubenswrapper[29936]: I1205 12:50:32.097738 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jfr6d" Dec 05 12:50:32.125848 master-0 kubenswrapper[29936]: W1205 12:50:32.125787 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee331702_41da_4653_ad95_b9fd524851cb.slice/crio-ecc173472b1e1762dd6138df3595e0054f257d748d213a3485614d8fb3eed3f5 WatchSource:0}: Error finding container ecc173472b1e1762dd6138df3595e0054f257d748d213a3485614d8fb3eed3f5: Status 404 returned error can't find the container with id ecc173472b1e1762dd6138df3595e0054f257d748d213a3485614d8fb3eed3f5 Dec 05 12:50:32.485531 master-0 kubenswrapper[29936]: I1205 12:50:32.485446 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jfr6d" event={"ID":"ee331702-41da-4653-ad95-b9fd524851cb","Type":"ContainerStarted","Data":"ecc173472b1e1762dd6138df3595e0054f257d748d213a3485614d8fb3eed3f5"} Dec 05 12:50:32.574995 master-0 kubenswrapper[29936]: I1205 12:50:32.574925 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-56b6c94668-k88cs"] Dec 05 12:50:32.576106 master-0 kubenswrapper[29936]: I1205 12:50:32.576048 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-56b6c94668-k88cs" Dec 05 12:50:32.580351 master-0 kubenswrapper[29936]: I1205 12:50:32.578592 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-w5k28" Dec 05 12:50:32.580769 master-0 kubenswrapper[29936]: I1205 12:50:32.580724 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Dec 05 12:50:32.587543 master-0 kubenswrapper[29936]: I1205 12:50:32.587483 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-56b6c94668-k88cs"] Dec 05 12:50:32.683025 master-0 kubenswrapper[29936]: I1205 12:50:32.682965 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a2c37759-9414-4ca7-8bd4-20c4f689189b-monitoring-plugin-cert\") pod \"monitoring-plugin-56b6c94668-k88cs\" (UID: \"a2c37759-9414-4ca7-8bd4-20c4f689189b\") " pod="openshift-monitoring/monitoring-plugin-56b6c94668-k88cs" Dec 05 12:50:32.786642 master-0 kubenswrapper[29936]: I1205 12:50:32.786477 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a2c37759-9414-4ca7-8bd4-20c4f689189b-monitoring-plugin-cert\") pod \"monitoring-plugin-56b6c94668-k88cs\" (UID: \"a2c37759-9414-4ca7-8bd4-20c4f689189b\") " pod="openshift-monitoring/monitoring-plugin-56b6c94668-k88cs" Dec 05 12:50:32.798225 master-0 kubenswrapper[29936]: I1205 12:50:32.790662 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a2c37759-9414-4ca7-8bd4-20c4f689189b-monitoring-plugin-cert\") pod \"monitoring-plugin-56b6c94668-k88cs\" (UID: \"a2c37759-9414-4ca7-8bd4-20c4f689189b\") " pod="openshift-monitoring/monitoring-plugin-56b6c94668-k88cs" Dec 05 12:50:32.897866 master-0 kubenswrapper[29936]: I1205 12:50:32.897804 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-56b6c94668-k88cs" Dec 05 12:50:33.323223 master-0 kubenswrapper[29936]: I1205 12:50:33.322736 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-56b6c94668-k88cs"] Dec 05 12:50:33.349546 master-0 kubenswrapper[29936]: W1205 12:50:33.349473 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2c37759_9414_4ca7_8bd4_20c4f689189b.slice/crio-9a52cbd6fbe23825c1470655da3ab68674cd3776b092896841a68e428064c1ff WatchSource:0}: Error finding container 9a52cbd6fbe23825c1470655da3ab68674cd3776b092896841a68e428064c1ff: Status 404 returned error can't find the container with id 9a52cbd6fbe23825c1470655da3ab68674cd3776b092896841a68e428064c1ff Dec 05 12:50:33.497033 master-0 kubenswrapper[29936]: I1205 12:50:33.496958 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-56b6c94668-k88cs" event={"ID":"a2c37759-9414-4ca7-8bd4-20c4f689189b","Type":"ContainerStarted","Data":"9a52cbd6fbe23825c1470655da3ab68674cd3776b092896841a68e428064c1ff"} Dec 05 12:50:35.517229 master-0 kubenswrapper[29936]: I1205 12:50:35.517120 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jfr6d" event={"ID":"ee331702-41da-4653-ad95-b9fd524851cb","Type":"ContainerStarted","Data":"aa40c8ef52929b39de99ec0ac221f736eccc5bc8917e67a2925f169130e7631c"} Dec 05 12:50:35.542989 master-0 kubenswrapper[29936]: I1205 12:50:35.542912 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jfr6d" podStartSLOduration=2.360316169 podStartE2EDuration="4.542887931s" podCreationTimestamp="2025-12-05 12:50:31 +0000 UTC" firstStartedPulling="2025-12-05 12:50:32.127679211 +0000 UTC m=+29.259758892" lastFinishedPulling="2025-12-05 12:50:34.310250973 +0000 UTC m=+31.442330654" observedRunningTime="2025-12-05 12:50:35.542689545 +0000 UTC m=+32.674769246" watchObservedRunningTime="2025-12-05 12:50:35.542887931 +0000 UTC m=+32.674967612" Dec 05 12:50:36.186685 master-0 kubenswrapper[29936]: I1205 12:50:36.186603 29936 scope.go:117] "RemoveContainer" containerID="5c6fd30f937fff5c305893c18a48426c7f3abb37a139287ad25444cd144f17bc" Dec 05 12:50:36.526648 master-0 kubenswrapper[29936]: I1205 12:50:36.526545 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-56b6c94668-k88cs" event={"ID":"a2c37759-9414-4ca7-8bd4-20c4f689189b","Type":"ContainerStarted","Data":"d7f23bbeaa1c82e3247b30df8f006c095fc00a421e05c2e364c2c95154bd5ea5"} Dec 05 12:50:36.527496 master-0 kubenswrapper[29936]: I1205 12:50:36.527072 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-56b6c94668-k88cs" Dec 05 12:50:36.530251 master-0 kubenswrapper[29936]: I1205 12:50:36.530212 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-54dbc87ccb-twkmk_bd3072fe-3e73-4a72-8a0d-b34518af240e/console-operator/1.log" Dec 05 12:50:36.530355 master-0 kubenswrapper[29936]: I1205 12:50:36.530314 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" event={"ID":"bd3072fe-3e73-4a72-8a0d-b34518af240e","Type":"ContainerStarted","Data":"82a3b62b08ca18f7bacb3474b9b44561c3c092815c9473b3b01cb23429d55ed0"} Dec 05 12:50:36.530704 master-0 
kubenswrapper[29936]: I1205 12:50:36.530650 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:36.535211 master-0 kubenswrapper[29936]: I1205 12:50:36.535140 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-56b6c94668-k88cs" Dec 05 12:50:36.547087 master-0 kubenswrapper[29936]: I1205 12:50:36.546988 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-56b6c94668-k88cs" podStartSLOduration=2.530149157 podStartE2EDuration="4.546958374s" podCreationTimestamp="2025-12-05 12:50:32 +0000 UTC" firstStartedPulling="2025-12-05 12:50:33.3517291 +0000 UTC m=+30.483808781" lastFinishedPulling="2025-12-05 12:50:35.368538317 +0000 UTC m=+32.500617998" observedRunningTime="2025-12-05 12:50:36.546389538 +0000 UTC m=+33.678469219" watchObservedRunningTime="2025-12-05 12:50:36.546958374 +0000 UTC m=+33.679038075" Dec 05 12:50:36.567852 master-0 kubenswrapper[29936]: I1205 12:50:36.567758 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" podStartSLOduration=21.051421271 podStartE2EDuration="24.567730598s" podCreationTimestamp="2025-12-05 12:50:12 +0000 UTC" firstStartedPulling="2025-12-05 12:50:13.489295759 +0000 UTC m=+10.621375440" lastFinishedPulling="2025-12-05 12:50:17.005604846 +0000 UTC m=+14.137684767" observedRunningTime="2025-12-05 12:50:36.567308887 +0000 UTC m=+33.699388578" watchObservedRunningTime="2025-12-05 12:50:36.567730598 +0000 UTC m=+33.699810289" Dec 05 12:50:36.944879 master-0 kubenswrapper[29936]: I1205 12:50:36.944808 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-54dbc87ccb-twkmk" Dec 05 12:50:37.214163 master-0 kubenswrapper[29936]: I1205 12:50:37.213988 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-69cd4c69bf-mx2xk"] Dec 05 12:50:37.215302 master-0 kubenswrapper[29936]: I1205 12:50:37.215254 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-69cd4c69bf-mx2xk" Dec 05 12:50:37.218118 master-0 kubenswrapper[29936]: I1205 12:50:37.218070 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-thjz4" Dec 05 12:50:37.218544 master-0 kubenswrapper[29936]: I1205 12:50:37.218486 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 12:50:37.218652 master-0 kubenswrapper[29936]: I1205 12:50:37.218622 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 12:50:37.235585 master-0 kubenswrapper[29936]: I1205 12:50:37.235517 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-69cd4c69bf-mx2xk"] Dec 05 12:50:37.264660 master-0 kubenswrapper[29936]: I1205 12:50:37.264568 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj5jv\" (UniqueName: \"kubernetes.io/projected/6936401b-4cb1-451f-b083-ee6721409cca-kube-api-access-dj5jv\") pod \"downloads-69cd4c69bf-mx2xk\" (UID: \"6936401b-4cb1-451f-b083-ee6721409cca\") " pod="openshift-console/downloads-69cd4c69bf-mx2xk" Dec 05 12:50:37.366492 master-0 kubenswrapper[29936]: I1205 12:50:37.366429 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj5jv\" (UniqueName: \"kubernetes.io/projected/6936401b-4cb1-451f-b083-ee6721409cca-kube-api-access-dj5jv\") pod \"downloads-69cd4c69bf-mx2xk\" (UID: \"6936401b-4cb1-451f-b083-ee6721409cca\") " pod="openshift-console/downloads-69cd4c69bf-mx2xk" Dec 05 12:50:37.409205 master-0 kubenswrapper[29936]: I1205 12:50:37.408984 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj5jv\" (UniqueName: \"kubernetes.io/projected/6936401b-4cb1-451f-b083-ee6721409cca-kube-api-access-dj5jv\") pod \"downloads-69cd4c69bf-mx2xk\" (UID: \"6936401b-4cb1-451f-b083-ee6721409cca\") " pod="openshift-console/downloads-69cd4c69bf-mx2xk" Dec 05 12:50:37.537518 master-0 kubenswrapper[29936]: I1205 12:50:37.537371 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-69cd4c69bf-mx2xk" Dec 05 12:50:38.113848 master-0 kubenswrapper[29936]: I1205 12:50:38.113784 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-69cd4c69bf-mx2xk"] Dec 05 12:50:38.129124 master-0 kubenswrapper[29936]: W1205 12:50:38.129072 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6936401b_4cb1_451f_b083_ee6721409cca.slice/crio-a584b5567ed35376fb08452282ef5fa34ed8e824778e54937331283c5ac10817 WatchSource:0}: Error finding container a584b5567ed35376fb08452282ef5fa34ed8e824778e54937331283c5ac10817: Status 404 returned error can't find the container with id a584b5567ed35376fb08452282ef5fa34ed8e824778e54937331283c5ac10817 Dec 05 12:50:38.548665 master-0 kubenswrapper[29936]: I1205 12:50:38.548593 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-69cd4c69bf-mx2xk" event={"ID":"6936401b-4cb1-451f-b083-ee6721409cca","Type":"ContainerStarted","Data":"a584b5567ed35376fb08452282ef5fa34ed8e824778e54937331283c5ac10817"} Dec 05 12:50:40.299882 master-0 kubenswrapper[29936]: I1205 12:50:40.299822 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7495f49968-4tq6k"] Dec 05 12:50:40.300910 master-0 kubenswrapper[29936]: I1205 12:50:40.300881 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.308889 master-0 kubenswrapper[29936]: I1205 12:50:40.308390 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 12:50:40.308889 master-0 kubenswrapper[29936]: I1205 12:50:40.308709 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 12:50:40.309171 master-0 kubenswrapper[29936]: I1205 12:50:40.309091 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-mpkz5" Dec 05 12:50:40.309413 master-0 kubenswrapper[29936]: I1205 12:50:40.309314 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 12:50:40.309507 master-0 kubenswrapper[29936]: I1205 12:50:40.309484 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 12:50:40.309857 master-0 kubenswrapper[29936]: I1205 12:50:40.309835 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 12:50:40.317941 master-0 kubenswrapper[29936]: I1205 12:50:40.317644 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7495f49968-4tq6k"] Dec 05 12:50:40.434001 master-0 kubenswrapper[29936]: I1205 12:50:40.433818 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shncq\" (UniqueName: \"kubernetes.io/projected/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-kube-api-access-shncq\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.434001 master-0 kubenswrapper[29936]: I1205 12:50:40.433897 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-oauth-serving-cert\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.434453 master-0 kubenswrapper[29936]: I1205 12:50:40.434027 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-service-ca\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.434453 master-0 kubenswrapper[29936]: I1205 12:50:40.434124 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-config\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.434453 master-0 kubenswrapper[29936]: I1205 12:50:40.434233 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-serving-cert\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.434453 master-0 kubenswrapper[29936]: I1205 12:50:40.434273 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-oauth-config\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.539216 master-0 kubenswrapper[29936]: I1205 12:50:40.536363 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shncq\" (UniqueName: \"kubernetes.io/projected/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-kube-api-access-shncq\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.539216 master-0 kubenswrapper[29936]: I1205 12:50:40.536428 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-oauth-serving-cert\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.539216 master-0 kubenswrapper[29936]: I1205 12:50:40.536460 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-service-ca\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.539216 master-0 kubenswrapper[29936]: I1205 12:50:40.536490 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-config\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 
12:50:40.539216 master-0 kubenswrapper[29936]: I1205 12:50:40.536521 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-serving-cert\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.539216 master-0 kubenswrapper[29936]: I1205 12:50:40.536562 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-oauth-config\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.539786 master-0 kubenswrapper[29936]: I1205 12:50:40.539551 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-service-ca\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.541640 master-0 kubenswrapper[29936]: I1205 12:50:40.540779 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-oauth-serving-cert\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.541640 master-0 kubenswrapper[29936]: I1205 12:50:40.541597 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-config\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.542346 master-0 kubenswrapper[29936]: I1205 12:50:40.542296 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-oauth-config\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.547708 master-0 kubenswrapper[29936]: I1205 12:50:40.547653 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-serving-cert\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.566698 master-0 kubenswrapper[29936]: I1205 12:50:40.563789 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shncq\" (UniqueName: \"kubernetes.io/projected/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-kube-api-access-shncq\") pod \"console-7495f49968-4tq6k\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:40.635210 master-0 kubenswrapper[29936]: I1205 12:50:40.634528 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:41.166870 master-0 kubenswrapper[29936]: I1205 12:50:41.165818 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7495f49968-4tq6k"] Dec 05 12:50:41.181326 master-0 kubenswrapper[29936]: W1205 12:50:41.179664 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ec0c161_e3a9_4b81_a9e7_deba8be2b5f5.slice/crio-174bcae07eb3cd4b7fdce32bd0e0bfd0c11ecf799abf2e1f44f9c05e951ecfa1 WatchSource:0}: Error finding container 174bcae07eb3cd4b7fdce32bd0e0bfd0c11ecf799abf2e1f44f9c05e951ecfa1: Status 404 returned error can't find the container with id 174bcae07eb3cd4b7fdce32bd0e0bfd0c11ecf799abf2e1f44f9c05e951ecfa1 Dec 05 12:50:41.575356 master-0 kubenswrapper[29936]: I1205 12:50:41.575255 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7495f49968-4tq6k" event={"ID":"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5","Type":"ContainerStarted","Data":"174bcae07eb3cd4b7fdce32bd0e0bfd0c11ecf799abf2e1f44f9c05e951ecfa1"} Dec 05 12:50:42.487368 master-0 kubenswrapper[29936]: I1205 12:50:42.487266 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert\") pod \"ingress-canary-pzxlc\" (UID: \"0936af9a-19c5-4950-b2d9-934c426bdf77\") " pod="openshift-ingress-canary/ingress-canary-pzxlc" Dec 05 12:50:42.493209 master-0 kubenswrapper[29936]: I1205 12:50:42.493141 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0936af9a-19c5-4950-b2d9-934c426bdf77-cert\") pod \"ingress-canary-pzxlc\" (UID: \"0936af9a-19c5-4950-b2d9-934c426bdf77\") " pod="openshift-ingress-canary/ingress-canary-pzxlc" Dec 05 12:50:42.751155 master-0 kubenswrapper[29936]: I1205 12:50:42.751009 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pzxlc" Dec 05 12:50:43.266034 master-0 kubenswrapper[29936]: I1205 12:50:43.264117 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pzxlc"] Dec 05 12:50:43.283117 master-0 kubenswrapper[29936]: W1205 12:50:43.283036 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0936af9a_19c5_4950_b2d9_934c426bdf77.slice/crio-ab607ec781fe8ead8b785f3b54282fccb5105704c251b69c88ac03b3d5ba061c WatchSource:0}: Error finding container ab607ec781fe8ead8b785f3b54282fccb5105704c251b69c88ac03b3d5ba061c: Status 404 returned error can't find the container with id ab607ec781fe8ead8b785f3b54282fccb5105704c251b69c88ac03b3d5ba061c Dec 05 12:50:43.351994 master-0 kubenswrapper[29936]: I1205 12:50:43.351925 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"] Dec 05 12:50:43.352843 master-0 kubenswrapper[29936]: I1205 12:50:43.352807 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Dec 05 12:50:43.354585 master-0 kubenswrapper[29936]: I1205 12:50:43.354526 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Dec 05 12:50:43.358016 master-0 kubenswrapper[29936]: I1205 12:50:43.357699 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-bv9jw" Dec 05 12:50:43.368923 master-0 kubenswrapper[29936]: I1205 12:50:43.368862 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"] Dec 05 12:50:43.408531 master-0 kubenswrapper[29936]: I1205 12:50:43.408382 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e66ebd90-1d8c-47ff-b569-1831bfc110ce-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"e66ebd90-1d8c-47ff-b569-1831bfc110ce\") " pod="openshift-kube-scheduler/installer-6-master-0" Dec 05 12:50:43.408836 master-0 kubenswrapper[29936]: I1205 12:50:43.408794 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e66ebd90-1d8c-47ff-b569-1831bfc110ce-var-lock\") pod \"installer-6-master-0\" (UID: \"e66ebd90-1d8c-47ff-b569-1831bfc110ce\") " pod="openshift-kube-scheduler/installer-6-master-0" Dec 05 12:50:43.409010 master-0 kubenswrapper[29936]: I1205 12:50:43.408989 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e66ebd90-1d8c-47ff-b569-1831bfc110ce-kube-api-access\") pod \"installer-6-master-0\" (UID: \"e66ebd90-1d8c-47ff-b569-1831bfc110ce\") " pod="openshift-kube-scheduler/installer-6-master-0" Dec 05 12:50:43.510520 master-0 kubenswrapper[29936]: I1205 12:50:43.510424 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e66ebd90-1d8c-47ff-b569-1831bfc110ce-kube-api-access\") pod \"installer-6-master-0\" (UID: \"e66ebd90-1d8c-47ff-b569-1831bfc110ce\") " pod="openshift-kube-scheduler/installer-6-master-0" Dec 05 12:50:43.510520 master-0 kubenswrapper[29936]: I1205 12:50:43.510545 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e66ebd90-1d8c-47ff-b569-1831bfc110ce-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"e66ebd90-1d8c-47ff-b569-1831bfc110ce\") " pod="openshift-kube-scheduler/installer-6-master-0" Dec 05 12:50:43.510949 master-0 kubenswrapper[29936]: I1205 12:50:43.510578 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e66ebd90-1d8c-47ff-b569-1831bfc110ce-var-lock\") pod \"installer-6-master-0\" (UID: \"e66ebd90-1d8c-47ff-b569-1831bfc110ce\") " pod="openshift-kube-scheduler/installer-6-master-0" Dec 05 12:50:43.510949 master-0 kubenswrapper[29936]: I1205 12:50:43.510727 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e66ebd90-1d8c-47ff-b569-1831bfc110ce-var-lock\") pod \"installer-6-master-0\" (UID: \"e66ebd90-1d8c-47ff-b569-1831bfc110ce\") " pod="openshift-kube-scheduler/installer-6-master-0" Dec 05 12:50:43.510949 master-0 kubenswrapper[29936]: I1205 12:50:43.510779 
29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e66ebd90-1d8c-47ff-b569-1831bfc110ce-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"e66ebd90-1d8c-47ff-b569-1831bfc110ce\") " pod="openshift-kube-scheduler/installer-6-master-0" Dec 05 12:50:43.537689 master-0 kubenswrapper[29936]: I1205 12:50:43.537408 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e66ebd90-1d8c-47ff-b569-1831bfc110ce-kube-api-access\") pod \"installer-6-master-0\" (UID: \"e66ebd90-1d8c-47ff-b569-1831bfc110ce\") " pod="openshift-kube-scheduler/installer-6-master-0" Dec 05 12:50:43.602005 master-0 kubenswrapper[29936]: I1205 12:50:43.601932 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pzxlc" event={"ID":"0936af9a-19c5-4950-b2d9-934c426bdf77","Type":"ContainerStarted","Data":"1f96d500d9b22f6690ac30016e62c696ea5ad8a90c1cf0c63e94140b7815d58f"} Dec 05 12:50:43.602005 master-0 kubenswrapper[29936]: I1205 12:50:43.602008 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pzxlc" event={"ID":"0936af9a-19c5-4950-b2d9-934c426bdf77","Type":"ContainerStarted","Data":"ab607ec781fe8ead8b785f3b54282fccb5105704c251b69c88ac03b3d5ba061c"} Dec 05 12:50:43.630977 master-0 kubenswrapper[29936]: I1205 12:50:43.630878 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pzxlc" podStartSLOduration=33.630849919 podStartE2EDuration="33.630849919s" podCreationTimestamp="2025-12-05 12:50:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:50:43.627779464 +0000 UTC m=+40.759859165" watchObservedRunningTime="2025-12-05 12:50:43.630849919 +0000 UTC m=+40.762929600" Dec 05 12:50:43.755388 master-0 kubenswrapper[29936]: I1205 12:50:43.747610 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Dec 05 12:50:44.097503 master-0 kubenswrapper[29936]: I1205 12:50:44.097412 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-5f469489fd-59qjd"] Dec 05 12:50:44.218613 master-0 kubenswrapper[29936]: I1205 12:50:44.218557 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-6-master-0"] Dec 05 12:50:45.137953 master-0 kubenswrapper[29936]: I1205 12:50:45.137403 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-74ffd5f75f-slrkr"] Dec 05 12:50:45.138493 master-0 kubenswrapper[29936]: I1205 12:50:45.138473 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.156719 master-0 kubenswrapper[29936]: I1205 12:50:45.156666 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 12:50:45.157562 master-0 kubenswrapper[29936]: I1205 12:50:45.157502 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74ffd5f75f-slrkr"] Dec 05 12:50:45.247832 master-0 kubenswrapper[29936]: I1205 12:50:45.247712 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-service-ca\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.248269 master-0 kubenswrapper[29936]: I1205 12:50:45.248224 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-oauth-config\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.248392 master-0 kubenswrapper[29936]: I1205 12:50:45.248365 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-oauth-serving-cert\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.248442 master-0 kubenswrapper[29936]: I1205 12:50:45.248401 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-config\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.248490 master-0 kubenswrapper[29936]: I1205 12:50:45.248440 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4tpx\" (UniqueName: \"kubernetes.io/projected/5f5f6985-a4f8-467b-8277-4ea20bfc4570-kube-api-access-q4tpx\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.248968 master-0 kubenswrapper[29936]: I1205 12:50:45.248883 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-trusted-ca-bundle\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.249097 master-0 kubenswrapper[29936]: I1205 12:50:45.249033 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-serving-cert\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.350890 master-0 kubenswrapper[29936]: I1205 12:50:45.350816 29936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-serving-cert\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.351211 master-0 kubenswrapper[29936]: I1205 12:50:45.350905 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-service-ca\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.351211 master-0 kubenswrapper[29936]: I1205 12:50:45.350964 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-oauth-config\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.351211 master-0 kubenswrapper[29936]: I1205 12:50:45.351058 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-oauth-serving-cert\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.351211 master-0 kubenswrapper[29936]: I1205 12:50:45.351087 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-config\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.351622 master-0 kubenswrapper[29936]: I1205 12:50:45.351594 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4tpx\" (UniqueName: \"kubernetes.io/projected/5f5f6985-a4f8-467b-8277-4ea20bfc4570-kube-api-access-q4tpx\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.351732 master-0 kubenswrapper[29936]: I1205 12:50:45.351696 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-trusted-ca-bundle\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.352470 master-0 kubenswrapper[29936]: I1205 12:50:45.352442 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-service-ca\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.352802 master-0 kubenswrapper[29936]: I1205 12:50:45.352748 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-config\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 
12:50:45.353292 master-0 kubenswrapper[29936]: I1205 12:50:45.353231 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-oauth-serving-cert\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.354492 master-0 kubenswrapper[29936]: I1205 12:50:45.354436 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-trusted-ca-bundle\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.359616 master-0 kubenswrapper[29936]: I1205 12:50:45.359564 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-oauth-config\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.362528 master-0 kubenswrapper[29936]: I1205 12:50:45.362479 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-serving-cert\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.374130 master-0 kubenswrapper[29936]: I1205 12:50:45.374042 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4tpx\" (UniqueName: \"kubernetes.io/projected/5f5f6985-a4f8-467b-8277-4ea20bfc4570-kube-api-access-q4tpx\") pod \"console-74ffd5f75f-slrkr\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:45.466930 master-0 kubenswrapper[29936]: I1205 12:50:45.466387 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:46.266035 master-0 kubenswrapper[29936]: I1205 12:50:46.265913 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Dec 05 12:50:46.267136 master-0 kubenswrapper[29936]: I1205 12:50:46.267100 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Dec 05 12:50:46.271775 master-0 kubenswrapper[29936]: I1205 12:50:46.270519 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kkk9n" Dec 05 12:50:46.271775 master-0 kubenswrapper[29936]: I1205 12:50:46.270956 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 05 12:50:46.279147 master-0 kubenswrapper[29936]: I1205 12:50:46.279081 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Dec 05 12:50:46.370657 master-0 kubenswrapper[29936]: I1205 12:50:46.370520 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9eb9db93-5e90-400b-8b54-e5cea89daabf-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"9eb9db93-5e90-400b-8b54-e5cea89daabf\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 05 12:50:46.370657 master-0 kubenswrapper[29936]: I1205 12:50:46.370602 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9eb9db93-5e90-400b-8b54-e5cea89daabf-var-lock\") pod \"installer-4-master-0\" (UID: \"9eb9db93-5e90-400b-8b54-e5cea89daabf\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 05 12:50:46.370657 master-0 kubenswrapper[29936]: I1205 12:50:46.370661 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9eb9db93-5e90-400b-8b54-e5cea89daabf-kube-api-access\") pod \"installer-4-master-0\" (UID: \"9eb9db93-5e90-400b-8b54-e5cea89daabf\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 05 12:50:46.406694 master-0 kubenswrapper[29936]: W1205 12:50:46.406602 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode66ebd90_1d8c_47ff_b569_1831bfc110ce.slice/crio-a47445f5142e106a08eab5a6ce3670a2fd9727c3d75649eed68d941e1c1ecf9c WatchSource:0}: Error finding container a47445f5142e106a08eab5a6ce3670a2fd9727c3d75649eed68d941e1c1ecf9c: Status 404 returned error can't find the container with id a47445f5142e106a08eab5a6ce3670a2fd9727c3d75649eed68d941e1c1ecf9c Dec 05 12:50:46.471557 master-0 kubenswrapper[29936]: I1205 12:50:46.471499 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9eb9db93-5e90-400b-8b54-e5cea89daabf-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"9eb9db93-5e90-400b-8b54-e5cea89daabf\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 05 12:50:46.471666 master-0 kubenswrapper[29936]: I1205 12:50:46.471576 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9eb9db93-5e90-400b-8b54-e5cea89daabf-var-lock\") pod \"installer-4-master-0\" (UID: \"9eb9db93-5e90-400b-8b54-e5cea89daabf\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 05 12:50:46.471666 master-0 kubenswrapper[29936]: I1205 12:50:46.471625 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/9eb9db93-5e90-400b-8b54-e5cea89daabf-kube-api-access\") pod \"installer-4-master-0\" (UID: \"9eb9db93-5e90-400b-8b54-e5cea89daabf\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 05 12:50:46.472120 master-0 kubenswrapper[29936]: I1205 12:50:46.472091 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9eb9db93-5e90-400b-8b54-e5cea89daabf-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"9eb9db93-5e90-400b-8b54-e5cea89daabf\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 05 12:50:46.472210 master-0 kubenswrapper[29936]: I1205 12:50:46.472136 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9eb9db93-5e90-400b-8b54-e5cea89daabf-var-lock\") pod \"installer-4-master-0\" (UID: \"9eb9db93-5e90-400b-8b54-e5cea89daabf\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 05 12:50:46.496424 master-0 kubenswrapper[29936]: I1205 12:50:46.496380 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9eb9db93-5e90-400b-8b54-e5cea89daabf-kube-api-access\") pod \"installer-4-master-0\" (UID: \"9eb9db93-5e90-400b-8b54-e5cea89daabf\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 05 12:50:46.606371 master-0 kubenswrapper[29936]: I1205 12:50:46.605760 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Dec 05 12:50:46.635255 master-0 kubenswrapper[29936]: I1205 12:50:46.634215 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"e66ebd90-1d8c-47ff-b569-1831bfc110ce","Type":"ContainerStarted","Data":"a47445f5142e106a08eab5a6ce3670a2fd9727c3d75649eed68d941e1c1ecf9c"} Dec 05 12:50:46.885543 master-0 kubenswrapper[29936]: I1205 12:50:46.885233 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74ffd5f75f-slrkr"] Dec 05 12:50:46.891619 master-0 kubenswrapper[29936]: W1205 12:50:46.891558 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f5f6985_a4f8_467b_8277_4ea20bfc4570.slice/crio-806da7cc8b3f2ccc3bc34f5676a5083409af02a71f48c7bc097f3c69fbe20029 WatchSource:0}: Error finding container 806da7cc8b3f2ccc3bc34f5676a5083409af02a71f48c7bc097f3c69fbe20029: Status 404 returned error can't find the container with id 806da7cc8b3f2ccc3bc34f5676a5083409af02a71f48c7bc097f3c69fbe20029 Dec 05 12:50:47.093285 master-0 kubenswrapper[29936]: I1205 12:50:47.091586 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Dec 05 12:50:47.645709 master-0 kubenswrapper[29936]: I1205 12:50:47.645631 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7495f49968-4tq6k" event={"ID":"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5","Type":"ContainerStarted","Data":"7e2752bcd14ea12d204cc4e24319d66a24f791b34f70c5994177d58a5778a8ba"} Dec 05 12:50:47.657355 master-0 kubenswrapper[29936]: I1205 12:50:47.657268 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" 
event={"ID":"9eb9db93-5e90-400b-8b54-e5cea89daabf","Type":"ContainerStarted","Data":"4093f3e755952787ec8d524ca81827afeecb189d619370b6131751e043918683"} Dec 05 12:50:47.657355 master-0 kubenswrapper[29936]: I1205 12:50:47.657353 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"9eb9db93-5e90-400b-8b54-e5cea89daabf","Type":"ContainerStarted","Data":"cd379a044d84900fae9526c5ef529568f0eb518ec9cdf8409f4a3d8d094b0a6f"} Dec 05 12:50:47.663963 master-0 kubenswrapper[29936]: I1205 12:50:47.663858 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"e66ebd90-1d8c-47ff-b569-1831bfc110ce","Type":"ContainerStarted","Data":"76cecff7fa7ab1620c5fea3791fa06e91fd4391f443781b4e1e7f5b5e4f3ee7e"} Dec 05 12:50:47.666911 master-0 kubenswrapper[29936]: I1205 12:50:47.666873 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74ffd5f75f-slrkr" event={"ID":"5f5f6985-a4f8-467b-8277-4ea20bfc4570","Type":"ContainerStarted","Data":"04a45b1cc49f4f9500c41ee8cc68ff20313db4f4ee99928b970dc5be7392d5dd"} Dec 05 12:50:47.666911 master-0 kubenswrapper[29936]: I1205 12:50:47.666909 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74ffd5f75f-slrkr" event={"ID":"5f5f6985-a4f8-467b-8277-4ea20bfc4570","Type":"ContainerStarted","Data":"806da7cc8b3f2ccc3bc34f5676a5083409af02a71f48c7bc097f3c69fbe20029"} Dec 05 12:50:47.680223 master-0 kubenswrapper[29936]: I1205 12:50:47.680050 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7495f49968-4tq6k" podStartSLOduration=2.40579264 podStartE2EDuration="7.68000341s" podCreationTimestamp="2025-12-05 12:50:40 +0000 UTC" firstStartedPulling="2025-12-05 12:50:41.1856942 +0000 UTC m=+38.317773881" lastFinishedPulling="2025-12-05 12:50:46.45990497 +0000 UTC m=+43.591984651" observedRunningTime="2025-12-05 12:50:47.677290565 +0000 UTC m=+44.809370256" watchObservedRunningTime="2025-12-05 12:50:47.68000341 +0000 UTC m=+44.812083091" Dec 05 12:50:47.698605 master-0 kubenswrapper[29936]: I1205 12:50:47.697367 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-6-master-0" podStartSLOduration=4.697334809 podStartE2EDuration="4.697334809s" podCreationTimestamp="2025-12-05 12:50:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:50:47.694523172 +0000 UTC m=+44.826602863" watchObservedRunningTime="2025-12-05 12:50:47.697334809 +0000 UTC m=+44.829414480" Dec 05 12:50:47.732368 master-0 kubenswrapper[29936]: I1205 12:50:47.728131 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74ffd5f75f-slrkr" podStartSLOduration=2.72809042 podStartE2EDuration="2.72809042s" podCreationTimestamp="2025-12-05 12:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:50:47.720348137 +0000 UTC m=+44.852427838" watchObservedRunningTime="2025-12-05 12:50:47.72809042 +0000 UTC m=+44.860170121" Dec 05 12:50:47.765465 master-0 kubenswrapper[29936]: I1205 12:50:47.765371 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-0" podStartSLOduration=1.765339872 
podStartE2EDuration="1.765339872s" podCreationTimestamp="2025-12-05 12:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:50:47.750927523 +0000 UTC m=+44.883007204" watchObservedRunningTime="2025-12-05 12:50:47.765339872 +0000 UTC m=+44.897419553" Dec 05 12:50:50.635507 master-0 kubenswrapper[29936]: I1205 12:50:50.635429 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:50.635507 master-0 kubenswrapper[29936]: I1205 12:50:50.635502 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:50:50.637714 master-0 kubenswrapper[29936]: I1205 12:50:50.637664 29936 patch_prober.go:28] interesting pod/console-7495f49968-4tq6k container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" start-of-body= Dec 05 12:50:50.637798 master-0 kubenswrapper[29936]: I1205 12:50:50.637714 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7495f49968-4tq6k" podUID="3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" Dec 05 12:50:55.467644 master-0 kubenswrapper[29936]: I1205 12:50:55.467473 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:55.467644 master-0 kubenswrapper[29936]: I1205 12:50:55.467608 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:50:55.469568 master-0 kubenswrapper[29936]: I1205 12:50:55.469511 29936 patch_prober.go:28] interesting pod/console-74ffd5f75f-slrkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Dec 05 12:50:55.469654 master-0 kubenswrapper[29936]: I1205 12:50:55.469605 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-74ffd5f75f-slrkr" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Dec 05 12:51:00.637131 master-0 kubenswrapper[29936]: I1205 12:51:00.637058 29936 patch_prober.go:28] interesting pod/console-7495f49968-4tq6k container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" start-of-body= Dec 05 12:51:00.637850 master-0 kubenswrapper[29936]: I1205 12:51:00.637161 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7495f49968-4tq6k" podUID="3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" Dec 05 12:51:05.467863 master-0 kubenswrapper[29936]: I1205 12:51:05.467769 29936 patch_prober.go:28] interesting pod/console-74ffd5f75f-slrkr container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Dec 05 12:51:05.468952 master-0 kubenswrapper[29936]: I1205 12:51:05.467885 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-74ffd5f75f-slrkr" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Dec 05 12:51:08.670883 master-0 kubenswrapper[29936]: I1205 12:51:08.669252 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw"] Dec 05 12:51:08.670883 master-0 kubenswrapper[29936]: I1205 12:51:08.669707 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" podUID="c39d2089-d3bf-4556-b6ef-c362a08c21a2" containerName="controller-manager" containerID="cri-o://712d1506cd51ade164ab750d49a330bfc3046901061c8f5155346b2e5325a1c2" gracePeriod=30 Dec 05 12:51:08.700258 master-0 kubenswrapper[29936]: I1205 12:51:08.699749 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx"] Dec 05 12:51:08.700258 master-0 kubenswrapper[29936]: I1205 12:51:08.700046 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" podUID="e943438b-1de8-435c-8a19-accd6a6292a4" containerName="route-controller-manager" containerID="cri-o://d9a47a4e65ab5cf4baf6b36c8ce1ba7fd5756eae201f48950bc988deec039fe0" gracePeriod=30 Dec 05 12:51:08.792210 master-0 kubenswrapper[29936]: I1205 12:51:08.792123 29936 patch_prober.go:28] interesting pod/route-controller-manager-554555dbc9-szqjx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.75:8443/healthz\": dial tcp 10.128.0.75:8443: connect: connection refused" start-of-body= Dec 05 12:51:08.793120 master-0 kubenswrapper[29936]: I1205 12:51:08.792243 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" podUID="e943438b-1de8-435c-8a19-accd6a6292a4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.75:8443/healthz\": dial tcp 10.128.0.75:8443: connect: connection refused" Dec 05 12:51:08.864038 master-0 kubenswrapper[29936]: I1205 12:51:08.863902 29936 generic.go:334] "Generic (PLEG): container finished" podID="c39d2089-d3bf-4556-b6ef-c362a08c21a2" containerID="712d1506cd51ade164ab750d49a330bfc3046901061c8f5155346b2e5325a1c2" exitCode=0 Dec 05 12:51:08.864038 master-0 kubenswrapper[29936]: I1205 12:51:08.863997 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" event={"ID":"c39d2089-d3bf-4556-b6ef-c362a08c21a2","Type":"ContainerDied","Data":"712d1506cd51ade164ab750d49a330bfc3046901061c8f5155346b2e5325a1c2"} Dec 05 12:51:08.864562 master-0 kubenswrapper[29936]: I1205 12:51:08.864123 29936 scope.go:117] "RemoveContainer" containerID="fa4f02496398ccdc5c55acbb60e75e3c69d9850820e087e65cbe9d00bf63d07e" Dec 05 12:51:08.868702 master-0 kubenswrapper[29936]: I1205 12:51:08.868636 29936 generic.go:334] "Generic (PLEG): container finished" podID="e943438b-1de8-435c-8a19-accd6a6292a4" 
containerID="d9a47a4e65ab5cf4baf6b36c8ce1ba7fd5756eae201f48950bc988deec039fe0" exitCode=0 Dec 05 12:51:08.868793 master-0 kubenswrapper[29936]: I1205 12:51:08.868704 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" event={"ID":"e943438b-1de8-435c-8a19-accd6a6292a4","Type":"ContainerDied","Data":"d9a47a4e65ab5cf4baf6b36c8ce1ba7fd5756eae201f48950bc988deec039fe0"} Dec 05 12:51:09.131810 master-0 kubenswrapper[29936]: I1205 12:51:09.131709 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" podUID="91de1093-448a-432c-bc02-f4d0492c2e2b" containerName="oauth-openshift" containerID="cri-o://f1692e1daee8b4206325c60f2bd4dc179ef50f075c69b7d076873252ca37c2b0" gracePeriod=15 Dec 05 12:51:09.878991 master-0 kubenswrapper[29936]: I1205 12:51:09.878905 29936 generic.go:334] "Generic (PLEG): container finished" podID="91de1093-448a-432c-bc02-f4d0492c2e2b" containerID="f1692e1daee8b4206325c60f2bd4dc179ef50f075c69b7d076873252ca37c2b0" exitCode=0 Dec 05 12:51:09.878991 master-0 kubenswrapper[29936]: I1205 12:51:09.878960 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" event={"ID":"91de1093-448a-432c-bc02-f4d0492c2e2b","Type":"ContainerDied","Data":"f1692e1daee8b4206325c60f2bd4dc179ef50f075c69b7d076873252ca37c2b0"} Dec 05 12:51:10.625252 master-0 kubenswrapper[29936]: I1205 12:51:10.625145 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 05 12:51:10.627007 master-0 kubenswrapper[29936]: I1205 12:51:10.626645 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Dec 05 12:51:10.629954 master-0 kubenswrapper[29936]: I1205 12:51:10.629894 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-74gjx" Dec 05 12:51:10.630043 master-0 kubenswrapper[29936]: I1205 12:51:10.629984 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 12:51:10.636196 master-0 kubenswrapper[29936]: I1205 12:51:10.636125 29936 patch_prober.go:28] interesting pod/console-7495f49968-4tq6k container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" start-of-body= Dec 05 12:51:10.636266 master-0 kubenswrapper[29936]: I1205 12:51:10.636215 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7495f49968-4tq6k" podUID="3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" Dec 05 12:51:10.708955 master-0 kubenswrapper[29936]: I1205 12:51:10.708876 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/67037a7b-9105-4c7d-80ac-7481c14997f1-var-lock\") pod \"installer-4-master-0\" (UID: \"67037a7b-9105-4c7d-80ac-7481c14997f1\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 05 12:51:10.709351 master-0 kubenswrapper[29936]: I1205 12:51:10.709028 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67037a7b-9105-4c7d-80ac-7481c14997f1-kube-api-access\") pod \"installer-4-master-0\" (UID: \"67037a7b-9105-4c7d-80ac-7481c14997f1\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 05 12:51:10.709419 master-0 kubenswrapper[29936]: I1205 12:51:10.709376 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67037a7b-9105-4c7d-80ac-7481c14997f1-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"67037a7b-9105-4c7d-80ac-7481c14997f1\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 05 12:51:10.811341 master-0 kubenswrapper[29936]: I1205 12:51:10.811253 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/67037a7b-9105-4c7d-80ac-7481c14997f1-var-lock\") pod \"installer-4-master-0\" (UID: \"67037a7b-9105-4c7d-80ac-7481c14997f1\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 05 12:51:10.811341 master-0 kubenswrapper[29936]: I1205 12:51:10.811338 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67037a7b-9105-4c7d-80ac-7481c14997f1-kube-api-access\") pod \"installer-4-master-0\" (UID: \"67037a7b-9105-4c7d-80ac-7481c14997f1\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 05 12:51:10.811750 master-0 kubenswrapper[29936]: I1205 12:51:10.811507 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/67037a7b-9105-4c7d-80ac-7481c14997f1-var-lock\") pod \"installer-4-master-0\" (UID: \"67037a7b-9105-4c7d-80ac-7481c14997f1\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 05 12:51:10.811750 master-0 kubenswrapper[29936]: I1205 12:51:10.811630 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67037a7b-9105-4c7d-80ac-7481c14997f1-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"67037a7b-9105-4c7d-80ac-7481c14997f1\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 05 12:51:10.811842 master-0 kubenswrapper[29936]: I1205 12:51:10.811769 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67037a7b-9105-4c7d-80ac-7481c14997f1-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"67037a7b-9105-4c7d-80ac-7481c14997f1\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 05 12:51:10.914406 master-0 kubenswrapper[29936]: I1205 12:51:10.913838 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 05 12:51:10.938080 master-0 kubenswrapper[29936]: I1205 12:51:10.938016 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67037a7b-9105-4c7d-80ac-7481c14997f1-kube-api-access\") pod \"installer-4-master-0\" (UID: \"67037a7b-9105-4c7d-80ac-7481c14997f1\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 05 12:51:10.949003 master-0 kubenswrapper[29936]: I1205 12:51:10.948969 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Dec 05 12:51:13.153052 master-0 kubenswrapper[29936]: I1205 12:51:13.152975 29936 patch_prober.go:28] interesting pod/oauth-openshift-5f469489fd-59qjd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.88:6443/healthz\": dial tcp 10.128.0.88:6443: connect: connection refused" start-of-body= Dec 05 12:51:13.153731 master-0 kubenswrapper[29936]: I1205 12:51:13.153064 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" podUID="91de1093-448a-432c-bc02-f4d0492c2e2b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.88:6443/healthz\": dial tcp 10.128.0.88:6443: connect: connection refused" Dec 05 12:51:15.467466 master-0 kubenswrapper[29936]: I1205 12:51:15.467418 29936 patch_prober.go:28] interesting pod/console-74ffd5f75f-slrkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Dec 05 12:51:15.468070 master-0 kubenswrapper[29936]: I1205 12:51:15.467492 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-74ffd5f75f-slrkr" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Dec 05 12:51:15.773358 master-0 kubenswrapper[29936]: I1205 12:51:15.773313 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:51:15.814991 master-0 kubenswrapper[29936]: I1205 12:51:15.813684 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx"] Dec 05 12:51:15.814991 master-0 kubenswrapper[29936]: E1205 12:51:15.814053 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e943438b-1de8-435c-8a19-accd6a6292a4" containerName="route-controller-manager" Dec 05 12:51:15.814991 master-0 kubenswrapper[29936]: I1205 12:51:15.814069 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="e943438b-1de8-435c-8a19-accd6a6292a4" containerName="route-controller-manager" Dec 05 12:51:15.814991 master-0 kubenswrapper[29936]: I1205 12:51:15.814247 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="e943438b-1de8-435c-8a19-accd6a6292a4" containerName="route-controller-manager" Dec 05 12:51:15.814991 master-0 kubenswrapper[29936]: I1205 12:51:15.814893 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" Dec 05 12:51:15.849952 master-0 kubenswrapper[29936]: I1205 12:51:15.847437 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx"] Dec 05 12:51:15.885348 master-0 kubenswrapper[29936]: I1205 12:51:15.885289 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:51:15.901489 master-0 kubenswrapper[29936]: I1205 12:51:15.900726 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfknz\" (UniqueName: \"kubernetes.io/projected/e943438b-1de8-435c-8a19-accd6a6292a4-kube-api-access-lfknz\") pod \"e943438b-1de8-435c-8a19-accd6a6292a4\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " Dec 05 12:51:15.901489 master-0 kubenswrapper[29936]: I1205 12:51:15.900817 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e943438b-1de8-435c-8a19-accd6a6292a4-serving-cert\") pod \"e943438b-1de8-435c-8a19-accd6a6292a4\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " Dec 05 12:51:15.901489 master-0 kubenswrapper[29936]: I1205 12:51:15.900901 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-client-ca\") pod \"e943438b-1de8-435c-8a19-accd6a6292a4\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " Dec 05 12:51:15.901489 master-0 kubenswrapper[29936]: I1205 12:51:15.900954 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-config\") pod \"e943438b-1de8-435c-8a19-accd6a6292a4\" (UID: \"e943438b-1de8-435c-8a19-accd6a6292a4\") " Dec 05 12:51:15.901839 master-0 kubenswrapper[29936]: I1205 12:51:15.901676 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa069191-27f9-4d03-9bd1-8a301128c84d-serving-cert\") pod \"route-controller-manager-556754d987-v8gnx\" (UID: \"fa069191-27f9-4d03-9bd1-8a301128c84d\") " pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" Dec 05 12:51:15.901839 master-0 kubenswrapper[29936]: I1205 12:51:15.901824 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa069191-27f9-4d03-9bd1-8a301128c84d-config\") pod \"route-controller-manager-556754d987-v8gnx\" (UID: \"fa069191-27f9-4d03-9bd1-8a301128c84d\") " pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" Dec 05 12:51:15.902488 master-0 kubenswrapper[29936]: I1205 12:51:15.902095 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdf52\" (UniqueName: \"kubernetes.io/projected/fa069191-27f9-4d03-9bd1-8a301128c84d-kube-api-access-mdf52\") pod \"route-controller-manager-556754d987-v8gnx\" (UID: \"fa069191-27f9-4d03-9bd1-8a301128c84d\") " pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" Dec 05 12:51:15.902488 master-0 kubenswrapper[29936]: I1205 12:51:15.902295 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa069191-27f9-4d03-9bd1-8a301128c84d-client-ca\") pod \"route-controller-manager-556754d987-v8gnx\" (UID: \"fa069191-27f9-4d03-9bd1-8a301128c84d\") " pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" Dec 05 12:51:15.903134 master-0 kubenswrapper[29936]: I1205 12:51:15.902565 29936 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-config" (OuterVolumeSpecName: "config") pod "e943438b-1de8-435c-8a19-accd6a6292a4" (UID: "e943438b-1de8-435c-8a19-accd6a6292a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:51:15.903134 master-0 kubenswrapper[29936]: I1205 12:51:15.902624 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-client-ca" (OuterVolumeSpecName: "client-ca") pod "e943438b-1de8-435c-8a19-accd6a6292a4" (UID: "e943438b-1de8-435c-8a19-accd6a6292a4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:51:15.903905 master-0 kubenswrapper[29936]: I1205 12:51:15.903863 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e943438b-1de8-435c-8a19-accd6a6292a4-kube-api-access-lfknz" (OuterVolumeSpecName: "kube-api-access-lfknz") pod "e943438b-1de8-435c-8a19-accd6a6292a4" (UID: "e943438b-1de8-435c-8a19-accd6a6292a4"). InnerVolumeSpecName "kube-api-access-lfknz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:51:15.904471 master-0 kubenswrapper[29936]: I1205 12:51:15.904430 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e943438b-1de8-435c-8a19-accd6a6292a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e943438b-1de8-435c-8a19-accd6a6292a4" (UID: "e943438b-1de8-435c-8a19-accd6a6292a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:51:15.944904 master-0 kubenswrapper[29936]: I1205 12:51:15.944836 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-69cd4c69bf-mx2xk" event={"ID":"6936401b-4cb1-451f-b083-ee6721409cca","Type":"ContainerStarted","Data":"76e718959638f0d2f5336af4c11db511d92bda0d27f827230c4859b8768b78f6"} Dec 05 12:51:15.945518 master-0 kubenswrapper[29936]: I1205 12:51:15.945241 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-69cd4c69bf-mx2xk" Dec 05 12:51:15.946625 master-0 kubenswrapper[29936]: I1205 12:51:15.946152 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:51:15.947036 master-0 kubenswrapper[29936]: I1205 12:51:15.946679 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" event={"ID":"c39d2089-d3bf-4556-b6ef-c362a08c21a2","Type":"ContainerDied","Data":"ae3644549c6caccb0e5b76cf093dd16f97c66829b7bc2c724be0d4328e24c56e"} Dec 05 12:51:15.947036 master-0 kubenswrapper[29936]: I1205 12:51:15.946719 29936 scope.go:117] "RemoveContainer" containerID="712d1506cd51ade164ab750d49a330bfc3046901061c8f5155346b2e5325a1c2" Dec 05 12:51:15.947036 master-0 kubenswrapper[29936]: I1205 12:51:15.946930 29936 patch_prober.go:28] interesting pod/downloads-69cd4c69bf-mx2xk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" start-of-body= Dec 05 12:51:15.947036 master-0 kubenswrapper[29936]: I1205 12:51:15.947008 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-69cd4c69bf-mx2xk" podUID="6936401b-4cb1-451f-b083-ee6721409cca" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" Dec 05 12:51:15.950531 master-0 kubenswrapper[29936]: I1205 12:51:15.950480 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" event={"ID":"91de1093-448a-432c-bc02-f4d0492c2e2b","Type":"ContainerDied","Data":"5e4390b51c123622157c2975a08c75c10e159ea23c9d986afab1c424ef161265"} Dec 05 12:51:15.950661 master-0 kubenswrapper[29936]: I1205 12:51:15.950507 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5f469489fd-59qjd" Dec 05 12:51:15.952875 master-0 kubenswrapper[29936]: I1205 12:51:15.952798 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" event={"ID":"e943438b-1de8-435c-8a19-accd6a6292a4","Type":"ContainerDied","Data":"77da36c6bf5d09d68dbf2de017a655a5a15b25fda32cba3288a3d8b2cc4b44c0"} Dec 05 12:51:15.953003 master-0 kubenswrapper[29936]: I1205 12:51:15.952952 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx" Dec 05 12:51:15.970760 master-0 kubenswrapper[29936]: I1205 12:51:15.970724 29936 scope.go:117] "RemoveContainer" containerID="f1692e1daee8b4206325c60f2bd4dc179ef50f075c69b7d076873252ca37c2b0" Dec 05 12:51:15.980284 master-0 kubenswrapper[29936]: I1205 12:51:15.980198 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-69cd4c69bf-mx2xk" podStartSLOduration=1.4880052049999999 podStartE2EDuration="38.980144059s" podCreationTimestamp="2025-12-05 12:50:37 +0000 UTC" firstStartedPulling="2025-12-05 12:50:38.13261574 +0000 UTC m=+35.264695421" lastFinishedPulling="2025-12-05 12:51:15.624754594 +0000 UTC m=+72.756834275" observedRunningTime="2025-12-05 12:51:15.974912213 +0000 UTC m=+73.106991904" watchObservedRunningTime="2025-12-05 12:51:15.980144059 +0000 UTC m=+73.112223740" Dec 05 12:51:16.000920 master-0 kubenswrapper[29936]: I1205 12:51:15.999924 29936 scope.go:117] "RemoveContainer" containerID="d9a47a4e65ab5cf4baf6b36c8ce1ba7fd5756eae201f48950bc988deec039fe0" Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.004784 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91de1093-448a-432c-bc02-f4d0492c2e2b-audit-dir\") pod \"91de1093-448a-432c-bc02-f4d0492c2e2b\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.004866 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c39d2089-d3bf-4556-b6ef-c362a08c21a2-serving-cert\") pod \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.004905 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr9jd\" (UniqueName: \"kubernetes.io/projected/c39d2089-d3bf-4556-b6ef-c362a08c21a2-kube-api-access-mr9jd\") pod \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.004941 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-config\") pod \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.004972 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26ccx\" (UniqueName: \"kubernetes.io/projected/91de1093-448a-432c-bc02-f4d0492c2e2b-kube-api-access-26ccx\") pod \"91de1093-448a-432c-bc02-f4d0492c2e2b\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.005006 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-trusted-ca-bundle\") pod \"91de1093-448a-432c-bc02-f4d0492c2e2b\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.005032 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-ocp-branding-template\") pod \"91de1093-448a-432c-bc02-f4d0492c2e2b\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.005053 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-cliconfig\") pod \"91de1093-448a-432c-bc02-f4d0492c2e2b\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.005086 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-audit-policies\") pod \"91de1093-448a-432c-bc02-f4d0492c2e2b\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.005107 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-router-certs\") pod \"91de1093-448a-432c-bc02-f4d0492c2e2b\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.005149 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-client-ca\") pod \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.005508 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-serving-cert\") pod \"91de1093-448a-432c-bc02-f4d0492c2e2b\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.005543 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-provider-selection\") pod \"91de1093-448a-432c-bc02-f4d0492c2e2b\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.005575 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-service-ca\") pod \"91de1093-448a-432c-bc02-f4d0492c2e2b\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.005604 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-error\") pod \"91de1093-448a-432c-bc02-f4d0492c2e2b\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.005645 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-login\") pod \"91de1093-448a-432c-bc02-f4d0492c2e2b\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.005667 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-session\") pod \"91de1093-448a-432c-bc02-f4d0492c2e2b\" (UID: \"91de1093-448a-432c-bc02-f4d0492c2e2b\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.005694 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-proxy-ca-bundles\") pod \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\" (UID: \"c39d2089-d3bf-4556-b6ef-c362a08c21a2\") " Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.005938 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdf52\" (UniqueName: \"kubernetes.io/projected/fa069191-27f9-4d03-9bd1-8a301128c84d-kube-api-access-mdf52\") pod \"route-controller-manager-556754d987-v8gnx\" (UID: \"fa069191-27f9-4d03-9bd1-8a301128c84d\") " pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.005997 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa069191-27f9-4d03-9bd1-8a301128c84d-client-ca\") pod \"route-controller-manager-556754d987-v8gnx\" (UID: \"fa069191-27f9-4d03-9bd1-8a301128c84d\") " pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.006080 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa069191-27f9-4d03-9bd1-8a301128c84d-serving-cert\") pod \"route-controller-manager-556754d987-v8gnx\" (UID: \"fa069191-27f9-4d03-9bd1-8a301128c84d\") " pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.006105 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa069191-27f9-4d03-9bd1-8a301128c84d-config\") pod \"route-controller-manager-556754d987-v8gnx\" (UID: \"fa069191-27f9-4d03-9bd1-8a301128c84d\") " pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.006203 29936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e943438b-1de8-435c-8a19-accd6a6292a4-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.006218 29936 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.006261 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e943438b-1de8-435c-8a19-accd6a6292a4-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.006424 master-0 kubenswrapper[29936]: I1205 12:51:16.006276 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfknz\" (UniqueName: \"kubernetes.io/projected/e943438b-1de8-435c-8a19-accd6a6292a4-kube-api-access-lfknz\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.009685 master-0 kubenswrapper[29936]: I1205 12:51:16.007655 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx"] Dec 05 12:51:16.010402 master-0 kubenswrapper[29936]: I1205 12:51:16.010114 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa069191-27f9-4d03-9bd1-8a301128c84d-config\") pod \"route-controller-manager-556754d987-v8gnx\" (UID: \"fa069191-27f9-4d03-9bd1-8a301128c84d\") " pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" Dec 05 12:51:16.010402 master-0 kubenswrapper[29936]: I1205 12:51:16.010256 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91de1093-448a-432c-bc02-f4d0492c2e2b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "91de1093-448a-432c-bc02-f4d0492c2e2b" (UID: "91de1093-448a-432c-bc02-f4d0492c2e2b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:51:16.011918 master-0 kubenswrapper[29936]: I1205 12:51:16.011879 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 05 12:51:16.013012 master-0 kubenswrapper[29936]: I1205 12:51:16.012821 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "91de1093-448a-432c-bc02-f4d0492c2e2b" (UID: "91de1093-448a-432c-bc02-f4d0492c2e2b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:51:16.013547 master-0 kubenswrapper[29936]: I1205 12:51:16.013158 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "91de1093-448a-432c-bc02-f4d0492c2e2b" (UID: "91de1093-448a-432c-bc02-f4d0492c2e2b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:51:16.013547 master-0 kubenswrapper[29936]: I1205 12:51:16.013310 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c39d2089-d3bf-4556-b6ef-c362a08c21a2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c39d2089-d3bf-4556-b6ef-c362a08c21a2" (UID: "c39d2089-d3bf-4556-b6ef-c362a08c21a2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:51:16.013783 master-0 kubenswrapper[29936]: I1205 12:51:16.013743 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-client-ca" (OuterVolumeSpecName: "client-ca") pod "c39d2089-d3bf-4556-b6ef-c362a08c21a2" (UID: "c39d2089-d3bf-4556-b6ef-c362a08c21a2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:51:16.014394 master-0 kubenswrapper[29936]: I1205 12:51:16.013969 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "91de1093-448a-432c-bc02-f4d0492c2e2b" (UID: "91de1093-448a-432c-bc02-f4d0492c2e2b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:51:16.014394 master-0 kubenswrapper[29936]: I1205 12:51:16.014131 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c39d2089-d3bf-4556-b6ef-c362a08c21a2" (UID: "c39d2089-d3bf-4556-b6ef-c362a08c21a2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:51:16.015651 master-0 kubenswrapper[29936]: I1205 12:51:16.014750 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-554555dbc9-szqjx"] Dec 05 12:51:16.015651 master-0 kubenswrapper[29936]: I1205 12:51:16.015102 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "91de1093-448a-432c-bc02-f4d0492c2e2b" (UID: "91de1093-448a-432c-bc02-f4d0492c2e2b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:51:16.016451 master-0 kubenswrapper[29936]: I1205 12:51:16.016427 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fa069191-27f9-4d03-9bd1-8a301128c84d-client-ca\") pod \"route-controller-manager-556754d987-v8gnx\" (UID: \"fa069191-27f9-4d03-9bd1-8a301128c84d\") " pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" Dec 05 12:51:16.016557 master-0 kubenswrapper[29936]: I1205 12:51:16.016443 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-config" (OuterVolumeSpecName: "config") pod "c39d2089-d3bf-4556-b6ef-c362a08c21a2" (UID: "c39d2089-d3bf-4556-b6ef-c362a08c21a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:51:16.016667 master-0 kubenswrapper[29936]: I1205 12:51:16.016634 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "91de1093-448a-432c-bc02-f4d0492c2e2b" (UID: "91de1093-448a-432c-bc02-f4d0492c2e2b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:51:16.016792 master-0 kubenswrapper[29936]: I1205 12:51:16.016754 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "91de1093-448a-432c-bc02-f4d0492c2e2b" (UID: "91de1093-448a-432c-bc02-f4d0492c2e2b"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:51:16.017233 master-0 kubenswrapper[29936]: I1205 12:51:16.017170 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "91de1093-448a-432c-bc02-f4d0492c2e2b" (UID: "91de1093-448a-432c-bc02-f4d0492c2e2b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:51:16.017392 master-0 kubenswrapper[29936]: I1205 12:51:16.017328 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "91de1093-448a-432c-bc02-f4d0492c2e2b" (UID: "91de1093-448a-432c-bc02-f4d0492c2e2b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:51:16.017766 master-0 kubenswrapper[29936]: I1205 12:51:16.017729 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "91de1093-448a-432c-bc02-f4d0492c2e2b" (UID: "91de1093-448a-432c-bc02-f4d0492c2e2b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:51:16.018118 master-0 kubenswrapper[29936]: I1205 12:51:16.018051 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "91de1093-448a-432c-bc02-f4d0492c2e2b" (UID: "91de1093-448a-432c-bc02-f4d0492c2e2b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:51:16.018416 master-0 kubenswrapper[29936]: I1205 12:51:16.018382 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c39d2089-d3bf-4556-b6ef-c362a08c21a2-kube-api-access-mr9jd" (OuterVolumeSpecName: "kube-api-access-mr9jd") pod "c39d2089-d3bf-4556-b6ef-c362a08c21a2" (UID: "c39d2089-d3bf-4556-b6ef-c362a08c21a2"). InnerVolumeSpecName "kube-api-access-mr9jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:51:16.019820 master-0 kubenswrapper[29936]: I1205 12:51:16.019776 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "91de1093-448a-432c-bc02-f4d0492c2e2b" (UID: "91de1093-448a-432c-bc02-f4d0492c2e2b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:51:16.020671 master-0 kubenswrapper[29936]: I1205 12:51:16.020592 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91de1093-448a-432c-bc02-f4d0492c2e2b-kube-api-access-26ccx" (OuterVolumeSpecName: "kube-api-access-26ccx") pod "91de1093-448a-432c-bc02-f4d0492c2e2b" (UID: "91de1093-448a-432c-bc02-f4d0492c2e2b"). 
InnerVolumeSpecName "kube-api-access-26ccx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:51:16.020831 master-0 kubenswrapper[29936]: I1205 12:51:16.020772 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa069191-27f9-4d03-9bd1-8a301128c84d-serving-cert\") pod \"route-controller-manager-556754d987-v8gnx\" (UID: \"fa069191-27f9-4d03-9bd1-8a301128c84d\") " pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" Dec 05 12:51:16.047528 master-0 kubenswrapper[29936]: I1205 12:51:16.047476 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdf52\" (UniqueName: \"kubernetes.io/projected/fa069191-27f9-4d03-9bd1-8a301128c84d-kube-api-access-mdf52\") pod \"route-controller-manager-556754d987-v8gnx\" (UID: \"fa069191-27f9-4d03-9bd1-8a301128c84d\") " pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107292 29936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107378 29936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107397 29936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107410 29936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107434 29936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107451 29936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107468 29936 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107484 29936 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91de1093-448a-432c-bc02-f4d0492c2e2b-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 
12:51:16.107495 29936 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c39d2089-d3bf-4556-b6ef-c362a08c21a2-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107507 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr9jd\" (UniqueName: \"kubernetes.io/projected/c39d2089-d3bf-4556-b6ef-c362a08c21a2-kube-api-access-mr9jd\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107521 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107534 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26ccx\" (UniqueName: \"kubernetes.io/projected/91de1093-448a-432c-bc02-f4d0492c2e2b-kube-api-access-26ccx\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107543 29936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107556 29936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107565 29936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107574 29936 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91de1093-448a-432c-bc02-f4d0492c2e2b-audit-policies\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107583 29936 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91de1093-448a-432c-bc02-f4d0492c2e2b-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.107978 master-0 kubenswrapper[29936]: I1205 12:51:16.107591 29936 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c39d2089-d3bf-4556-b6ef-c362a08c21a2-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:16.148674 master-0 kubenswrapper[29936]: I1205 12:51:16.148551 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" Dec 05 12:51:16.305818 master-0 kubenswrapper[29936]: I1205 12:51:16.305281 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-5f469489fd-59qjd"] Dec 05 12:51:16.308120 master-0 kubenswrapper[29936]: I1205 12:51:16.307408 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-5f469489fd-59qjd"] Dec 05 12:51:16.586466 master-0 kubenswrapper[29936]: I1205 12:51:16.586308 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx"] Dec 05 12:51:16.963234 master-0 kubenswrapper[29936]: I1205 12:51:16.963165 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw" Dec 05 12:51:16.981351 master-0 kubenswrapper[29936]: I1205 12:51:16.981282 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"67037a7b-9105-4c7d-80ac-7481c14997f1","Type":"ContainerStarted","Data":"71e23aa83f43ebc2173b1d436a0fc43ce5211e7ed415e69869128fdff37a25f4"} Dec 05 12:51:16.981351 master-0 kubenswrapper[29936]: I1205 12:51:16.981352 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"67037a7b-9105-4c7d-80ac-7481c14997f1","Type":"ContainerStarted","Data":"31fee9d510e321648476c88902076cbeab7e810076c5e3c4e3018d9474272893"} Dec 05 12:51:16.984855 master-0 kubenswrapper[29936]: I1205 12:51:16.984797 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" event={"ID":"fa069191-27f9-4d03-9bd1-8a301128c84d","Type":"ContainerStarted","Data":"6e1fdf8093eb5f0f293d5030c789c2aa976f72249523bc3881895d24006c686d"} Dec 05 12:51:16.984855 master-0 kubenswrapper[29936]: I1205 12:51:16.984858 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" event={"ID":"fa069191-27f9-4d03-9bd1-8a301128c84d","Type":"ContainerStarted","Data":"f719b4899c7cade23a9ae836a3d8aeea86e64d281eee7f62eb7a8abad5147d9a"} Dec 05 12:51:16.985267 master-0 kubenswrapper[29936]: I1205 12:51:16.985174 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" Dec 05 12:51:16.991879 master-0 kubenswrapper[29936]: I1205 12:51:16.985396 29936 patch_prober.go:28] interesting pod/downloads-69cd4c69bf-mx2xk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" start-of-body= Dec 05 12:51:16.991879 master-0 kubenswrapper[29936]: I1205 12:51:16.985485 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-69cd4c69bf-mx2xk" podUID="6936401b-4cb1-451f-b083-ee6721409cca" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" Dec 05 12:51:17.029219 master-0 kubenswrapper[29936]: I1205 12:51:17.029092 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-4-master-0" podStartSLOduration=7.029069595 
podStartE2EDuration="7.029069595s" podCreationTimestamp="2025-12-05 12:51:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:51:17.007175358 +0000 UTC m=+74.139255029" watchObservedRunningTime="2025-12-05 12:51:17.029069595 +0000 UTC m=+74.161149286" Dec 05 12:51:17.032027 master-0 kubenswrapper[29936]: I1205 12:51:17.031935 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" podStartSLOduration=9.031920644 podStartE2EDuration="9.031920644s" podCreationTimestamp="2025-12-05 12:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:51:17.029926059 +0000 UTC m=+74.162005750" watchObservedRunningTime="2025-12-05 12:51:17.031920644 +0000 UTC m=+74.164000335" Dec 05 12:51:17.040132 master-0 kubenswrapper[29936]: I1205 12:51:17.040044 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-556754d987-v8gnx" Dec 05 12:51:17.059192 master-0 kubenswrapper[29936]: I1205 12:51:17.059075 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw"] Dec 05 12:51:17.062480 master-0 kubenswrapper[29936]: I1205 12:51:17.062425 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b59c5b9bc-vh8fw"] Dec 05 12:51:17.195487 master-0 kubenswrapper[29936]: I1205 12:51:17.195423 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91de1093-448a-432c-bc02-f4d0492c2e2b" path="/var/lib/kubelet/pods/91de1093-448a-432c-bc02-f4d0492c2e2b/volumes" Dec 05 12:51:17.196310 master-0 kubenswrapper[29936]: I1205 12:51:17.196282 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c39d2089-d3bf-4556-b6ef-c362a08c21a2" path="/var/lib/kubelet/pods/c39d2089-d3bf-4556-b6ef-c362a08c21a2/volumes" Dec 05 12:51:17.197101 master-0 kubenswrapper[29936]: I1205 12:51:17.197074 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e943438b-1de8-435c-8a19-accd6a6292a4" path="/var/lib/kubelet/pods/e943438b-1de8-435c-8a19-accd6a6292a4/volumes" Dec 05 12:51:17.539538 master-0 kubenswrapper[29936]: I1205 12:51:17.539453 29936 patch_prober.go:28] interesting pod/downloads-69cd4c69bf-mx2xk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" start-of-body= Dec 05 12:51:17.539538 master-0 kubenswrapper[29936]: I1205 12:51:17.539501 29936 patch_prober.go:28] interesting pod/downloads-69cd4c69bf-mx2xk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" start-of-body= Dec 05 12:51:17.539850 master-0 kubenswrapper[29936]: I1205 12:51:17.539560 29936 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-69cd4c69bf-mx2xk" podUID="6936401b-4cb1-451f-b083-ee6721409cca" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" Dec 05 12:51:17.539850 master-0 kubenswrapper[29936]: I1205 12:51:17.539596 29936 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-69cd4c69bf-mx2xk" podUID="6936401b-4cb1-451f-b083-ee6721409cca" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" Dec 05 12:51:17.958978 master-0 kubenswrapper[29936]: I1205 12:51:17.958888 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5644fcf74f-5cvqd"] Dec 05 12:51:17.959833 master-0 kubenswrapper[29936]: E1205 12:51:17.959279 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39d2089-d3bf-4556-b6ef-c362a08c21a2" containerName="controller-manager" Dec 05 12:51:17.959833 master-0 kubenswrapper[29936]: I1205 12:51:17.959296 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39d2089-d3bf-4556-b6ef-c362a08c21a2" containerName="controller-manager" Dec 05 12:51:17.959833 master-0 kubenswrapper[29936]: E1205 12:51:17.959314 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39d2089-d3bf-4556-b6ef-c362a08c21a2" containerName="controller-manager" Dec 05 12:51:17.959833 master-0 kubenswrapper[29936]: I1205 12:51:17.959320 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39d2089-d3bf-4556-b6ef-c362a08c21a2" containerName="controller-manager" Dec 05 12:51:17.959833 master-0 kubenswrapper[29936]: E1205 12:51:17.959333 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91de1093-448a-432c-bc02-f4d0492c2e2b" containerName="oauth-openshift" Dec 05 12:51:17.959833 master-0 kubenswrapper[29936]: I1205 12:51:17.959341 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="91de1093-448a-432c-bc02-f4d0492c2e2b" containerName="oauth-openshift" Dec 05 12:51:17.959833 master-0 kubenswrapper[29936]: I1205 12:51:17.959496 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="c39d2089-d3bf-4556-b6ef-c362a08c21a2" containerName="controller-manager" Dec 05 12:51:17.959833 master-0 kubenswrapper[29936]: I1205 12:51:17.959550 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="91de1093-448a-432c-bc02-f4d0492c2e2b" containerName="oauth-openshift" Dec 05 12:51:17.960207 master-0 kubenswrapper[29936]: I1205 12:51:17.960141 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:17.970907 master-0 kubenswrapper[29936]: I1205 12:51:17.970836 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 12:51:17.971237 master-0 kubenswrapper[29936]: I1205 12:51:17.971207 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-fqrhd" Dec 05 12:51:17.971446 master-0 kubenswrapper[29936]: I1205 12:51:17.971392 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 12:51:17.971633 master-0 kubenswrapper[29936]: I1205 12:51:17.971568 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 12:51:17.971876 master-0 kubenswrapper[29936]: I1205 12:51:17.971850 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 12:51:17.971949 master-0 kubenswrapper[29936]: I1205 12:51:17.971910 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 12:51:17.973838 master-0 kubenswrapper[29936]: I1205 12:51:17.973777 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-744fbd497d-tm2v4"] Dec 05 12:51:17.974353 master-0 kubenswrapper[29936]: I1205 12:51:17.974323 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="c39d2089-d3bf-4556-b6ef-c362a08c21a2" containerName="controller-manager" Dec 05 12:51:17.974878 master-0 kubenswrapper[29936]: I1205 12:51:17.974843 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:17.978329 master-0 kubenswrapper[29936]: I1205 12:51:17.978261 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 12:51:17.978826 master-0 kubenswrapper[29936]: I1205 12:51:17.978785 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 12:51:17.979136 master-0 kubenswrapper[29936]: I1205 12:51:17.979071 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 12:51:17.979934 master-0 kubenswrapper[29936]: I1205 12:51:17.979810 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 12:51:17.979934 master-0 kubenswrapper[29936]: I1205 12:51:17.979880 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 12:51:17.980420 master-0 kubenswrapper[29936]: I1205 12:51:17.980151 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 12:51:17.980420 master-0 kubenswrapper[29936]: I1205 12:51:17.980234 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 12:51:17.980420 master-0 kubenswrapper[29936]: I1205 12:51:17.980296 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-xzgbv" Dec 05 12:51:17.980589 master-0 kubenswrapper[29936]: I1205 12:51:17.980452 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 12:51:17.980589 master-0 kubenswrapper[29936]: I1205 12:51:17.980459 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 12:51:17.980589 master-0 kubenswrapper[29936]: I1205 12:51:17.980524 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 12:51:17.982110 master-0 kubenswrapper[29936]: I1205 12:51:17.982053 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 12:51:17.983527 master-0 kubenswrapper[29936]: I1205 12:51:17.983404 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 12:51:17.992506 master-0 kubenswrapper[29936]: I1205 12:51:17.992442 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 12:51:17.999015 master-0 kubenswrapper[29936]: I1205 12:51:17.998949 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 12:51:18.024010 master-0 kubenswrapper[29936]: I1205 12:51:18.020734 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5644fcf74f-5cvqd"] Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.031239 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-744fbd497d-tm2v4"] Dec 05 
12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.035781 29936 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.036235 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="2cb8c983acca0c27a191b3f720d4b1e0" containerName="kube-scheduler" containerID="cri-o://b24c1b8d78045ff86297a6b78ba71b900f89c5e046061babf21a495bd9bf95d3" gracePeriod=30 Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.036273 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="2cb8c983acca0c27a191b3f720d4b1e0" containerName="kube-scheduler-recovery-controller" containerID="cri-o://2c505d1745e5c41c810aeede53577e7297a75c5a2221af8e371f406e5004dcbf" gracePeriod=30 Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.036291 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="2cb8c983acca0c27a191b3f720d4b1e0" containerName="kube-scheduler-cert-syncer" containerID="cri-o://ba110a7b76ad288df7047b8cf5908c2bd3487d9f6a715466f139c0f2eb3f27da" gracePeriod=30 Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.036889 29936 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: E1205 12:51:18.037438 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb8c983acca0c27a191b3f720d4b1e0" containerName="wait-for-host-port" Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.037456 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb8c983acca0c27a191b3f720d4b1e0" containerName="wait-for-host-port" Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: E1205 12:51:18.037479 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb8c983acca0c27a191b3f720d4b1e0" containerName="kube-scheduler-cert-syncer" Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.037487 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb8c983acca0c27a191b3f720d4b1e0" containerName="kube-scheduler-cert-syncer" Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: E1205 12:51:18.037547 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb8c983acca0c27a191b3f720d4b1e0" containerName="kube-scheduler-recovery-controller" Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.037557 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb8c983acca0c27a191b3f720d4b1e0" containerName="kube-scheduler-recovery-controller" Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: E1205 12:51:18.037569 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb8c983acca0c27a191b3f720d4b1e0" containerName="kube-scheduler" Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.037615 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb8c983acca0c27a191b3f720d4b1e0" containerName="kube-scheduler" Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.037864 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb8c983acca0c27a191b3f720d4b1e0" containerName="kube-scheduler" Dec 05 
12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.037887 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb8c983acca0c27a191b3f720d4b1e0" containerName="kube-scheduler-cert-syncer" Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.037912 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb8c983acca0c27a191b3f720d4b1e0" containerName="wait-for-host-port" Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.037930 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb8c983acca0c27a191b3f720d4b1e0" containerName="kube-scheduler-recovery-controller" Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.038776 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw4xw\" (UniqueName: \"kubernetes.io/projected/59a90548-aa32-4fdf-bb2d-0a8860f1661a-kube-api-access-gw4xw\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.038880 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a4c135-21f4-49e0-9680-61b449c733ec-serving-cert\") pod \"controller-manager-5644fcf74f-5cvqd\" (UID: \"48a4c135-21f4-49e0-9680-61b449c733ec\") " pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.039291 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/59a90548-aa32-4fdf-bb2d-0a8860f1661a-audit-dir\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.039488 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.039548 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a4c135-21f4-49e0-9680-61b449c733ec-client-ca\") pod \"controller-manager-5644fcf74f-5cvqd\" (UID: \"48a4c135-21f4-49e0-9680-61b449c733ec\") " pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:18.039690 master-0 kubenswrapper[29936]: I1205 12:51:18.039653 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48a4c135-21f4-49e0-9680-61b449c733ec-proxy-ca-bundles\") pod \"controller-manager-5644fcf74f-5cvqd\" (UID: \"48a4c135-21f4-49e0-9680-61b449c733ec\") " pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:18.041870 master-0 kubenswrapper[29936]: I1205 12:51:18.039750 29936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a4c135-21f4-49e0-9680-61b449c733ec-config\") pod \"controller-manager-5644fcf74f-5cvqd\" (UID: \"48a4c135-21f4-49e0-9680-61b449c733ec\") " pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:18.041870 master-0 kubenswrapper[29936]: I1205 12:51:18.039787 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.041870 master-0 kubenswrapper[29936]: I1205 12:51:18.039907 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-session\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.041870 master-0 kubenswrapper[29936]: I1205 12:51:18.039958 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbvxx\" (UniqueName: \"kubernetes.io/projected/48a4c135-21f4-49e0-9680-61b449c733ec-kube-api-access-kbvxx\") pod \"controller-manager-5644fcf74f-5cvqd\" (UID: \"48a4c135-21f4-49e0-9680-61b449c733ec\") " pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:18.041870 master-0 kubenswrapper[29936]: I1205 12:51:18.040029 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-service-ca\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.041870 master-0 kubenswrapper[29936]: I1205 12:51:18.040052 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-router-certs\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.041870 master-0 kubenswrapper[29936]: I1205 12:51:18.040110 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-user-template-login\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.041870 master-0 kubenswrapper[29936]: I1205 12:51:18.040147 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: 
\"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.041870 master-0 kubenswrapper[29936]: I1205 12:51:18.040199 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-user-template-error\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.041870 master-0 kubenswrapper[29936]: I1205 12:51:18.040227 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.041870 master-0 kubenswrapper[29936]: I1205 12:51:18.040254 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/59a90548-aa32-4fdf-bb2d-0a8860f1661a-audit-policies\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.041870 master-0 kubenswrapper[29936]: I1205 12:51:18.040314 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.127964 master-0 kubenswrapper[29936]: I1205 12:51:18.127826 29936 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="2cb8c983acca0c27a191b3f720d4b1e0" podUID="fc0ed8180ac3c77ddb293604fb163978" Dec 05 12:51:18.141803 master-0 kubenswrapper[29936]: I1205 12:51:18.141752 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a4c135-21f4-49e0-9680-61b449c733ec-config\") pod \"controller-manager-5644fcf74f-5cvqd\" (UID: \"48a4c135-21f4-49e0-9680-61b449c733ec\") " pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:18.141904 master-0 kubenswrapper[29936]: I1205 12:51:18.141809 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.142102 master-0 kubenswrapper[29936]: I1205 12:51:18.142053 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-session\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: 
\"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.142206 master-0 kubenswrapper[29936]: I1205 12:51:18.142159 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbvxx\" (UniqueName: \"kubernetes.io/projected/48a4c135-21f4-49e0-9680-61b449c733ec-kube-api-access-kbvxx\") pod \"controller-manager-5644fcf74f-5cvqd\" (UID: \"48a4c135-21f4-49e0-9680-61b449c733ec\") " pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:18.142397 master-0 kubenswrapper[29936]: I1205 12:51:18.142360 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fc0ed8180ac3c77ddb293604fb163978-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fc0ed8180ac3c77ddb293604fb163978\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:51:18.142444 master-0 kubenswrapper[29936]: I1205 12:51:18.142420 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-service-ca\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.142473 master-0 kubenswrapper[29936]: I1205 12:51:18.142451 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-router-certs\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.142501 master-0 kubenswrapper[29936]: I1205 12:51:18.142483 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-user-template-login\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.142532 master-0 kubenswrapper[29936]: I1205 12:51:18.142507 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.142566 master-0 kubenswrapper[29936]: I1205 12:51:18.142532 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-user-template-error\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.143443 master-0 kubenswrapper[29936]: I1205 12:51:18.143385 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48a4c135-21f4-49e0-9680-61b449c733ec-config\") pod 
\"controller-manager-5644fcf74f-5cvqd\" (UID: \"48a4c135-21f4-49e0-9680-61b449c733ec\") " pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:18.143511 master-0 kubenswrapper[29936]: I1205 12:51:18.143474 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-service-ca\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.143572 master-0 kubenswrapper[29936]: I1205 12:51:18.143529 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.143692 master-0 kubenswrapper[29936]: I1205 12:51:18.143660 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.144046 master-0 kubenswrapper[29936]: I1205 12:51:18.143694 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/59a90548-aa32-4fdf-bb2d-0a8860f1661a-audit-policies\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.144046 master-0 kubenswrapper[29936]: I1205 12:51:18.143729 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.144046 master-0 kubenswrapper[29936]: I1205 12:51:18.143807 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw4xw\" (UniqueName: \"kubernetes.io/projected/59a90548-aa32-4fdf-bb2d-0a8860f1661a-kube-api-access-gw4xw\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.144046 master-0 kubenswrapper[29936]: I1205 12:51:18.143842 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a4c135-21f4-49e0-9680-61b449c733ec-serving-cert\") pod \"controller-manager-5644fcf74f-5cvqd\" (UID: \"48a4c135-21f4-49e0-9680-61b449c733ec\") " pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:18.144046 master-0 kubenswrapper[29936]: I1205 12:51:18.143887 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/59a90548-aa32-4fdf-bb2d-0a8860f1661a-audit-dir\") pod 
\"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.144046 master-0 kubenswrapper[29936]: I1205 12:51:18.143982 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/fc0ed8180ac3c77ddb293604fb163978-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fc0ed8180ac3c77ddb293604fb163978\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:51:18.144046 master-0 kubenswrapper[29936]: I1205 12:51:18.144007 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.144345 master-0 kubenswrapper[29936]: I1205 12:51:18.144060 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a4c135-21f4-49e0-9680-61b449c733ec-client-ca\") pod \"controller-manager-5644fcf74f-5cvqd\" (UID: \"48a4c135-21f4-49e0-9680-61b449c733ec\") " pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:18.144345 master-0 kubenswrapper[29936]: I1205 12:51:18.144093 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48a4c135-21f4-49e0-9680-61b449c733ec-proxy-ca-bundles\") pod \"controller-manager-5644fcf74f-5cvqd\" (UID: \"48a4c135-21f4-49e0-9680-61b449c733ec\") " pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:18.144345 master-0 kubenswrapper[29936]: I1205 12:51:18.144274 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.145135 master-0 kubenswrapper[29936]: I1205 12:51:18.145101 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48a4c135-21f4-49e0-9680-61b449c733ec-proxy-ca-bundles\") pod \"controller-manager-5644fcf74f-5cvqd\" (UID: \"48a4c135-21f4-49e0-9680-61b449c733ec\") " pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:18.145229 master-0 kubenswrapper[29936]: I1205 12:51:18.145153 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/59a90548-aa32-4fdf-bb2d-0a8860f1661a-audit-dir\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.146133 master-0 kubenswrapper[29936]: I1205 12:51:18.146109 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-session\") pod 
\"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.147862 master-0 kubenswrapper[29936]: I1205 12:51:18.146498 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-router-certs\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.147862 master-0 kubenswrapper[29936]: I1205 12:51:18.146857 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48a4c135-21f4-49e0-9680-61b449c733ec-client-ca\") pod \"controller-manager-5644fcf74f-5cvqd\" (UID: \"48a4c135-21f4-49e0-9680-61b449c733ec\") " pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:18.147862 master-0 kubenswrapper[29936]: I1205 12:51:18.147701 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-user-template-error\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.148089 master-0 kubenswrapper[29936]: I1205 12:51:18.148000 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.148373 master-0 kubenswrapper[29936]: I1205 12:51:18.148333 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/59a90548-aa32-4fdf-bb2d-0a8860f1661a-audit-policies\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.149224 master-0 kubenswrapper[29936]: I1205 12:51:18.149164 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.150170 master-0 kubenswrapper[29936]: I1205 12:51:18.150140 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.152720 master-0 kubenswrapper[29936]: I1205 12:51:18.152687 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48a4c135-21f4-49e0-9680-61b449c733ec-serving-cert\") pod 
\"controller-manager-5644fcf74f-5cvqd\" (UID: \"48a4c135-21f4-49e0-9680-61b449c733ec\") " pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:18.156064 master-0 kubenswrapper[29936]: I1205 12:51:18.156024 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/59a90548-aa32-4fdf-bb2d-0a8860f1661a-v4-0-config-user-template-login\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.203535 master-0 kubenswrapper[29936]: I1205 12:51:18.203500 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw4xw\" (UniqueName: \"kubernetes.io/projected/59a90548-aa32-4fdf-bb2d-0a8860f1661a-kube-api-access-gw4xw\") pod \"oauth-openshift-744fbd497d-tm2v4\" (UID: \"59a90548-aa32-4fdf-bb2d-0a8860f1661a\") " pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.206829 master-0 kubenswrapper[29936]: I1205 12:51:18.206791 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbvxx\" (UniqueName: \"kubernetes.io/projected/48a4c135-21f4-49e0-9680-61b449c733ec-kube-api-access-kbvxx\") pod \"controller-manager-5644fcf74f-5cvqd\" (UID: \"48a4c135-21f4-49e0-9680-61b449c733ec\") " pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:18.245405 master-0 kubenswrapper[29936]: I1205 12:51:18.245351 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/fc0ed8180ac3c77ddb293604fb163978-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fc0ed8180ac3c77ddb293604fb163978\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:51:18.245686 master-0 kubenswrapper[29936]: I1205 12:51:18.245619 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/fc0ed8180ac3c77ddb293604fb163978-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fc0ed8180ac3c77ddb293604fb163978\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:51:18.246148 master-0 kubenswrapper[29936]: I1205 12:51:18.245790 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fc0ed8180ac3c77ddb293604fb163978-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fc0ed8180ac3c77ddb293604fb163978\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:51:18.246148 master-0 kubenswrapper[29936]: I1205 12:51:18.245995 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fc0ed8180ac3c77ddb293604fb163978-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fc0ed8180ac3c77ddb293604fb163978\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:51:18.321686 master-0 kubenswrapper[29936]: I1205 12:51:18.321621 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_2cb8c983acca0c27a191b3f720d4b1e0/kube-scheduler-cert-syncer/0.log" Dec 05 12:51:18.323771 master-0 kubenswrapper[29936]: I1205 12:51:18.323721 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:51:18.327933 master-0 kubenswrapper[29936]: I1205 12:51:18.327873 29936 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="2cb8c983acca0c27a191b3f720d4b1e0" podUID="fc0ed8180ac3c77ddb293604fb163978" Dec 05 12:51:18.334360 master-0 kubenswrapper[29936]: I1205 12:51:18.334310 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:18.339974 master-0 kubenswrapper[29936]: I1205 12:51:18.339828 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:18.449586 master-0 kubenswrapper[29936]: I1205 12:51:18.448805 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-cert-dir\") pod \"2cb8c983acca0c27a191b3f720d4b1e0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " Dec 05 12:51:18.449586 master-0 kubenswrapper[29936]: I1205 12:51:18.449000 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-resource-dir\") pod \"2cb8c983acca0c27a191b3f720d4b1e0\" (UID: \"2cb8c983acca0c27a191b3f720d4b1e0\") " Dec 05 12:51:18.449586 master-0 kubenswrapper[29936]: I1205 12:51:18.449492 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "2cb8c983acca0c27a191b3f720d4b1e0" (UID: "2cb8c983acca0c27a191b3f720d4b1e0"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:51:18.449969 master-0 kubenswrapper[29936]: I1205 12:51:18.449935 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "2cb8c983acca0c27a191b3f720d4b1e0" (UID: "2cb8c983acca0c27a191b3f720d4b1e0"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:51:18.450207 master-0 kubenswrapper[29936]: I1205 12:51:18.450133 29936 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:18.450207 master-0 kubenswrapper[29936]: I1205 12:51:18.450157 29936 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2cb8c983acca0c27a191b3f720d4b1e0-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:18.809850 master-0 kubenswrapper[29936]: I1205 12:51:18.809803 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5644fcf74f-5cvqd"] Dec 05 12:51:18.818398 master-0 kubenswrapper[29936]: W1205 12:51:18.818103 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48a4c135_21f4_49e0_9680_61b449c733ec.slice/crio-a94b1008a93ce0a3902cd7fe535748f2cad2a17b9d70f3e97b033cb739257f33 WatchSource:0}: Error finding container a94b1008a93ce0a3902cd7fe535748f2cad2a17b9d70f3e97b033cb739257f33: Status 404 returned error can't find the container with id a94b1008a93ce0a3902cd7fe535748f2cad2a17b9d70f3e97b033cb739257f33 Dec 05 12:51:18.977592 master-0 kubenswrapper[29936]: I1205 12:51:18.977504 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-744fbd497d-tm2v4"] Dec 05 12:51:19.000116 master-0 kubenswrapper[29936]: I1205 12:51:19.000081 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_2cb8c983acca0c27a191b3f720d4b1e0/kube-scheduler-cert-syncer/0.log" Dec 05 12:51:19.001119 master-0 kubenswrapper[29936]: I1205 12:51:19.001085 29936 generic.go:334] "Generic (PLEG): container finished" podID="2cb8c983acca0c27a191b3f720d4b1e0" containerID="2c505d1745e5c41c810aeede53577e7297a75c5a2221af8e371f406e5004dcbf" exitCode=0 Dec 05 12:51:19.001119 master-0 kubenswrapper[29936]: I1205 12:51:19.001118 29936 generic.go:334] "Generic (PLEG): container finished" podID="2cb8c983acca0c27a191b3f720d4b1e0" containerID="ba110a7b76ad288df7047b8cf5908c2bd3487d9f6a715466f139c0f2eb3f27da" exitCode=2 Dec 05 12:51:19.001358 master-0 kubenswrapper[29936]: I1205 12:51:19.001132 29936 generic.go:334] "Generic (PLEG): container finished" podID="2cb8c983acca0c27a191b3f720d4b1e0" containerID="b24c1b8d78045ff86297a6b78ba71b900f89c5e046061babf21a495bd9bf95d3" exitCode=0 Dec 05 12:51:19.001358 master-0 kubenswrapper[29936]: I1205 12:51:19.001218 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:51:19.001358 master-0 kubenswrapper[29936]: I1205 12:51:19.001246 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0475df4d5336da05f2cdbc3f74e49ad376be174c9b01bb8c74b713bd60e7ac6" Dec 05 12:51:19.004669 master-0 kubenswrapper[29936]: I1205 12:51:19.004492 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" event={"ID":"48a4c135-21f4-49e0-9680-61b449c733ec","Type":"ContainerStarted","Data":"a94b1008a93ce0a3902cd7fe535748f2cad2a17b9d70f3e97b033cb739257f33"} Dec 05 12:51:19.008427 master-0 kubenswrapper[29936]: I1205 12:51:19.005624 29936 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="2cb8c983acca0c27a191b3f720d4b1e0" podUID="fc0ed8180ac3c77ddb293604fb163978" Dec 05 12:51:19.008427 master-0 kubenswrapper[29936]: I1205 12:51:19.006117 29936 generic.go:334] "Generic (PLEG): container finished" podID="e66ebd90-1d8c-47ff-b569-1831bfc110ce" containerID="76cecff7fa7ab1620c5fea3791fa06e91fd4391f443781b4e1e7f5b5e4f3ee7e" exitCode=0 Dec 05 12:51:19.008427 master-0 kubenswrapper[29936]: I1205 12:51:19.006191 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"e66ebd90-1d8c-47ff-b569-1831bfc110ce","Type":"ContainerDied","Data":"76cecff7fa7ab1620c5fea3791fa06e91fd4391f443781b4e1e7f5b5e4f3ee7e"} Dec 05 12:51:19.008427 master-0 kubenswrapper[29936]: I1205 12:51:19.007430 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" event={"ID":"59a90548-aa32-4fdf-bb2d-0a8860f1661a","Type":"ContainerStarted","Data":"ff0610bd20013479c0b7ff39692c93814588ad4d3728a21714b342cd6aa964cc"} Dec 05 12:51:19.044918 master-0 kubenswrapper[29936]: I1205 12:51:19.043865 29936 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="2cb8c983acca0c27a191b3f720d4b1e0" podUID="fc0ed8180ac3c77ddb293604fb163978" Dec 05 12:51:19.203044 master-0 kubenswrapper[29936]: I1205 12:51:19.202984 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cb8c983acca0c27a191b3f720d4b1e0" path="/var/lib/kubelet/pods/2cb8c983acca0c27a191b3f720d4b1e0/volumes" Dec 05 12:51:20.017391 master-0 kubenswrapper[29936]: I1205 12:51:20.017324 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" event={"ID":"48a4c135-21f4-49e0-9680-61b449c733ec","Type":"ContainerStarted","Data":"8e5543e8854bceb37f561ffa2316322b8bd244828d501e6ebb76a69c16541618"} Dec 05 12:51:20.018390 master-0 kubenswrapper[29936]: I1205 12:51:20.017545 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:20.024299 master-0 kubenswrapper[29936]: I1205 12:51:20.024255 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" Dec 05 12:51:20.413538 master-0 kubenswrapper[29936]: I1205 12:51:20.413482 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Dec 05 12:51:20.485723 master-0 kubenswrapper[29936]: I1205 12:51:20.485677 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e66ebd90-1d8c-47ff-b569-1831bfc110ce-kubelet-dir\") pod \"e66ebd90-1d8c-47ff-b569-1831bfc110ce\" (UID: \"e66ebd90-1d8c-47ff-b569-1831bfc110ce\") " Dec 05 12:51:20.486018 master-0 kubenswrapper[29936]: I1205 12:51:20.486000 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e66ebd90-1d8c-47ff-b569-1831bfc110ce-kube-api-access\") pod \"e66ebd90-1d8c-47ff-b569-1831bfc110ce\" (UID: \"e66ebd90-1d8c-47ff-b569-1831bfc110ce\") " Dec 05 12:51:20.486105 master-0 kubenswrapper[29936]: I1205 12:51:20.485863 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e66ebd90-1d8c-47ff-b569-1831bfc110ce-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e66ebd90-1d8c-47ff-b569-1831bfc110ce" (UID: "e66ebd90-1d8c-47ff-b569-1831bfc110ce"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:51:20.486301 master-0 kubenswrapper[29936]: I1205 12:51:20.486282 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e66ebd90-1d8c-47ff-b569-1831bfc110ce-var-lock\") pod \"e66ebd90-1d8c-47ff-b569-1831bfc110ce\" (UID: \"e66ebd90-1d8c-47ff-b569-1831bfc110ce\") " Dec 05 12:51:20.486411 master-0 kubenswrapper[29936]: I1205 12:51:20.486381 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e66ebd90-1d8c-47ff-b569-1831bfc110ce-var-lock" (OuterVolumeSpecName: "var-lock") pod "e66ebd90-1d8c-47ff-b569-1831bfc110ce" (UID: "e66ebd90-1d8c-47ff-b569-1831bfc110ce"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:51:20.486847 master-0 kubenswrapper[29936]: I1205 12:51:20.486829 29936 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e66ebd90-1d8c-47ff-b569-1831bfc110ce-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:20.486923 master-0 kubenswrapper[29936]: I1205 12:51:20.486912 29936 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e66ebd90-1d8c-47ff-b569-1831bfc110ce-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:20.489961 master-0 kubenswrapper[29936]: I1205 12:51:20.489913 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e66ebd90-1d8c-47ff-b569-1831bfc110ce-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e66ebd90-1d8c-47ff-b569-1831bfc110ce" (UID: "e66ebd90-1d8c-47ff-b569-1831bfc110ce"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:51:20.588795 master-0 kubenswrapper[29936]: I1205 12:51:20.588729 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e66ebd90-1d8c-47ff-b569-1831bfc110ce-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:20.636148 master-0 kubenswrapper[29936]: I1205 12:51:20.635975 29936 patch_prober.go:28] interesting pod/console-7495f49968-4tq6k container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" start-of-body= Dec 05 12:51:20.636148 master-0 kubenswrapper[29936]: I1205 12:51:20.636043 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7495f49968-4tq6k" podUID="3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" Dec 05 12:51:21.039612 master-0 kubenswrapper[29936]: I1205 12:51:21.039511 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-6-master-0" event={"ID":"e66ebd90-1d8c-47ff-b569-1831bfc110ce","Type":"ContainerDied","Data":"a47445f5142e106a08eab5a6ce3670a2fd9727c3d75649eed68d941e1c1ecf9c"} Dec 05 12:51:21.039612 master-0 kubenswrapper[29936]: I1205 12:51:21.039587 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-6-master-0" Dec 05 12:51:21.039612 master-0 kubenswrapper[29936]: I1205 12:51:21.039616 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a47445f5142e106a08eab5a6ce3670a2fd9727c3d75649eed68d941e1c1ecf9c" Dec 05 12:51:21.042460 master-0 kubenswrapper[29936]: I1205 12:51:21.042391 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" event={"ID":"59a90548-aa32-4fdf-bb2d-0a8860f1661a","Type":"ContainerStarted","Data":"1b836494ea03897c78c61eac147c8603eedcee1b7fc111dfe62ba3851d3e824e"} Dec 05 12:51:21.643527 master-0 kubenswrapper[29936]: I1205 12:51:21.643443 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5644fcf74f-5cvqd" podStartSLOduration=13.643416703 podStartE2EDuration="13.643416703s" podCreationTimestamp="2025-12-05 12:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:51:21.603556566 +0000 UTC m=+78.735636257" watchObservedRunningTime="2025-12-05 12:51:21.643416703 +0000 UTC m=+78.775496404" Dec 05 12:51:21.735105 master-0 kubenswrapper[29936]: I1205 12:51:21.735042 29936 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 05 12:51:21.735459 master-0 kubenswrapper[29936]: I1205 12:51:21.735397 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="kube-controller-manager" containerID="cri-o://91dbe5959251acff62db45931eb5a5e1e4e7af9bb363ef308eee803d4237a389" gracePeriod=30 Dec 05 12:51:21.736204 master-0 kubenswrapper[29936]: I1205 12:51:21.736035 29936 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://0a16bc5dbf4947d3592d7a160d069d5ae407c8eecca6478799c03089401c073c" gracePeriod=30 Dec 05 12:51:21.736204 master-0 kubenswrapper[29936]: I1205 12:51:21.736083 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="cluster-policy-controller" containerID="cri-o://8d14f1413c8e8a2ef6cd9ab523725814ba9ff7a6021dd1c6a68ef759cfabfdf3" gracePeriod=30 Dec 05 12:51:21.736388 master-0 kubenswrapper[29936]: I1205 12:51:21.736312 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://1b3283d0fac22ca78f337b1d5e3afe8d01431a02a7bb6f2fb90c61b14196aefb" gracePeriod=30 Dec 05 12:51:21.737214 master-0 kubenswrapper[29936]: I1205 12:51:21.737156 29936 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 05 12:51:21.737667 master-0 kubenswrapper[29936]: E1205 12:51:21.737625 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e66ebd90-1d8c-47ff-b569-1831bfc110ce" containerName="installer" Dec 05 12:51:21.737667 master-0 kubenswrapper[29936]: I1205 12:51:21.737655 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66ebd90-1d8c-47ff-b569-1831bfc110ce" containerName="installer" Dec 05 12:51:21.738079 master-0 kubenswrapper[29936]: E1205 12:51:21.737675 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="kube-controller-manager" Dec 05 12:51:21.738079 master-0 kubenswrapper[29936]: I1205 12:51:21.737684 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="kube-controller-manager" Dec 05 12:51:21.738079 master-0 kubenswrapper[29936]: E1205 12:51:21.737717 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="cluster-policy-controller" Dec 05 12:51:21.738079 master-0 kubenswrapper[29936]: I1205 12:51:21.737913 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="cluster-policy-controller" Dec 05 12:51:21.738079 master-0 kubenswrapper[29936]: E1205 12:51:21.737929 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="kube-controller-manager-recovery-controller" Dec 05 12:51:21.738079 master-0 kubenswrapper[29936]: I1205 12:51:21.737937 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="kube-controller-manager-recovery-controller" Dec 05 12:51:21.738079 master-0 kubenswrapper[29936]: E1205 12:51:21.737947 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="kube-controller-manager-cert-syncer" Dec 05 12:51:21.738079 master-0 kubenswrapper[29936]: I1205 12:51:21.737955 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="kube-controller-manager-cert-syncer" Dec 05 
12:51:21.738489 master-0 kubenswrapper[29936]: I1205 12:51:21.738142 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="e66ebd90-1d8c-47ff-b569-1831bfc110ce" containerName="installer" Dec 05 12:51:21.738489 master-0 kubenswrapper[29936]: I1205 12:51:21.738159 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="kube-controller-manager-cert-syncer" Dec 05 12:51:21.738489 master-0 kubenswrapper[29936]: I1205 12:51:21.738170 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="kube-controller-manager" Dec 05 12:51:21.738489 master-0 kubenswrapper[29936]: I1205 12:51:21.738202 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="cluster-policy-controller" Dec 05 12:51:21.738489 master-0 kubenswrapper[29936]: I1205 12:51:21.738213 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab1992e269496bc39c1df6084e6e60fd" containerName="kube-controller-manager-recovery-controller" Dec 05 12:51:21.814843 master-0 kubenswrapper[29936]: I1205 12:51:21.814710 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/610dc2015b38bc32879d55a7d39b2587-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"610dc2015b38bc32879d55a7d39b2587\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:51:21.814843 master-0 kubenswrapper[29936]: I1205 12:51:21.814851 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/610dc2015b38bc32879d55a7d39b2587-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"610dc2015b38bc32879d55a7d39b2587\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:51:21.915973 master-0 kubenswrapper[29936]: I1205 12:51:21.915843 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/610dc2015b38bc32879d55a7d39b2587-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"610dc2015b38bc32879d55a7d39b2587\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:51:21.916086 master-0 kubenswrapper[29936]: I1205 12:51:21.915992 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/610dc2015b38bc32879d55a7d39b2587-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"610dc2015b38bc32879d55a7d39b2587\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:51:21.916170 master-0 kubenswrapper[29936]: I1205 12:51:21.916143 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/610dc2015b38bc32879d55a7d39b2587-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"610dc2015b38bc32879d55a7d39b2587\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:51:21.916872 master-0 kubenswrapper[29936]: I1205 12:51:21.916778 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/610dc2015b38bc32879d55a7d39b2587-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: 
\"610dc2015b38bc32879d55a7d39b2587\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:51:21.964148 master-0 kubenswrapper[29936]: I1205 12:51:21.964077 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_ab1992e269496bc39c1df6084e6e60fd/kube-controller-manager-cert-syncer/0.log" Dec 05 12:51:21.965548 master-0 kubenswrapper[29936]: I1205 12:51:21.965338 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:51:21.969689 master-0 kubenswrapper[29936]: I1205 12:51:21.969638 29936 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="ab1992e269496bc39c1df6084e6e60fd" podUID="610dc2015b38bc32879d55a7d39b2587" Dec 05 12:51:22.017742 master-0 kubenswrapper[29936]: I1205 12:51:22.017553 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-resource-dir\") pod \"ab1992e269496bc39c1df6084e6e60fd\" (UID: \"ab1992e269496bc39c1df6084e6e60fd\") " Dec 05 12:51:22.017742 master-0 kubenswrapper[29936]: I1205 12:51:22.017675 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-cert-dir\") pod \"ab1992e269496bc39c1df6084e6e60fd\" (UID: \"ab1992e269496bc39c1df6084e6e60fd\") " Dec 05 12:51:22.017742 master-0 kubenswrapper[29936]: I1205 12:51:22.017759 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "ab1992e269496bc39c1df6084e6e60fd" (UID: "ab1992e269496bc39c1df6084e6e60fd"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:51:22.018486 master-0 kubenswrapper[29936]: I1205 12:51:22.017783 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "ab1992e269496bc39c1df6084e6e60fd" (UID: "ab1992e269496bc39c1df6084e6e60fd"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:51:22.018486 master-0 kubenswrapper[29936]: I1205 12:51:22.017986 29936 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:22.018486 master-0 kubenswrapper[29936]: I1205 12:51:22.017998 29936 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ab1992e269496bc39c1df6084e6e60fd-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:22.056282 master-0 kubenswrapper[29936]: I1205 12:51:22.056140 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_ab1992e269496bc39c1df6084e6e60fd/kube-controller-manager-cert-syncer/0.log" Dec 05 12:51:22.057593 master-0 kubenswrapper[29936]: I1205 12:51:22.057540 29936 generic.go:334] "Generic (PLEG): container finished" podID="ab1992e269496bc39c1df6084e6e60fd" containerID="0a16bc5dbf4947d3592d7a160d069d5ae407c8eecca6478799c03089401c073c" exitCode=0 Dec 05 12:51:22.057593 master-0 kubenswrapper[29936]: I1205 12:51:22.057577 29936 generic.go:334] "Generic (PLEG): container finished" podID="ab1992e269496bc39c1df6084e6e60fd" containerID="1b3283d0fac22ca78f337b1d5e3afe8d01431a02a7bb6f2fb90c61b14196aefb" exitCode=2 Dec 05 12:51:22.057593 master-0 kubenswrapper[29936]: I1205 12:51:22.057588 29936 generic.go:334] "Generic (PLEG): container finished" podID="ab1992e269496bc39c1df6084e6e60fd" containerID="8d14f1413c8e8a2ef6cd9ab523725814ba9ff7a6021dd1c6a68ef759cfabfdf3" exitCode=0 Dec 05 12:51:22.057593 master-0 kubenswrapper[29936]: I1205 12:51:22.057600 29936 generic.go:334] "Generic (PLEG): container finished" podID="ab1992e269496bc39c1df6084e6e60fd" containerID="91dbe5959251acff62db45931eb5a5e1e4e7af9bb363ef308eee803d4237a389" exitCode=0 Dec 05 12:51:22.057876 master-0 kubenswrapper[29936]: I1205 12:51:22.057680 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78eb0a378ee87ec426723278f27c3f8944db139eff4ee08e81e705d48c517d58" Dec 05 12:51:22.057876 master-0 kubenswrapper[29936]: I1205 12:51:22.057699 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:51:22.059460 master-0 kubenswrapper[29936]: I1205 12:51:22.059410 29936 generic.go:334] "Generic (PLEG): container finished" podID="9eb9db93-5e90-400b-8b54-e5cea89daabf" containerID="4093f3e755952787ec8d524ca81827afeecb189d619370b6131751e043918683" exitCode=0 Dec 05 12:51:22.059630 master-0 kubenswrapper[29936]: I1205 12:51:22.059538 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"9eb9db93-5e90-400b-8b54-e5cea89daabf","Type":"ContainerDied","Data":"4093f3e755952787ec8d524ca81827afeecb189d619370b6131751e043918683"} Dec 05 12:51:22.059869 master-0 kubenswrapper[29936]: I1205 12:51:22.059818 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:22.061758 master-0 kubenswrapper[29936]: I1205 12:51:22.061680 29936 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="ab1992e269496bc39c1df6084e6e60fd" podUID="610dc2015b38bc32879d55a7d39b2587" Dec 05 12:51:22.067032 master-0 kubenswrapper[29936]: I1205 12:51:22.066957 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" Dec 05 12:51:22.089898 master-0 kubenswrapper[29936]: I1205 12:51:22.089774 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-744fbd497d-tm2v4" podStartSLOduration=19.089745236 podStartE2EDuration="19.089745236s" podCreationTimestamp="2025-12-05 12:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:51:22.085490007 +0000 UTC m=+79.217569708" watchObservedRunningTime="2025-12-05 12:51:22.089745236 +0000 UTC m=+79.221824927" Dec 05 12:51:22.134568 master-0 kubenswrapper[29936]: I1205 12:51:22.134467 29936 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="ab1992e269496bc39c1df6084e6e60fd" podUID="610dc2015b38bc32879d55a7d39b2587" Dec 05 12:51:23.195684 master-0 kubenswrapper[29936]: I1205 12:51:23.195467 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab1992e269496bc39c1df6084e6e60fd" path="/var/lib/kubelet/pods/ab1992e269496bc39c1df6084e6e60fd/volumes" Dec 05 12:51:23.423679 master-0 kubenswrapper[29936]: I1205 12:51:23.423572 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Dec 05 12:51:23.542827 master-0 kubenswrapper[29936]: I1205 12:51:23.542748 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9eb9db93-5e90-400b-8b54-e5cea89daabf-kubelet-dir\") pod \"9eb9db93-5e90-400b-8b54-e5cea89daabf\" (UID: \"9eb9db93-5e90-400b-8b54-e5cea89daabf\") " Dec 05 12:51:23.542827 master-0 kubenswrapper[29936]: I1205 12:51:23.542833 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9eb9db93-5e90-400b-8b54-e5cea89daabf-kube-api-access\") pod \"9eb9db93-5e90-400b-8b54-e5cea89daabf\" (UID: \"9eb9db93-5e90-400b-8b54-e5cea89daabf\") " Dec 05 12:51:23.543241 master-0 kubenswrapper[29936]: I1205 12:51:23.542897 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9eb9db93-5e90-400b-8b54-e5cea89daabf-var-lock\") pod \"9eb9db93-5e90-400b-8b54-e5cea89daabf\" (UID: \"9eb9db93-5e90-400b-8b54-e5cea89daabf\") " Dec 05 12:51:23.543241 master-0 kubenswrapper[29936]: I1205 12:51:23.542953 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9eb9db93-5e90-400b-8b54-e5cea89daabf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9eb9db93-5e90-400b-8b54-e5cea89daabf" (UID: "9eb9db93-5e90-400b-8b54-e5cea89daabf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:51:23.543241 master-0 kubenswrapper[29936]: I1205 12:51:23.543100 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9eb9db93-5e90-400b-8b54-e5cea89daabf-var-lock" (OuterVolumeSpecName: "var-lock") pod "9eb9db93-5e90-400b-8b54-e5cea89daabf" (UID: "9eb9db93-5e90-400b-8b54-e5cea89daabf"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:51:23.543504 master-0 kubenswrapper[29936]: I1205 12:51:23.543466 29936 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9eb9db93-5e90-400b-8b54-e5cea89daabf-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:23.543504 master-0 kubenswrapper[29936]: I1205 12:51:23.543497 29936 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9eb9db93-5e90-400b-8b54-e5cea89daabf-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:23.547264 master-0 kubenswrapper[29936]: I1205 12:51:23.547147 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eb9db93-5e90-400b-8b54-e5cea89daabf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9eb9db93-5e90-400b-8b54-e5cea89daabf" (UID: "9eb9db93-5e90-400b-8b54-e5cea89daabf"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:51:23.645314 master-0 kubenswrapper[29936]: I1205 12:51:23.645098 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9eb9db93-5e90-400b-8b54-e5cea89daabf-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:23.671072 master-0 kubenswrapper[29936]: I1205 12:51:23.668419 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 05 12:51:23.671072 master-0 kubenswrapper[29936]: I1205 12:51:23.668749 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-4-master-0" podUID="67037a7b-9105-4c7d-80ac-7481c14997f1" containerName="installer" containerID="cri-o://71e23aa83f43ebc2173b1d436a0fc43ce5211e7ed415e69869128fdff37a25f4" gracePeriod=30 Dec 05 12:51:24.076086 master-0 kubenswrapper[29936]: I1205 12:51:24.075979 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"9eb9db93-5e90-400b-8b54-e5cea89daabf","Type":"ContainerDied","Data":"cd379a044d84900fae9526c5ef529568f0eb518ec9cdf8409f4a3d8d094b0a6f"} Dec 05 12:51:24.076086 master-0 kubenswrapper[29936]: I1205 12:51:24.076058 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd379a044d84900fae9526c5ef529568f0eb518ec9cdf8409f4a3d8d094b0a6f" Dec 05 12:51:24.076086 master-0 kubenswrapper[29936]: I1205 12:51:24.076282 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Dec 05 12:51:25.467969 master-0 kubenswrapper[29936]: I1205 12:51:25.467908 29936 patch_prober.go:28] interesting pod/console-74ffd5f75f-slrkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Dec 05 12:51:25.468634 master-0 kubenswrapper[29936]: I1205 12:51:25.467982 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-74ffd5f75f-slrkr" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Dec 05 12:51:27.558437 master-0 kubenswrapper[29936]: I1205 12:51:27.558364 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-69cd4c69bf-mx2xk" Dec 05 12:51:28.022360 master-0 kubenswrapper[29936]: I1205 12:51:28.019730 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Dec 05 12:51:28.022360 master-0 kubenswrapper[29936]: E1205 12:51:28.020108 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eb9db93-5e90-400b-8b54-e5cea89daabf" containerName="installer" Dec 05 12:51:28.022360 master-0 kubenswrapper[29936]: I1205 12:51:28.020124 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb9db93-5e90-400b-8b54-e5cea89daabf" containerName="installer" Dec 05 12:51:28.022360 master-0 kubenswrapper[29936]: I1205 12:51:28.020302 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eb9db93-5e90-400b-8b54-e5cea89daabf" containerName="installer" Dec 05 12:51:28.022360 master-0 kubenswrapper[29936]: I1205 12:51:28.020870 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Dec 05 12:51:28.120525 master-0 kubenswrapper[29936]: I1205 12:51:28.120462 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f117368-9d0a-4c16-8d03-ffc83d250dd1-kube-api-access\") pod \"installer-5-master-0\" (UID: \"6f117368-9d0a-4c16-8d03-ffc83d250dd1\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 05 12:51:28.120935 master-0 kubenswrapper[29936]: I1205 12:51:28.120915 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6f117368-9d0a-4c16-8d03-ffc83d250dd1-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"6f117368-9d0a-4c16-8d03-ffc83d250dd1\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 05 12:51:28.121056 master-0 kubenswrapper[29936]: I1205 12:51:28.121042 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6f117368-9d0a-4c16-8d03-ffc83d250dd1-var-lock\") pod \"installer-5-master-0\" (UID: \"6f117368-9d0a-4c16-8d03-ffc83d250dd1\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 05 12:51:28.222145 master-0 kubenswrapper[29936]: I1205 12:51:28.222019 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6f117368-9d0a-4c16-8d03-ffc83d250dd1-var-lock\") pod \"installer-5-master-0\" (UID: \"6f117368-9d0a-4c16-8d03-ffc83d250dd1\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 05 12:51:28.222557 master-0 kubenswrapper[29936]: I1205 12:51:28.222211 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6f117368-9d0a-4c16-8d03-ffc83d250dd1-var-lock\") pod \"installer-5-master-0\" (UID: \"6f117368-9d0a-4c16-8d03-ffc83d250dd1\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 05 12:51:28.222557 master-0 kubenswrapper[29936]: I1205 12:51:28.222293 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f117368-9d0a-4c16-8d03-ffc83d250dd1-kube-api-access\") pod \"installer-5-master-0\" (UID: \"6f117368-9d0a-4c16-8d03-ffc83d250dd1\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 05 12:51:28.222557 master-0 kubenswrapper[29936]: I1205 12:51:28.222343 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6f117368-9d0a-4c16-8d03-ffc83d250dd1-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"6f117368-9d0a-4c16-8d03-ffc83d250dd1\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 05 12:51:28.222557 master-0 kubenswrapper[29936]: I1205 12:51:28.222491 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6f117368-9d0a-4c16-8d03-ffc83d250dd1-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"6f117368-9d0a-4c16-8d03-ffc83d250dd1\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 05 12:51:29.933603 master-0 kubenswrapper[29936]: I1205 12:51:29.933513 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Dec 05 12:51:30.636584 master-0 kubenswrapper[29936]: I1205 12:51:30.636480 29936 
patch_prober.go:28] interesting pod/console-7495f49968-4tq6k container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" start-of-body= Dec 05 12:51:30.636584 master-0 kubenswrapper[29936]: I1205 12:51:30.636565 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7495f49968-4tq6k" podUID="3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" Dec 05 12:51:31.017707 master-0 kubenswrapper[29936]: I1205 12:51:31.017639 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f117368-9d0a-4c16-8d03-ffc83d250dd1-kube-api-access\") pod \"installer-5-master-0\" (UID: \"6f117368-9d0a-4c16-8d03-ffc83d250dd1\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 05 12:51:31.050153 master-0 kubenswrapper[29936]: I1205 12:51:31.050062 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Dec 05 12:51:32.185912 master-0 kubenswrapper[29936]: I1205 12:51:32.185747 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:51:32.207687 master-0 kubenswrapper[29936]: I1205 12:51:32.207642 29936 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="2d47d791-cd50-425f-9388-e723516d7f56" Dec 05 12:51:32.207840 master-0 kubenswrapper[29936]: I1205 12:51:32.207825 29936 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="2d47d791-cd50-425f-9388-e723516d7f56" Dec 05 12:51:33.881069 master-0 kubenswrapper[29936]: I1205 12:51:33.881015 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 05 12:51:33.887367 master-0 kubenswrapper[29936]: I1205 12:51:33.887312 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Dec 05 12:51:33.890427 master-0 kubenswrapper[29936]: W1205 12:51:33.890359 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6f117368_9d0a_4c16_8d03_ffc83d250dd1.slice/crio-405f2bdfe5ece5ac09307456a46ddc029b968cd1da0b2a398f02c409982ee0a9 WatchSource:0}: Error finding container 405f2bdfe5ece5ac09307456a46ddc029b968cd1da0b2a398f02c409982ee0a9: Status 404 returned error can't find the container with id 405f2bdfe5ece5ac09307456a46ddc029b968cd1da0b2a398f02c409982ee0a9 Dec 05 12:51:34.175684 master-0 kubenswrapper[29936]: I1205 12:51:34.175521 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"6f117368-9d0a-4c16-8d03-ffc83d250dd1","Type":"ContainerStarted","Data":"405f2bdfe5ece5ac09307456a46ddc029b968cd1da0b2a398f02c409982ee0a9"} Dec 05 12:51:34.834038 master-0 kubenswrapper[29936]: I1205 12:51:34.833782 29936 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:51:34.837218 master-0 kubenswrapper[29936]: I1205 12:51:34.837145 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 05 12:51:35.468957 master-0 kubenswrapper[29936]: I1205 12:51:35.468845 29936 patch_prober.go:28] interesting pod/console-74ffd5f75f-slrkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Dec 05 12:51:35.468957 master-0 kubenswrapper[29936]: I1205 12:51:35.468945 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-74ffd5f75f-slrkr" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Dec 05 12:51:36.195349 master-0 kubenswrapper[29936]: I1205 12:51:36.195262 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"6f117368-9d0a-4c16-8d03-ffc83d250dd1","Type":"ContainerStarted","Data":"027a84d65e873850efbb8d9852fd5d9428626e1f41733cf9e7e4f4f2b2122439"} Dec 05 12:51:36.354595 master-0 kubenswrapper[29936]: I1205 12:51:36.353165 29936 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d47d791-cd50-425f-9388-e723516d7f56\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T12:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T12:51:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [wait-for-host-port]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T12:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-scheduler kube-scheduler-cert-syncer kube-scheduler-recovery-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T12:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-scheduler kube-scheduler-cert-syncer 
kube-scheduler-recovery-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f042fa25014f3d37f3ea967d21f361d2a11833ae18f2c750318101b25d2497ce\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f042fa25014f3d37f3ea967d21f361d2a11833ae18f2c750318101b25d2497ce\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5891cdd7dcf7c9081de8b364b4c96446b7f946f7880fbae291a4592a198264\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}}}],\\\"phase\\\":\\\"Pending\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-master-0\": pods \"openshift-kube-scheduler-master-0\" not found" Dec 05 12:51:37.186331 master-0 kubenswrapper[29936]: I1205 12:51:37.186234 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:51:37.204472 master-0 kubenswrapper[29936]: I1205 12:51:37.204406 29936 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="27520db6-bac2-4134-b2cd-dbbcbd6fd89e" Dec 05 12:51:37.204472 master-0 kubenswrapper[29936]: I1205 12:51:37.204460 29936 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="27520db6-bac2-4134-b2cd-dbbcbd6fd89e" Dec 05 12:51:38.137705 master-0 kubenswrapper[29936]: I1205 12:51:38.137648 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:51:38.140691 master-0 kubenswrapper[29936]: I1205 12:51:38.140636 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 05 12:51:38.177736 master-0 kubenswrapper[29936]: W1205 12:51:38.177660 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc0ed8180ac3c77ddb293604fb163978.slice/crio-a066df18d6cfbc5139230178095263d33840eaa268123b3221baacd3f2a6d037 WatchSource:0}: Error finding container a066df18d6cfbc5139230178095263d33840eaa268123b3221baacd3f2a6d037: Status 404 returned error can't find the container with id a066df18d6cfbc5139230178095263d33840eaa268123b3221baacd3f2a6d037 Dec 05 12:51:38.212941 master-0 kubenswrapper[29936]: I1205 12:51:38.212865 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fc0ed8180ac3c77ddb293604fb163978","Type":"ContainerStarted","Data":"a066df18d6cfbc5139230178095263d33840eaa268123b3221baacd3f2a6d037"} Dec 05 12:51:40.638222 master-0 kubenswrapper[29936]: I1205 12:51:40.638091 29936 patch_prober.go:28] interesting pod/console-7495f49968-4tq6k container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" start-of-body= Dec 05 12:51:40.639557 master-0 kubenswrapper[29936]: I1205 12:51:40.638276 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7495f49968-4tq6k" podUID="3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" Dec 05 12:51:40.867987 master-0 kubenswrapper[29936]: I1205 12:51:40.867809 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 05 12:51:40.870003 master-0 kubenswrapper[29936]: I1205 12:51:40.869888 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-5-master-0" podStartSLOduration=13.869873525 podStartE2EDuration="13.869873525s" podCreationTimestamp="2025-12-05 12:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:51:40.862681636 +0000 UTC m=+97.994761407" watchObservedRunningTime="2025-12-05 12:51:40.869873525 +0000 UTC m=+98.001953207" Dec 05 12:51:41.241772 master-0 kubenswrapper[29936]: I1205 12:51:41.241669 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fc0ed8180ac3c77ddb293604fb163978","Type":"ContainerStarted","Data":"5eb236b26c15a302d6e2866ad032e841c14accb084bf0248191328786188b2fa"} Dec 05 12:51:41.806037 master-0 kubenswrapper[29936]: E1205 12:51:41.805930 29936 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc0ed8180ac3c77ddb293604fb163978.slice/crio-conmon-5eb236b26c15a302d6e2866ad032e841c14accb084bf0248191328786188b2fa.scope\": RecentStats: unable to find data in memory cache]" Dec 05 12:51:42.255874 master-0 kubenswrapper[29936]: I1205 12:51:42.255756 29936 
generic.go:334] "Generic (PLEG): container finished" podID="fc0ed8180ac3c77ddb293604fb163978" containerID="5eb236b26c15a302d6e2866ad032e841c14accb084bf0248191328786188b2fa" exitCode=0 Dec 05 12:51:42.256254 master-0 kubenswrapper[29936]: I1205 12:51:42.255853 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fc0ed8180ac3c77ddb293604fb163978","Type":"ContainerDied","Data":"5eb236b26c15a302d6e2866ad032e841c14accb084bf0248191328786188b2fa"} Dec 05 12:51:43.187484 master-0 kubenswrapper[29936]: I1205 12:51:43.186433 29936 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:51:43.202009 master-0 kubenswrapper[29936]: I1205 12:51:43.201941 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 05 12:51:44.274602 master-0 kubenswrapper[29936]: I1205 12:51:44.274419 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fc0ed8180ac3c77ddb293604fb163978","Type":"ContainerStarted","Data":"48b5a956a99949a665a48ce7a05043b10d611b503a515c1e36331afd05c0cbfa"} Dec 05 12:51:45.150778 master-0 kubenswrapper[29936]: I1205 12:51:45.150697 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:51:45.158219 master-0 kubenswrapper[29936]: I1205 12:51:45.157953 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 05 12:51:45.180164 master-0 kubenswrapper[29936]: W1205 12:51:45.180117 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod610dc2015b38bc32879d55a7d39b2587.slice/crio-60d3aaefe0051812ad1090dac5fefb06749299f2a086d05d22b6029b515dfaaa WatchSource:0}: Error finding container 60d3aaefe0051812ad1090dac5fefb06749299f2a086d05d22b6029b515dfaaa: Status 404 returned error can't find the container with id 60d3aaefe0051812ad1090dac5fefb06749299f2a086d05d22b6029b515dfaaa Dec 05 12:51:45.285691 master-0 kubenswrapper[29936]: I1205 12:51:45.285631 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"610dc2015b38bc32879d55a7d39b2587","Type":"ContainerStarted","Data":"60d3aaefe0051812ad1090dac5fefb06749299f2a086d05d22b6029b515dfaaa"} Dec 05 12:51:45.468160 master-0 kubenswrapper[29936]: I1205 12:51:45.468085 29936 patch_prober.go:28] interesting pod/console-74ffd5f75f-slrkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Dec 05 12:51:45.468435 master-0 kubenswrapper[29936]: I1205 12:51:45.468253 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-74ffd5f75f-slrkr" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Dec 05 12:51:46.302415 master-0 kubenswrapper[29936]: I1205 12:51:46.302339 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
event={"ID":"fc0ed8180ac3c77ddb293604fb163978","Type":"ContainerStarted","Data":"8a90315022ef0d596c471f98d9655cc8e30fd254c5e3b9c740693c9883cc199f"} Dec 05 12:51:46.302415 master-0 kubenswrapper[29936]: I1205 12:51:46.302405 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fc0ed8180ac3c77ddb293604fb163978","Type":"ContainerStarted","Data":"a260e471047ac81bed4568a75e9981f7961d90e4df33fa05ffdc0c6fb556eccb"} Dec 05 12:51:46.303867 master-0 kubenswrapper[29936]: I1205 12:51:46.303693 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:51:46.320217 master-0 kubenswrapper[29936]: I1205 12:51:46.320144 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"610dc2015b38bc32879d55a7d39b2587","Type":"ContainerStarted","Data":"9e00fa2595fad4ad014a23b8074a3240a2449d47373074fecd9654f334a13fb7"} Dec 05 12:51:46.320339 master-0 kubenswrapper[29936]: I1205 12:51:46.320225 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"610dc2015b38bc32879d55a7d39b2587","Type":"ContainerStarted","Data":"613fa64c416308b42c0d2958d3f3712126e52a447f52c90eaabf9bb657dccfd4"} Dec 05 12:51:46.320339 master-0 kubenswrapper[29936]: I1205 12:51:46.320235 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"610dc2015b38bc32879d55a7d39b2587","Type":"ContainerStarted","Data":"36ca02e8be7a0b8aad017a2fba35ee2e24e24ec30949f922f7c15439af96ed15"} Dec 05 12:51:46.334205 master-0 kubenswrapper[29936]: I1205 12:51:46.334061 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=10.334034587 podStartE2EDuration="10.334034587s" podCreationTimestamp="2025-12-05 12:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:51:46.331747455 +0000 UTC m=+103.463827126" watchObservedRunningTime="2025-12-05 12:51:46.334034587 +0000 UTC m=+103.466114268" Dec 05 12:51:47.330501 master-0 kubenswrapper[29936]: I1205 12:51:47.330254 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_67037a7b-9105-4c7d-80ac-7481c14997f1/installer/0.log" Dec 05 12:51:47.330501 master-0 kubenswrapper[29936]: I1205 12:51:47.330337 29936 generic.go:334] "Generic (PLEG): container finished" podID="67037a7b-9105-4c7d-80ac-7481c14997f1" containerID="71e23aa83f43ebc2173b1d436a0fc43ce5211e7ed415e69869128fdff37a25f4" exitCode=1 Dec 05 12:51:47.330501 master-0 kubenswrapper[29936]: I1205 12:51:47.330410 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"67037a7b-9105-4c7d-80ac-7481c14997f1","Type":"ContainerDied","Data":"71e23aa83f43ebc2173b1d436a0fc43ce5211e7ed415e69869128fdff37a25f4"} Dec 05 12:51:47.333113 master-0 kubenswrapper[29936]: I1205 12:51:47.333040 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"610dc2015b38bc32879d55a7d39b2587","Type":"ContainerStarted","Data":"46a38d7a2db34bb7213e7c44dd4da9930d6a2962fb71de116eced4ef1aa3810e"} Dec 05 12:51:47.358058 master-0 kubenswrapper[29936]: I1205 12:51:47.357972 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=4.357932089 podStartE2EDuration="4.357932089s" podCreationTimestamp="2025-12-05 12:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:51:47.353280049 +0000 UTC m=+104.485359750" watchObservedRunningTime="2025-12-05 12:51:47.357932089 +0000 UTC m=+104.490011770" Dec 05 12:51:47.627302 master-0 kubenswrapper[29936]: I1205 12:51:47.627242 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_67037a7b-9105-4c7d-80ac-7481c14997f1/installer/0.log" Dec 05 12:51:47.627570 master-0 kubenswrapper[29936]: I1205 12:51:47.627316 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Dec 05 12:51:47.658330 master-0 kubenswrapper[29936]: I1205 12:51:47.656192 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67037a7b-9105-4c7d-80ac-7481c14997f1-kube-api-access\") pod \"67037a7b-9105-4c7d-80ac-7481c14997f1\" (UID: \"67037a7b-9105-4c7d-80ac-7481c14997f1\") " Dec 05 12:51:47.658330 master-0 kubenswrapper[29936]: I1205 12:51:47.656331 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/67037a7b-9105-4c7d-80ac-7481c14997f1-var-lock\") pod \"67037a7b-9105-4c7d-80ac-7481c14997f1\" (UID: \"67037a7b-9105-4c7d-80ac-7481c14997f1\") " Dec 05 12:51:47.658330 master-0 kubenswrapper[29936]: I1205 12:51:47.656436 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67037a7b-9105-4c7d-80ac-7481c14997f1-var-lock" (OuterVolumeSpecName: "var-lock") pod "67037a7b-9105-4c7d-80ac-7481c14997f1" (UID: "67037a7b-9105-4c7d-80ac-7481c14997f1"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:51:47.658330 master-0 kubenswrapper[29936]: I1205 12:51:47.656506 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67037a7b-9105-4c7d-80ac-7481c14997f1-kubelet-dir\") pod \"67037a7b-9105-4c7d-80ac-7481c14997f1\" (UID: \"67037a7b-9105-4c7d-80ac-7481c14997f1\") " Dec 05 12:51:47.658330 master-0 kubenswrapper[29936]: I1205 12:51:47.656586 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67037a7b-9105-4c7d-80ac-7481c14997f1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "67037a7b-9105-4c7d-80ac-7481c14997f1" (UID: "67037a7b-9105-4c7d-80ac-7481c14997f1"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:51:47.658330 master-0 kubenswrapper[29936]: I1205 12:51:47.657038 29936 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/67037a7b-9105-4c7d-80ac-7481c14997f1-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:47.658330 master-0 kubenswrapper[29936]: I1205 12:51:47.657055 29936 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/67037a7b-9105-4c7d-80ac-7481c14997f1-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:47.660024 master-0 kubenswrapper[29936]: I1205 12:51:47.659942 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67037a7b-9105-4c7d-80ac-7481c14997f1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "67037a7b-9105-4c7d-80ac-7481c14997f1" (UID: "67037a7b-9105-4c7d-80ac-7481c14997f1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:51:47.759332 master-0 kubenswrapper[29936]: I1205 12:51:47.759239 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/67037a7b-9105-4c7d-80ac-7481c14997f1-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:51:48.342667 master-0 kubenswrapper[29936]: I1205 12:51:48.342016 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_67037a7b-9105-4c7d-80ac-7481c14997f1/installer/0.log" Dec 05 12:51:48.342667 master-0 kubenswrapper[29936]: I1205 12:51:48.342145 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"67037a7b-9105-4c7d-80ac-7481c14997f1","Type":"ContainerDied","Data":"31fee9d510e321648476c88902076cbeab7e810076c5e3c4e3018d9474272893"} Dec 05 12:51:48.342667 master-0 kubenswrapper[29936]: I1205 12:51:48.342225 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Dec 05 12:51:48.342667 master-0 kubenswrapper[29936]: I1205 12:51:48.342251 29936 scope.go:117] "RemoveContainer" containerID="71e23aa83f43ebc2173b1d436a0fc43ce5211e7ed415e69869128fdff37a25f4" Dec 05 12:51:48.375972 master-0 kubenswrapper[29936]: I1205 12:51:48.375884 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 05 12:51:48.388432 master-0 kubenswrapper[29936]: I1205 12:51:48.388362 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 05 12:51:49.202078 master-0 kubenswrapper[29936]: I1205 12:51:49.202002 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67037a7b-9105-4c7d-80ac-7481c14997f1" path="/var/lib/kubelet/pods/67037a7b-9105-4c7d-80ac-7481c14997f1/volumes" Dec 05 12:51:50.636083 master-0 kubenswrapper[29936]: I1205 12:51:50.636030 29936 patch_prober.go:28] interesting pod/console-7495f49968-4tq6k container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" start-of-body= Dec 05 12:51:50.636925 master-0 kubenswrapper[29936]: I1205 12:51:50.636702 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7495f49968-4tq6k" podUID="3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" Dec 05 12:51:55.151540 master-0 kubenswrapper[29936]: I1205 12:51:55.151449 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:51:55.151540 master-0 kubenswrapper[29936]: I1205 12:51:55.151533 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:51:55.151540 master-0 kubenswrapper[29936]: I1205 12:51:55.151548 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:51:55.153092 master-0 kubenswrapper[29936]: I1205 12:51:55.153035 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:51:55.153255 master-0 kubenswrapper[29936]: I1205 12:51:55.153173 29936 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Dec 05 12:51:55.153516 master-0 kubenswrapper[29936]: I1205 12:51:55.153287 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 05 12:51:55.157919 master-0 kubenswrapper[29936]: I1205 12:51:55.157879 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 
12:51:55.467588 master-0 kubenswrapper[29936]: I1205 12:51:55.467497 29936 patch_prober.go:28] interesting pod/console-74ffd5f75f-slrkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Dec 05 12:51:55.467588 master-0 kubenswrapper[29936]: I1205 12:51:55.467575 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-74ffd5f75f-slrkr" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Dec 05 12:51:56.421683 master-0 kubenswrapper[29936]: I1205 12:51:56.421572 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:52:00.636610 master-0 kubenswrapper[29936]: I1205 12:52:00.636481 29936 patch_prober.go:28] interesting pod/console-7495f49968-4tq6k container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" start-of-body= Dec 05 12:52:00.636610 master-0 kubenswrapper[29936]: I1205 12:52:00.636596 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7495f49968-4tq6k" podUID="3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" Dec 05 12:52:05.160962 master-0 kubenswrapper[29936]: I1205 12:52:05.160832 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:52:05.168877 master-0 kubenswrapper[29936]: I1205 12:52:05.168800 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:52:05.467628 master-0 kubenswrapper[29936]: I1205 12:52:05.467513 29936 patch_prober.go:28] interesting pod/console-74ffd5f75f-slrkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Dec 05 12:52:05.467628 master-0 kubenswrapper[29936]: I1205 12:52:05.467642 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-74ffd5f75f-slrkr" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Dec 05 12:52:10.636702 master-0 kubenswrapper[29936]: I1205 12:52:10.636607 29936 patch_prober.go:28] interesting pod/console-7495f49968-4tq6k container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" start-of-body= Dec 05 12:52:10.637523 master-0 kubenswrapper[29936]: I1205 12:52:10.636705 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7495f49968-4tq6k" podUID="3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" Dec 05 
12:52:15.468348 master-0 kubenswrapper[29936]: I1205 12:52:15.468260 29936 patch_prober.go:28] interesting pod/console-74ffd5f75f-slrkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Dec 05 12:52:15.469046 master-0 kubenswrapper[29936]: I1205 12:52:15.468356 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-74ffd5f75f-slrkr" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Dec 05 12:52:20.636519 master-0 kubenswrapper[29936]: I1205 12:52:20.636418 29936 patch_prober.go:28] interesting pod/console-7495f49968-4tq6k container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" start-of-body= Dec 05 12:52:20.637213 master-0 kubenswrapper[29936]: I1205 12:52:20.636581 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7495f49968-4tq6k" podUID="3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" containerName="console" probeResult="failure" output="Get \"https://10.128.0.92:8443/health\": dial tcp 10.128.0.92:8443: connect: connection refused" Dec 05 12:52:24.886315 master-0 kubenswrapper[29936]: I1205 12:52:24.885152 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7495f49968-4tq6k"] Dec 05 12:52:24.905236 master-0 kubenswrapper[29936]: I1205 12:52:24.904798 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-7d45bf9455-6hzvv"] Dec 05 12:52:24.905236 master-0 kubenswrapper[29936]: E1205 12:52:24.905081 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67037a7b-9105-4c7d-80ac-7481c14997f1" containerName="installer" Dec 05 12:52:24.905236 master-0 kubenswrapper[29936]: I1205 12:52:24.905135 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="67037a7b-9105-4c7d-80ac-7481c14997f1" containerName="installer" Dec 05 12:52:24.905637 master-0 kubenswrapper[29936]: I1205 12:52:24.905601 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="67037a7b-9105-4c7d-80ac-7481c14997f1" containerName="installer" Dec 05 12:52:24.910247 master-0 kubenswrapper[29936]: I1205 12:52:24.910149 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-7d45bf9455-6hzvv" Dec 05 12:52:24.924296 master-0 kubenswrapper[29936]: I1205 12:52:24.923706 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 12:52:24.924296 master-0 kubenswrapper[29936]: I1205 12:52:24.924100 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 12:52:24.929227 master-0 kubenswrapper[29936]: I1205 12:52:24.928170 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7d45bf9455-6hzvv"] Dec 05 12:52:24.998858 master-0 kubenswrapper[29936]: I1205 12:52:24.998787 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-c9b8d8fb9-7pxzk"] Dec 05 12:52:25.007395 master-0 kubenswrapper[29936]: I1205 12:52:25.007305 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.077429 master-0 kubenswrapper[29936]: I1205 12:52:25.077347 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/84d4e3f1-757e-45e3-acdb-6c5689b4c094-nginx-conf\") pod \"networking-console-plugin-7d45bf9455-6hzvv\" (UID: \"84d4e3f1-757e-45e3-acdb-6c5689b4c094\") " pod="openshift-network-console/networking-console-plugin-7d45bf9455-6hzvv" Dec 05 12:52:25.077707 master-0 kubenswrapper[29936]: I1205 12:52:25.077431 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-serving-cert\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.077707 master-0 kubenswrapper[29936]: I1205 12:52:25.077482 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9wrr\" (UniqueName: \"kubernetes.io/projected/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-kube-api-access-t9wrr\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.077707 master-0 kubenswrapper[29936]: I1205 12:52:25.077564 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-service-ca\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.077707 master-0 kubenswrapper[29936]: I1205 12:52:25.077665 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-config\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.077842 master-0 kubenswrapper[29936]: I1205 12:52:25.077768 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/84d4e3f1-757e-45e3-acdb-6c5689b4c094-networking-console-plugin-cert\") pod \"networking-console-plugin-7d45bf9455-6hzvv\" (UID: \"84d4e3f1-757e-45e3-acdb-6c5689b4c094\") " pod="openshift-network-console/networking-console-plugin-7d45bf9455-6hzvv" Dec 05 12:52:25.077842 master-0 kubenswrapper[29936]: I1205 12:52:25.077829 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-oauth-serving-cert\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.078256 master-0 kubenswrapper[29936]: I1205 12:52:25.077941 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-trusted-ca-bundle\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.078256 master-0 kubenswrapper[29936]: I1205 12:52:25.078052 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-oauth-config\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.181295 master-0 kubenswrapper[29936]: I1205 12:52:25.180064 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/84d4e3f1-757e-45e3-acdb-6c5689b4c094-networking-console-plugin-cert\") pod \"networking-console-plugin-7d45bf9455-6hzvv\" (UID: \"84d4e3f1-757e-45e3-acdb-6c5689b4c094\") " pod="openshift-network-console/networking-console-plugin-7d45bf9455-6hzvv" Dec 05 12:52:25.181295 master-0 kubenswrapper[29936]: I1205 12:52:25.180188 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-oauth-serving-cert\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.181295 master-0 kubenswrapper[29936]: I1205 12:52:25.180226 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-trusted-ca-bundle\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.181295 master-0 kubenswrapper[29936]: I1205 12:52:25.180252 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-oauth-config\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.181295 master-0 kubenswrapper[29936]: I1205 12:52:25.180306 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c9b8d8fb9-7pxzk"] Dec 05 12:52:25.186196 master-0 kubenswrapper[29936]: I1205 12:52:25.182226 29936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/84d4e3f1-757e-45e3-acdb-6c5689b4c094-nginx-conf\") pod \"networking-console-plugin-7d45bf9455-6hzvv\" (UID: \"84d4e3f1-757e-45e3-acdb-6c5689b4c094\") " pod="openshift-network-console/networking-console-plugin-7d45bf9455-6hzvv" Dec 05 12:52:25.186196 master-0 kubenswrapper[29936]: I1205 12:52:25.182273 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-serving-cert\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.186196 master-0 kubenswrapper[29936]: I1205 12:52:25.182309 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9wrr\" (UniqueName: \"kubernetes.io/projected/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-kube-api-access-t9wrr\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.186196 master-0 kubenswrapper[29936]: I1205 12:52:25.182360 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-service-ca\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.186196 master-0 kubenswrapper[29936]: I1205 12:52:25.182422 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-config\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.186196 master-0 kubenswrapper[29936]: I1205 12:52:25.183618 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-config\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.186196 master-0 kubenswrapper[29936]: I1205 12:52:25.184367 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-service-ca\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.186196 master-0 kubenswrapper[29936]: I1205 12:52:25.184912 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-oauth-serving-cert\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.186196 master-0 kubenswrapper[29936]: I1205 12:52:25.185015 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/84d4e3f1-757e-45e3-acdb-6c5689b4c094-nginx-conf\") pod \"networking-console-plugin-7d45bf9455-6hzvv\" (UID: \"84d4e3f1-757e-45e3-acdb-6c5689b4c094\") " 
pod="openshift-network-console/networking-console-plugin-7d45bf9455-6hzvv" Dec 05 12:52:25.186196 master-0 kubenswrapper[29936]: I1205 12:52:25.185956 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-trusted-ca-bundle\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.186677 master-0 kubenswrapper[29936]: I1205 12:52:25.186210 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/84d4e3f1-757e-45e3-acdb-6c5689b4c094-networking-console-plugin-cert\") pod \"networking-console-plugin-7d45bf9455-6hzvv\" (UID: \"84d4e3f1-757e-45e3-acdb-6c5689b4c094\") " pod="openshift-network-console/networking-console-plugin-7d45bf9455-6hzvv" Dec 05 12:52:25.191203 master-0 kubenswrapper[29936]: I1205 12:52:25.187990 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-oauth-config\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.196203 master-0 kubenswrapper[29936]: I1205 12:52:25.194512 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-serving-cert\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.221551 master-0 kubenswrapper[29936]: I1205 12:52:25.221486 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9wrr\" (UniqueName: \"kubernetes.io/projected/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-kube-api-access-t9wrr\") pod \"console-c9b8d8fb9-7pxzk\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.252520 master-0 kubenswrapper[29936]: I1205 12:52:25.252398 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-7d45bf9455-6hzvv" Dec 05 12:52:25.399863 master-0 kubenswrapper[29936]: I1205 12:52:25.399780 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:25.467504 master-0 kubenswrapper[29936]: I1205 12:52:25.467436 29936 patch_prober.go:28] interesting pod/console-74ffd5f75f-slrkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Dec 05 12:52:25.467760 master-0 kubenswrapper[29936]: I1205 12:52:25.467522 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-74ffd5f75f-slrkr" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Dec 05 12:52:25.710571 master-0 kubenswrapper[29936]: I1205 12:52:25.710493 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7d45bf9455-6hzvv"] Dec 05 12:52:25.714825 master-0 kubenswrapper[29936]: W1205 12:52:25.714753 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84d4e3f1_757e_45e3_acdb_6c5689b4c094.slice/crio-f51bc3c734e4191d4b0a43b39e4b57b66ed1f57c29edcfcc273d17eb87841d26 WatchSource:0}: Error finding container f51bc3c734e4191d4b0a43b39e4b57b66ed1f57c29edcfcc273d17eb87841d26: Status 404 returned error can't find the container with id f51bc3c734e4191d4b0a43b39e4b57b66ed1f57c29edcfcc273d17eb87841d26 Dec 05 12:52:25.813351 master-0 kubenswrapper[29936]: I1205 12:52:25.813241 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c9b8d8fb9-7pxzk"] Dec 05 12:52:25.819458 master-0 kubenswrapper[29936]: W1205 12:52:25.819286 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded3f0eab_f26c_4b16_9d01_a6fd7e4bce73.slice/crio-d415ce7aad1b9f977544ee6fda9f28aacf11572f4cd6fe849747fe5679c38b34 WatchSource:0}: Error finding container d415ce7aad1b9f977544ee6fda9f28aacf11572f4cd6fe849747fe5679c38b34: Status 404 returned error can't find the container with id d415ce7aad1b9f977544ee6fda9f28aacf11572f4cd6fe849747fe5679c38b34 Dec 05 12:52:26.683761 master-0 kubenswrapper[29936]: I1205 12:52:26.683562 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c9b8d8fb9-7pxzk" event={"ID":"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73","Type":"ContainerStarted","Data":"a7751470f31b897af34fb3b43f1cd31ed859bb0e58cce9aac5f1c2e057881026"} Dec 05 12:52:26.683761 master-0 kubenswrapper[29936]: I1205 12:52:26.683654 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c9b8d8fb9-7pxzk" event={"ID":"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73","Type":"ContainerStarted","Data":"d415ce7aad1b9f977544ee6fda9f28aacf11572f4cd6fe849747fe5679c38b34"} Dec 05 12:52:26.687235 master-0 kubenswrapper[29936]: I1205 12:52:26.687158 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7d45bf9455-6hzvv" event={"ID":"84d4e3f1-757e-45e3-acdb-6c5689b4c094","Type":"ContainerStarted","Data":"f51bc3c734e4191d4b0a43b39e4b57b66ed1f57c29edcfcc273d17eb87841d26"} Dec 05 12:52:26.708690 master-0 kubenswrapper[29936]: I1205 12:52:26.708544 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c9b8d8fb9-7pxzk" podStartSLOduration=2.708514688 
podStartE2EDuration="2.708514688s" podCreationTimestamp="2025-12-05 12:52:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:52:26.703757256 +0000 UTC m=+143.835836937" watchObservedRunningTime="2025-12-05 12:52:26.708514688 +0000 UTC m=+143.840594389" Dec 05 12:52:27.699380 master-0 kubenswrapper[29936]: I1205 12:52:27.698387 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7d45bf9455-6hzvv" event={"ID":"84d4e3f1-757e-45e3-acdb-6c5689b4c094","Type":"ContainerStarted","Data":"b825a76b49df1d70a4ffc6e23b7db7983810c0f6dce0dd32a35365b7811b4a1e"} Dec 05 12:52:27.747513 master-0 kubenswrapper[29936]: I1205 12:52:27.747397 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-7d45bf9455-6hzvv" podStartSLOduration=2.130314451 podStartE2EDuration="3.747362014s" podCreationTimestamp="2025-12-05 12:52:24 +0000 UTC" firstStartedPulling="2025-12-05 12:52:25.71782425 +0000 UTC m=+142.849903931" lastFinishedPulling="2025-12-05 12:52:27.334871813 +0000 UTC m=+144.466951494" observedRunningTime="2025-12-05 12:52:27.740844413 +0000 UTC m=+144.872924114" watchObservedRunningTime="2025-12-05 12:52:27.747362014 +0000 UTC m=+144.879441695" Dec 05 12:52:28.144655 master-0 kubenswrapper[29936]: I1205 12:52:28.142051 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 05 12:52:28.428571 master-0 kubenswrapper[29936]: I1205 12:52:28.428398 29936 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 05 12:52:28.429711 master-0 kubenswrapper[29936]: I1205 12:52:28.429672 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.431167 master-0 kubenswrapper[29936]: I1205 12:52:28.431126 29936 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 05 12:52:28.431449 master-0 kubenswrapper[29936]: I1205 12:52:28.431383 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver" containerID="cri-o://e6673b2b755060cf16e87e6c6406cf444f3e60221e1299617149fc286f9cbbb4" gracePeriod=15 Dec 05 12:52:28.432588 master-0 kubenswrapper[29936]: I1205 12:52:28.431567 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-check-endpoints" containerID="cri-o://cb45ac962cb69e933331ee0c856318dac5cab172e5a67fae300400790243fa83" gracePeriod=15 Dec 05 12:52:28.432588 master-0 kubenswrapper[29936]: I1205 12:52:28.431636 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://b4c300d20c451ceb48a4dd631fddc00299b2cc310864a10d077327af520fb571" gracePeriod=15 Dec 05 12:52:28.432588 master-0 kubenswrapper[29936]: I1205 12:52:28.431714 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://9fe4c6502fdc4a5ad38e4d8943f30fb0dc815742902f56611365ebade961c543" gracePeriod=15 Dec 05 12:52:28.432812 master-0 kubenswrapper[29936]: I1205 12:52:28.431732 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-cert-syncer" containerID="cri-o://bb7873e2599e8ac76df6ef4a55a0c5149a92c7a11857e0aa4c586472148fc658" gracePeriod=15 Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: I1205 12:52:28.432966 29936 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: E1205 12:52:28.433233 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-insecure-readyz" Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: I1205 12:52:28.433249 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-insecure-readyz" Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: E1205 12:52:28.433277 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="setup" Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: I1205 12:52:28.433284 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="setup" Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: E1205 12:52:28.433299 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver" Dec 05 12:52:28.436214 master-0 
kubenswrapper[29936]: I1205 12:52:28.433305 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver" Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: E1205 12:52:28.433322 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-check-endpoints" Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: I1205 12:52:28.433330 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-check-endpoints" Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: E1205 12:52:28.433351 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-cert-syncer" Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: I1205 12:52:28.433359 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-cert-syncer" Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: E1205 12:52:28.433369 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: I1205 12:52:28.433376 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: I1205 12:52:28.433508 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-check-endpoints" Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: I1205 12:52:28.433526 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-insecure-readyz" Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: I1205 12:52:28.433542 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: I1205 12:52:28.433570 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-cert-syncer" Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: I1205 12:52:28.433583 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver" Dec 05 12:52:28.436214 master-0 kubenswrapper[29936]: I1205 12:52:28.433593 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="setup" Dec 05 12:52:28.484851 master-0 kubenswrapper[29936]: I1205 12:52:28.484773 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 05 12:52:28.546427 master-0 kubenswrapper[29936]: I1205 12:52:28.545443 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.546427 master-0 
kubenswrapper[29936]: I1205 12:52:28.545581 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a369cadf0161d66f2936cdea3ded59b7-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"a369cadf0161d66f2936cdea3ded59b7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:28.546427 master-0 kubenswrapper[29936]: I1205 12:52:28.545621 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.546427 master-0 kubenswrapper[29936]: I1205 12:52:28.545648 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.546427 master-0 kubenswrapper[29936]: I1205 12:52:28.545697 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.546427 master-0 kubenswrapper[29936]: I1205 12:52:28.545722 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a369cadf0161d66f2936cdea3ded59b7-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"a369cadf0161d66f2936cdea3ded59b7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:28.546427 master-0 kubenswrapper[29936]: I1205 12:52:28.545813 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a369cadf0161d66f2936cdea3ded59b7-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"a369cadf0161d66f2936cdea3ded59b7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:28.546427 master-0 kubenswrapper[29936]: I1205 12:52:28.545852 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.609822 master-0 kubenswrapper[29936]: E1205 12:52:28.609583 29936 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:28.610374 master-0 kubenswrapper[29936]: E1205 12:52:28.610307 29936 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 
192.168.32.10:6443: connect: connection refused" Dec 05 12:52:28.611516 master-0 kubenswrapper[29936]: E1205 12:52:28.611038 29936 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:28.611929 master-0 kubenswrapper[29936]: E1205 12:52:28.611882 29936 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:28.612565 master-0 kubenswrapper[29936]: E1205 12:52:28.612499 29936 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:28.612642 master-0 kubenswrapper[29936]: I1205 12:52:28.612571 29936 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 05 12:52:28.613540 master-0 kubenswrapper[29936]: E1205 12:52:28.613256 29936 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Dec 05 12:52:28.648087 master-0 kubenswrapper[29936]: I1205 12:52:28.647981 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a369cadf0161d66f2936cdea3ded59b7-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"a369cadf0161d66f2936cdea3ded59b7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:28.648434 master-0 kubenswrapper[29936]: I1205 12:52:28.648254 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a369cadf0161d66f2936cdea3ded59b7-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"a369cadf0161d66f2936cdea3ded59b7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:28.648434 master-0 kubenswrapper[29936]: I1205 12:52:28.648368 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.648583 master-0 kubenswrapper[29936]: I1205 12:52:28.648441 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.648583 master-0 kubenswrapper[29936]: I1205 12:52:28.648458 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.648583 master-0 kubenswrapper[29936]: I1205 12:52:28.648501 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a369cadf0161d66f2936cdea3ded59b7-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"a369cadf0161d66f2936cdea3ded59b7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:28.648583 master-0 kubenswrapper[29936]: I1205 12:52:28.648505 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.648583 master-0 kubenswrapper[29936]: I1205 12:52:28.648571 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a369cadf0161d66f2936cdea3ded59b7-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"a369cadf0161d66f2936cdea3ded59b7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:28.648907 master-0 kubenswrapper[29936]: I1205 12:52:28.648608 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.648907 master-0 kubenswrapper[29936]: I1205 12:52:28.648650 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.648907 master-0 kubenswrapper[29936]: I1205 12:52:28.648662 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.649117 master-0 kubenswrapper[29936]: I1205 12:52:28.648955 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.649228 master-0 kubenswrapper[29936]: I1205 12:52:28.649010 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.649228 master-0 kubenswrapper[29936]: I1205 12:52:28.649211 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/a369cadf0161d66f2936cdea3ded59b7-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"a369cadf0161d66f2936cdea3ded59b7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:28.649372 master-0 kubenswrapper[29936]: I1205 12:52:28.649279 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a369cadf0161d66f2936cdea3ded59b7-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"a369cadf0161d66f2936cdea3ded59b7\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:28.649591 master-0 kubenswrapper[29936]: I1205 12:52:28.648718 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.708834 master-0 kubenswrapper[29936]: I1205 12:52:28.708756 29936 generic.go:334] "Generic (PLEG): container finished" podID="6f117368-9d0a-4c16-8d03-ffc83d250dd1" containerID="027a84d65e873850efbb8d9852fd5d9428626e1f41733cf9e7e4f4f2b2122439" exitCode=0 Dec 05 12:52:28.708834 master-0 kubenswrapper[29936]: I1205 12:52:28.708798 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"6f117368-9d0a-4c16-8d03-ffc83d250dd1","Type":"ContainerDied","Data":"027a84d65e873850efbb8d9852fd5d9428626e1f41733cf9e7e4f4f2b2122439"} Dec 05 12:52:28.713234 master-0 kubenswrapper[29936]: I1205 12:52:28.712028 29936 status_manager.go:851] "Failed to get status for pod" podUID="b83ccd6fa217a93a2c607d0109896ef8" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:28.713676 master-0 kubenswrapper[29936]: I1205 12:52:28.712803 29936 status_manager.go:851] "Failed to get status for pod" podUID="b89698aa356a3bc32694e2b098f9a900" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:28.720641 master-0 kubenswrapper[29936]: I1205 12:52:28.720575 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b89698aa356a3bc32694e2b098f9a900/kube-apiserver-cert-syncer/0.log" Dec 05 12:52:28.721194 master-0 kubenswrapper[29936]: I1205 12:52:28.721083 29936 status_manager.go:851] "Failed to get status for pod" podUID="6f117368-9d0a-4c16-8d03-ffc83d250dd1" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:28.723142 master-0 kubenswrapper[29936]: I1205 12:52:28.722995 29936 generic.go:334] "Generic (PLEG): container finished" podID="b89698aa356a3bc32694e2b098f9a900" containerID="cb45ac962cb69e933331ee0c856318dac5cab172e5a67fae300400790243fa83" exitCode=0 Dec 05 12:52:28.723142 master-0 kubenswrapper[29936]: I1205 12:52:28.723061 29936 generic.go:334] "Generic (PLEG): container finished" 
podID="b89698aa356a3bc32694e2b098f9a900" containerID="9fe4c6502fdc4a5ad38e4d8943f30fb0dc815742902f56611365ebade961c543" exitCode=0 Dec 05 12:52:28.723142 master-0 kubenswrapper[29936]: I1205 12:52:28.723093 29936 generic.go:334] "Generic (PLEG): container finished" podID="b89698aa356a3bc32694e2b098f9a900" containerID="b4c300d20c451ceb48a4dd631fddc00299b2cc310864a10d077327af520fb571" exitCode=0 Dec 05 12:52:28.723142 master-0 kubenswrapper[29936]: I1205 12:52:28.723105 29936 generic.go:334] "Generic (PLEG): container finished" podID="b89698aa356a3bc32694e2b098f9a900" containerID="bb7873e2599e8ac76df6ef4a55a0c5149a92c7a11857e0aa4c586472148fc658" exitCode=2 Dec 05 12:52:28.769930 master-0 kubenswrapper[29936]: I1205 12:52:28.769306 29936 patch_prober.go:28] interesting pod/kube-apiserver-master-0 container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.32.10:17697/healthz\": dial tcp 192.168.32.10:17697: connect: connection refused" start-of-body= Dec 05 12:52:28.769930 master-0 kubenswrapper[29936]: I1205 12:52:28.769407 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="b89698aa356a3bc32694e2b098f9a900" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.32.10:17697/healthz\": dial tcp 192.168.32.10:17697: connect: connection refused" Dec 05 12:52:28.771640 master-0 kubenswrapper[29936]: E1205 12:52:28.771369 29936 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event=< Dec 05 12:52:28.771640 master-0 kubenswrapper[29936]: &Event{ObjectMeta:{kube-apiserver-master-0.187e52ce049446b6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:b89698aa356a3bc32694e2b098f9a900,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.32.10:17697/healthz": dial tcp 192.168.32.10:17697: connect: connection refused Dec 05 12:52:28.771640 master-0 kubenswrapper[29936]: body: Dec 05 12:52:28.771640 master-0 kubenswrapper[29936]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:52:28.769380022 +0000 UTC m=+145.901459723,LastTimestamp:2025-12-05 12:52:28.769380022 +0000 UTC m=+145.901459723,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Dec 05 12:52:28.771640 master-0 kubenswrapper[29936]: > Dec 05 12:52:28.782403 master-0 kubenswrapper[29936]: I1205 12:52:28.782309 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:52:28.815521 master-0 kubenswrapper[29936]: E1205 12:52:28.815437 29936 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Dec 05 12:52:29.216592 master-0 kubenswrapper[29936]: E1205 12:52:29.216525 29936 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Dec 05 12:52:29.733730 master-0 kubenswrapper[29936]: I1205 12:52:29.733620 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"b83ccd6fa217a93a2c607d0109896ef8","Type":"ContainerStarted","Data":"0093cbbdc8caaf843b6029f230d6447ee498121846e93f7e0c4a138c3421020f"} Dec 05 12:52:29.733730 master-0 kubenswrapper[29936]: I1205 12:52:29.733702 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"b83ccd6fa217a93a2c607d0109896ef8","Type":"ContainerStarted","Data":"ca55c286f46e4fe2598da44493f4248b621b5338c6d35390a60b5ca16c8868a4"} Dec 05 12:52:29.735757 master-0 kubenswrapper[29936]: I1205 12:52:29.735648 29936 status_manager.go:851] "Failed to get status for pod" podUID="b83ccd6fa217a93a2c607d0109896ef8" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:29.737057 master-0 kubenswrapper[29936]: I1205 12:52:29.736992 29936 status_manager.go:851] "Failed to get status for pod" podUID="6f117368-9d0a-4c16-8d03-ffc83d250dd1" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:30.018950 master-0 kubenswrapper[29936]: E1205 12:52:30.018735 29936 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Dec 05 12:52:30.136237 master-0 kubenswrapper[29936]: I1205 12:52:30.136123 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Dec 05 12:52:30.137426 master-0 kubenswrapper[29936]: I1205 12:52:30.137369 29936 status_manager.go:851] "Failed to get status for pod" podUID="b83ccd6fa217a93a2c607d0109896ef8" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:30.138155 master-0 kubenswrapper[29936]: I1205 12:52:30.138106 29936 status_manager.go:851] "Failed to get status for pod" podUID="6f117368-9d0a-4c16-8d03-ffc83d250dd1" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:30.176002 master-0 kubenswrapper[29936]: I1205 12:52:30.175874 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6f117368-9d0a-4c16-8d03-ffc83d250dd1-kubelet-dir\") pod \"6f117368-9d0a-4c16-8d03-ffc83d250dd1\" (UID: \"6f117368-9d0a-4c16-8d03-ffc83d250dd1\") " Dec 05 12:52:30.176412 master-0 kubenswrapper[29936]: I1205 12:52:30.176053 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f117368-9d0a-4c16-8d03-ffc83d250dd1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6f117368-9d0a-4c16-8d03-ffc83d250dd1" (UID: "6f117368-9d0a-4c16-8d03-ffc83d250dd1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:52:30.176412 master-0 kubenswrapper[29936]: I1205 12:52:30.176259 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6f117368-9d0a-4c16-8d03-ffc83d250dd1-var-lock\") pod \"6f117368-9d0a-4c16-8d03-ffc83d250dd1\" (UID: \"6f117368-9d0a-4c16-8d03-ffc83d250dd1\") " Dec 05 12:52:30.176412 master-0 kubenswrapper[29936]: I1205 12:52:30.176339 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f117368-9d0a-4c16-8d03-ffc83d250dd1-var-lock" (OuterVolumeSpecName: "var-lock") pod "6f117368-9d0a-4c16-8d03-ffc83d250dd1" (UID: "6f117368-9d0a-4c16-8d03-ffc83d250dd1"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:52:30.176412 master-0 kubenswrapper[29936]: I1205 12:52:30.176405 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f117368-9d0a-4c16-8d03-ffc83d250dd1-kube-api-access\") pod \"6f117368-9d0a-4c16-8d03-ffc83d250dd1\" (UID: \"6f117368-9d0a-4c16-8d03-ffc83d250dd1\") " Dec 05 12:52:30.177742 master-0 kubenswrapper[29936]: I1205 12:52:30.177671 29936 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6f117368-9d0a-4c16-8d03-ffc83d250dd1-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:52:30.177742 master-0 kubenswrapper[29936]: I1205 12:52:30.177734 29936 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6f117368-9d0a-4c16-8d03-ffc83d250dd1-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:52:30.180052 master-0 kubenswrapper[29936]: I1205 12:52:30.179986 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f117368-9d0a-4c16-8d03-ffc83d250dd1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6f117368-9d0a-4c16-8d03-ffc83d250dd1" (UID: "6f117368-9d0a-4c16-8d03-ffc83d250dd1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:52:30.280600 master-0 kubenswrapper[29936]: I1205 12:52:30.280428 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6f117368-9d0a-4c16-8d03-ffc83d250dd1-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:52:30.744089 master-0 kubenswrapper[29936]: I1205 12:52:30.744017 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"6f117368-9d0a-4c16-8d03-ffc83d250dd1","Type":"ContainerDied","Data":"405f2bdfe5ece5ac09307456a46ddc029b968cd1da0b2a398f02c409982ee0a9"} Dec 05 12:52:30.744089 master-0 kubenswrapper[29936]: I1205 12:52:30.744088 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="405f2bdfe5ece5ac09307456a46ddc029b968cd1da0b2a398f02c409982ee0a9" Dec 05 12:52:30.744864 master-0 kubenswrapper[29936]: I1205 12:52:30.744279 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Dec 05 12:52:30.763455 master-0 kubenswrapper[29936]: I1205 12:52:30.763356 29936 status_manager.go:851] "Failed to get status for pod" podUID="b83ccd6fa217a93a2c607d0109896ef8" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:30.764411 master-0 kubenswrapper[29936]: I1205 12:52:30.764322 29936 status_manager.go:851] "Failed to get status for pod" podUID="6f117368-9d0a-4c16-8d03-ffc83d250dd1" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:31.524656 master-0 kubenswrapper[29936]: I1205 12:52:31.524603 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b89698aa356a3bc32694e2b098f9a900/kube-apiserver-cert-syncer/0.log" Dec 05 12:52:31.525755 master-0 kubenswrapper[29936]: I1205 12:52:31.525729 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:31.527207 master-0 kubenswrapper[29936]: I1205 12:52:31.527105 29936 status_manager.go:851] "Failed to get status for pod" podUID="b83ccd6fa217a93a2c607d0109896ef8" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:31.528227 master-0 kubenswrapper[29936]: I1205 12:52:31.528145 29936 status_manager.go:851] "Failed to get status for pod" podUID="b89698aa356a3bc32694e2b098f9a900" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:31.529168 master-0 kubenswrapper[29936]: I1205 12:52:31.529103 29936 status_manager.go:851] "Failed to get status for pod" podUID="6f117368-9d0a-4c16-8d03-ffc83d250dd1" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:31.621080 master-0 kubenswrapper[29936]: E1205 12:52:31.620985 29936 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Dec 05 12:52:31.628530 master-0 kubenswrapper[29936]: I1205 12:52:31.628473 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir\") pod \"b89698aa356a3bc32694e2b098f9a900\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " Dec 05 12:52:31.628729 master-0 kubenswrapper[29936]: I1205 12:52:31.628650 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b89698aa356a3bc32694e2b098f9a900" (UID: "b89698aa356a3bc32694e2b098f9a900"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:52:31.628946 master-0 kubenswrapper[29936]: I1205 12:52:31.628926 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir\") pod \"b89698aa356a3bc32694e2b098f9a900\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " Dec 05 12:52:31.629127 master-0 kubenswrapper[29936]: I1205 12:52:31.628985 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "b89698aa356a3bc32694e2b098f9a900" (UID: "b89698aa356a3bc32694e2b098f9a900"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:52:31.629321 master-0 kubenswrapper[29936]: I1205 12:52:31.629300 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir\") pod \"b89698aa356a3bc32694e2b098f9a900\" (UID: \"b89698aa356a3bc32694e2b098f9a900\") " Dec 05 12:52:31.629608 master-0 kubenswrapper[29936]: I1205 12:52:31.629336 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "b89698aa356a3bc32694e2b098f9a900" (UID: "b89698aa356a3bc32694e2b098f9a900"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:52:31.630482 master-0 kubenswrapper[29936]: I1205 12:52:31.630395 29936 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:52:31.630482 master-0 kubenswrapper[29936]: I1205 12:52:31.630480 29936 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:52:31.630590 master-0 kubenswrapper[29936]: I1205 12:52:31.630495 29936 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b89698aa356a3bc32694e2b098f9a900-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:52:31.754768 master-0 kubenswrapper[29936]: I1205 12:52:31.754699 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_b89698aa356a3bc32694e2b098f9a900/kube-apiserver-cert-syncer/0.log" Dec 05 12:52:31.755992 master-0 kubenswrapper[29936]: I1205 12:52:31.755921 29936 generic.go:334] "Generic (PLEG): container finished" podID="b89698aa356a3bc32694e2b098f9a900" containerID="e6673b2b755060cf16e87e6c6406cf444f3e60221e1299617149fc286f9cbbb4" exitCode=0 Dec 05 12:52:31.756065 master-0 kubenswrapper[29936]: I1205 12:52:31.756035 29936 scope.go:117] "RemoveContainer" containerID="cb45ac962cb69e933331ee0c856318dac5cab172e5a67fae300400790243fa83" Dec 05 12:52:31.756223 master-0 kubenswrapper[29936]: I1205 12:52:31.756131 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:31.778565 master-0 kubenswrapper[29936]: I1205 12:52:31.778475 29936 status_manager.go:851] "Failed to get status for pod" podUID="b83ccd6fa217a93a2c607d0109896ef8" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:31.779387 master-0 kubenswrapper[29936]: I1205 12:52:31.779313 29936 status_manager.go:851] "Failed to get status for pod" podUID="b89698aa356a3bc32694e2b098f9a900" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:31.780001 master-0 kubenswrapper[29936]: I1205 12:52:31.779895 29936 status_manager.go:851] "Failed to get status for pod" podUID="6f117368-9d0a-4c16-8d03-ffc83d250dd1" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:31.790680 master-0 kubenswrapper[29936]: I1205 12:52:31.790490 29936 scope.go:117] "RemoveContainer" containerID="9fe4c6502fdc4a5ad38e4d8943f30fb0dc815742902f56611365ebade961c543" Dec 05 12:52:31.814135 master-0 kubenswrapper[29936]: I1205 12:52:31.814075 29936 scope.go:117] "RemoveContainer" containerID="b4c300d20c451ceb48a4dd631fddc00299b2cc310864a10d077327af520fb571" Dec 05 12:52:31.839129 master-0 kubenswrapper[29936]: I1205 12:52:31.838943 29936 scope.go:117] "RemoveContainer" containerID="bb7873e2599e8ac76df6ef4a55a0c5149a92c7a11857e0aa4c586472148fc658" Dec 05 12:52:31.859974 master-0 kubenswrapper[29936]: I1205 12:52:31.859918 29936 scope.go:117] "RemoveContainer" containerID="e6673b2b755060cf16e87e6c6406cf444f3e60221e1299617149fc286f9cbbb4" Dec 05 12:52:31.885822 master-0 kubenswrapper[29936]: I1205 12:52:31.885768 29936 scope.go:117] "RemoveContainer" containerID="1bbd4f368bad5edbbd435da376ff1fe1a1eb948351d43f8a86c24d7830ed7a2a" Dec 05 12:52:31.916705 master-0 kubenswrapper[29936]: I1205 12:52:31.916643 29936 scope.go:117] "RemoveContainer" containerID="cb45ac962cb69e933331ee0c856318dac5cab172e5a67fae300400790243fa83" Dec 05 12:52:31.917563 master-0 kubenswrapper[29936]: E1205 12:52:31.917507 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb45ac962cb69e933331ee0c856318dac5cab172e5a67fae300400790243fa83\": container with ID starting with cb45ac962cb69e933331ee0c856318dac5cab172e5a67fae300400790243fa83 not found: ID does not exist" containerID="cb45ac962cb69e933331ee0c856318dac5cab172e5a67fae300400790243fa83" Dec 05 12:52:31.917647 master-0 kubenswrapper[29936]: I1205 12:52:31.917559 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb45ac962cb69e933331ee0c856318dac5cab172e5a67fae300400790243fa83"} err="failed to get container status \"cb45ac962cb69e933331ee0c856318dac5cab172e5a67fae300400790243fa83\": rpc error: code = NotFound desc = could not find container \"cb45ac962cb69e933331ee0c856318dac5cab172e5a67fae300400790243fa83\": container with ID starting with 
cb45ac962cb69e933331ee0c856318dac5cab172e5a67fae300400790243fa83 not found: ID does not exist" Dec 05 12:52:31.917647 master-0 kubenswrapper[29936]: I1205 12:52:31.917594 29936 scope.go:117] "RemoveContainer" containerID="9fe4c6502fdc4a5ad38e4d8943f30fb0dc815742902f56611365ebade961c543" Dec 05 12:52:31.918201 master-0 kubenswrapper[29936]: E1205 12:52:31.918130 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fe4c6502fdc4a5ad38e4d8943f30fb0dc815742902f56611365ebade961c543\": container with ID starting with 9fe4c6502fdc4a5ad38e4d8943f30fb0dc815742902f56611365ebade961c543 not found: ID does not exist" containerID="9fe4c6502fdc4a5ad38e4d8943f30fb0dc815742902f56611365ebade961c543" Dec 05 12:52:31.918201 master-0 kubenswrapper[29936]: I1205 12:52:31.918190 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fe4c6502fdc4a5ad38e4d8943f30fb0dc815742902f56611365ebade961c543"} err="failed to get container status \"9fe4c6502fdc4a5ad38e4d8943f30fb0dc815742902f56611365ebade961c543\": rpc error: code = NotFound desc = could not find container \"9fe4c6502fdc4a5ad38e4d8943f30fb0dc815742902f56611365ebade961c543\": container with ID starting with 9fe4c6502fdc4a5ad38e4d8943f30fb0dc815742902f56611365ebade961c543 not found: ID does not exist" Dec 05 12:52:31.918408 master-0 kubenswrapper[29936]: I1205 12:52:31.918217 29936 scope.go:117] "RemoveContainer" containerID="b4c300d20c451ceb48a4dd631fddc00299b2cc310864a10d077327af520fb571" Dec 05 12:52:31.918785 master-0 kubenswrapper[29936]: E1205 12:52:31.918737 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4c300d20c451ceb48a4dd631fddc00299b2cc310864a10d077327af520fb571\": container with ID starting with b4c300d20c451ceb48a4dd631fddc00299b2cc310864a10d077327af520fb571 not found: ID does not exist" containerID="b4c300d20c451ceb48a4dd631fddc00299b2cc310864a10d077327af520fb571" Dec 05 12:52:31.918785 master-0 kubenswrapper[29936]: I1205 12:52:31.918771 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c300d20c451ceb48a4dd631fddc00299b2cc310864a10d077327af520fb571"} err="failed to get container status \"b4c300d20c451ceb48a4dd631fddc00299b2cc310864a10d077327af520fb571\": rpc error: code = NotFound desc = could not find container \"b4c300d20c451ceb48a4dd631fddc00299b2cc310864a10d077327af520fb571\": container with ID starting with b4c300d20c451ceb48a4dd631fddc00299b2cc310864a10d077327af520fb571 not found: ID does not exist" Dec 05 12:52:31.918908 master-0 kubenswrapper[29936]: I1205 12:52:31.918795 29936 scope.go:117] "RemoveContainer" containerID="bb7873e2599e8ac76df6ef4a55a0c5149a92c7a11857e0aa4c586472148fc658" Dec 05 12:52:31.919139 master-0 kubenswrapper[29936]: E1205 12:52:31.919088 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb7873e2599e8ac76df6ef4a55a0c5149a92c7a11857e0aa4c586472148fc658\": container with ID starting with bb7873e2599e8ac76df6ef4a55a0c5149a92c7a11857e0aa4c586472148fc658 not found: ID does not exist" containerID="bb7873e2599e8ac76df6ef4a55a0c5149a92c7a11857e0aa4c586472148fc658" Dec 05 12:52:31.919220 master-0 kubenswrapper[29936]: I1205 12:52:31.919131 29936 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bb7873e2599e8ac76df6ef4a55a0c5149a92c7a11857e0aa4c586472148fc658"} err="failed to get container status \"bb7873e2599e8ac76df6ef4a55a0c5149a92c7a11857e0aa4c586472148fc658\": rpc error: code = NotFound desc = could not find container \"bb7873e2599e8ac76df6ef4a55a0c5149a92c7a11857e0aa4c586472148fc658\": container with ID starting with bb7873e2599e8ac76df6ef4a55a0c5149a92c7a11857e0aa4c586472148fc658 not found: ID does not exist" Dec 05 12:52:31.919220 master-0 kubenswrapper[29936]: I1205 12:52:31.919154 29936 scope.go:117] "RemoveContainer" containerID="e6673b2b755060cf16e87e6c6406cf444f3e60221e1299617149fc286f9cbbb4" Dec 05 12:52:31.919588 master-0 kubenswrapper[29936]: E1205 12:52:31.919545 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6673b2b755060cf16e87e6c6406cf444f3e60221e1299617149fc286f9cbbb4\": container with ID starting with e6673b2b755060cf16e87e6c6406cf444f3e60221e1299617149fc286f9cbbb4 not found: ID does not exist" containerID="e6673b2b755060cf16e87e6c6406cf444f3e60221e1299617149fc286f9cbbb4" Dec 05 12:52:31.919588 master-0 kubenswrapper[29936]: I1205 12:52:31.919577 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6673b2b755060cf16e87e6c6406cf444f3e60221e1299617149fc286f9cbbb4"} err="failed to get container status \"e6673b2b755060cf16e87e6c6406cf444f3e60221e1299617149fc286f9cbbb4\": rpc error: code = NotFound desc = could not find container \"e6673b2b755060cf16e87e6c6406cf444f3e60221e1299617149fc286f9cbbb4\": container with ID starting with e6673b2b755060cf16e87e6c6406cf444f3e60221e1299617149fc286f9cbbb4 not found: ID does not exist" Dec 05 12:52:31.919700 master-0 kubenswrapper[29936]: I1205 12:52:31.919597 29936 scope.go:117] "RemoveContainer" containerID="1bbd4f368bad5edbbd435da376ff1fe1a1eb948351d43f8a86c24d7830ed7a2a" Dec 05 12:52:31.919942 master-0 kubenswrapper[29936]: E1205 12:52:31.919889 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bbd4f368bad5edbbd435da376ff1fe1a1eb948351d43f8a86c24d7830ed7a2a\": container with ID starting with 1bbd4f368bad5edbbd435da376ff1fe1a1eb948351d43f8a86c24d7830ed7a2a not found: ID does not exist" containerID="1bbd4f368bad5edbbd435da376ff1fe1a1eb948351d43f8a86c24d7830ed7a2a" Dec 05 12:52:31.919942 master-0 kubenswrapper[29936]: I1205 12:52:31.919930 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bbd4f368bad5edbbd435da376ff1fe1a1eb948351d43f8a86c24d7830ed7a2a"} err="failed to get container status \"1bbd4f368bad5edbbd435da376ff1fe1a1eb948351d43f8a86c24d7830ed7a2a\": rpc error: code = NotFound desc = could not find container \"1bbd4f368bad5edbbd435da376ff1fe1a1eb948351d43f8a86c24d7830ed7a2a\": container with ID starting with 1bbd4f368bad5edbbd435da376ff1fe1a1eb948351d43f8a86c24d7830ed7a2a not found: ID does not exist" Dec 05 12:52:33.190355 master-0 kubenswrapper[29936]: I1205 12:52:33.190228 29936 status_manager.go:851] "Failed to get status for pod" podUID="b83ccd6fa217a93a2c607d0109896ef8" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:33.191383 master-0 kubenswrapper[29936]: I1205 12:52:33.191309 29936 
status_manager.go:851] "Failed to get status for pod" podUID="b89698aa356a3bc32694e2b098f9a900" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:33.192317 master-0 kubenswrapper[29936]: I1205 12:52:33.192241 29936 status_manager.go:851] "Failed to get status for pod" podUID="6f117368-9d0a-4c16-8d03-ffc83d250dd1" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:33.200223 master-0 kubenswrapper[29936]: I1205 12:52:33.200129 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b89698aa356a3bc32694e2b098f9a900" path="/var/lib/kubelet/pods/b89698aa356a3bc32694e2b098f9a900/volumes" Dec 05 12:52:34.822051 master-0 kubenswrapper[29936]: E1205 12:52:34.821877 29936 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Dec 05 12:52:35.400284 master-0 kubenswrapper[29936]: I1205 12:52:35.400214 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:35.400777 master-0 kubenswrapper[29936]: I1205 12:52:35.400312 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:52:35.403832 master-0 kubenswrapper[29936]: I1205 12:52:35.403788 29936 patch_prober.go:28] interesting pod/console-c9b8d8fb9-7pxzk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.102:8443/health\": dial tcp 10.128.0.102:8443: connect: connection refused" start-of-body= Dec 05 12:52:35.403927 master-0 kubenswrapper[29936]: I1205 12:52:35.403838 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c9b8d8fb9-7pxzk" podUID="ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" containerName="console" probeResult="failure" output="Get \"https://10.128.0.102:8443/health\": dial tcp 10.128.0.102:8443: connect: connection refused" Dec 05 12:52:35.467375 master-0 kubenswrapper[29936]: I1205 12:52:35.467229 29936 patch_prober.go:28] interesting pod/console-74ffd5f75f-slrkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Dec 05 12:52:35.467375 master-0 kubenswrapper[29936]: I1205 12:52:35.467320 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-74ffd5f75f-slrkr" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Dec 05 12:52:37.904737 master-0 kubenswrapper[29936]: E1205 12:52:37.904222 29936 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event=< Dec 05 12:52:37.904737 master-0 
kubenswrapper[29936]: &Event{ObjectMeta:{kube-apiserver-master-0.187e52ce049446b6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:b89698aa356a3bc32694e2b098f9a900,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.32.10:17697/healthz": dial tcp 192.168.32.10:17697: connect: connection refused Dec 05 12:52:37.904737 master-0 kubenswrapper[29936]: body: Dec 05 12:52:37.904737 master-0 kubenswrapper[29936]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-05 12:52:28.769380022 +0000 UTC m=+145.901459723,LastTimestamp:2025-12-05 12:52:28.769380022 +0000 UTC m=+145.901459723,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Dec 05 12:52:37.904737 master-0 kubenswrapper[29936]: > Dec 05 12:52:38.462120 master-0 kubenswrapper[29936]: E1205 12:52:38.462046 29936 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:52:38Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:52:38Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:52:38Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T12:52:38Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:38.462613 master-0 kubenswrapper[29936]: E1205 12:52:38.462568 29936 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:38.463594 master-0 kubenswrapper[29936]: E1205 12:52:38.463534 29936 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:38.464536 master-0 kubenswrapper[29936]: E1205 12:52:38.464375 29936 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:38.465385 master-0 kubenswrapper[29936]: E1205 12:52:38.465358 29936 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:38.465488 master-0 kubenswrapper[29936]: E1205 12:52:38.465473 29936 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 12:52:40.185655 master-0 kubenswrapper[29936]: I1205 12:52:40.185552 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:40.188632 master-0 kubenswrapper[29936]: I1205 12:52:40.188522 29936 status_manager.go:851] "Failed to get status for pod" podUID="b83ccd6fa217a93a2c607d0109896ef8" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:40.189445 master-0 kubenswrapper[29936]: I1205 12:52:40.189392 29936 status_manager.go:851] "Failed to get status for pod" podUID="6f117368-9d0a-4c16-8d03-ffc83d250dd1" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:40.223837 master-0 kubenswrapper[29936]: I1205 12:52:40.223741 29936 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2901f05d-e5c5-4b8f-85e3-e5e2c1e62076" Dec 05 12:52:40.223837 master-0 kubenswrapper[29936]: I1205 12:52:40.223803 29936 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2901f05d-e5c5-4b8f-85e3-e5e2c1e62076" Dec 05 12:52:40.224870 master-0 kubenswrapper[29936]: E1205 12:52:40.224780 29936 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:40.225637 master-0 kubenswrapper[29936]: I1205 12:52:40.225583 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:40.265632 master-0 kubenswrapper[29936]: W1205 12:52:40.265556 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda369cadf0161d66f2936cdea3ded59b7.slice/crio-63ab95af6da53eb515d40523fd8056c262533160301ecc8292fd5df14d91112f WatchSource:0}: Error finding container 63ab95af6da53eb515d40523fd8056c262533160301ecc8292fd5df14d91112f: Status 404 returned error can't find the container with id 63ab95af6da53eb515d40523fd8056c262533160301ecc8292fd5df14d91112f Dec 05 12:52:40.841615 master-0 kubenswrapper[29936]: I1205 12:52:40.841426 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"a369cadf0161d66f2936cdea3ded59b7","Type":"ContainerStarted","Data":"63ab95af6da53eb515d40523fd8056c262533160301ecc8292fd5df14d91112f"} Dec 05 12:52:41.223573 master-0 kubenswrapper[29936]: E1205 12:52:41.223481 29936 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s" Dec 05 12:52:42.870106 master-0 kubenswrapper[29936]: I1205 12:52:42.870015 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_610dc2015b38bc32879d55a7d39b2587/kube-controller-manager/0.log" Dec 05 12:52:42.870106 master-0 kubenswrapper[29936]: I1205 12:52:42.870094 29936 generic.go:334] "Generic (PLEG): container finished" podID="610dc2015b38bc32879d55a7d39b2587" containerID="36ca02e8be7a0b8aad017a2fba35ee2e24e24ec30949f922f7c15439af96ed15" exitCode=1 Dec 05 12:52:42.870809 master-0 kubenswrapper[29936]: I1205 12:52:42.870202 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"610dc2015b38bc32879d55a7d39b2587","Type":"ContainerDied","Data":"36ca02e8be7a0b8aad017a2fba35ee2e24e24ec30949f922f7c15439af96ed15"} Dec 05 12:52:42.870902 master-0 kubenswrapper[29936]: I1205 12:52:42.870873 29936 scope.go:117] "RemoveContainer" containerID="36ca02e8be7a0b8aad017a2fba35ee2e24e24ec30949f922f7c15439af96ed15" Dec 05 12:52:42.873333 master-0 kubenswrapper[29936]: I1205 12:52:42.873005 29936 status_manager.go:851] "Failed to get status for pod" podUID="610dc2015b38bc32879d55a7d39b2587" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:42.874315 master-0 kubenswrapper[29936]: I1205 12:52:42.874243 29936 status_manager.go:851] "Failed to get status for pod" podUID="b83ccd6fa217a93a2c607d0109896ef8" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:42.875085 master-0 kubenswrapper[29936]: I1205 12:52:42.875034 29936 status_manager.go:851] "Failed to get status for pod" podUID="6f117368-9d0a-4c16-8d03-ffc83d250dd1" pod="openshift-kube-apiserver/installer-5-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:42.878103 master-0 kubenswrapper[29936]: I1205 12:52:42.878006 29936 generic.go:334] "Generic (PLEG): container finished" podID="a369cadf0161d66f2936cdea3ded59b7" containerID="2f85a3215cb0f69ec20e2a8533a9fe46ec8f7ee47d0bbd6d4d65250f9818c542" exitCode=0 Dec 05 12:52:42.878207 master-0 kubenswrapper[29936]: I1205 12:52:42.878133 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"a369cadf0161d66f2936cdea3ded59b7","Type":"ContainerDied","Data":"2f85a3215cb0f69ec20e2a8533a9fe46ec8f7ee47d0bbd6d4d65250f9818c542"} Dec 05 12:52:42.878672 master-0 kubenswrapper[29936]: I1205 12:52:42.878623 29936 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2901f05d-e5c5-4b8f-85e3-e5e2c1e62076" Dec 05 12:52:42.878721 master-0 kubenswrapper[29936]: I1205 12:52:42.878671 29936 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2901f05d-e5c5-4b8f-85e3-e5e2c1e62076" Dec 05 12:52:42.879544 master-0 kubenswrapper[29936]: I1205 12:52:42.879478 29936 status_manager.go:851] "Failed to get status for pod" podUID="b83ccd6fa217a93a2c607d0109896ef8" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:42.879624 master-0 kubenswrapper[29936]: E1205 12:52:42.879568 29936 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:42.880303 master-0 kubenswrapper[29936]: I1205 12:52:42.880207 29936 status_manager.go:851] "Failed to get status for pod" podUID="6f117368-9d0a-4c16-8d03-ffc83d250dd1" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:42.880748 master-0 kubenswrapper[29936]: I1205 12:52:42.880681 29936 status_manager.go:851] "Failed to get status for pod" podUID="610dc2015b38bc32879d55a7d39b2587" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:43.196116 master-0 kubenswrapper[29936]: I1205 12:52:43.195997 29936 status_manager.go:851] "Failed to get status for pod" podUID="6f117368-9d0a-4c16-8d03-ffc83d250dd1" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:43.196684 master-0 kubenswrapper[29936]: I1205 12:52:43.196622 29936 status_manager.go:851] "Failed to get status for pod" podUID="a369cadf0161d66f2936cdea3ded59b7" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:43.197162 master-0 kubenswrapper[29936]: I1205 12:52:43.197115 29936 status_manager.go:851] "Failed to get status for pod" podUID="610dc2015b38bc32879d55a7d39b2587" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:43.197610 master-0 kubenswrapper[29936]: I1205 12:52:43.197555 29936 status_manager.go:851] "Failed to get status for pod" podUID="b83ccd6fa217a93a2c607d0109896ef8" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 05 12:52:44.920232 master-0 kubenswrapper[29936]: I1205 12:52:44.920127 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_610dc2015b38bc32879d55a7d39b2587/kube-controller-manager/0.log" Dec 05 12:52:44.920908 master-0 kubenswrapper[29936]: I1205 12:52:44.920314 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"610dc2015b38bc32879d55a7d39b2587","Type":"ContainerStarted","Data":"67d67d3e89e13fa99e16d8850ae9285f71eeb433f6a0cc9257e00f3e497935e9"} Dec 05 12:52:44.932381 master-0 kubenswrapper[29936]: I1205 12:52:44.932326 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"a369cadf0161d66f2936cdea3ded59b7","Type":"ContainerStarted","Data":"380280c2843252cc9d11e4dc98f9b6f8a70f5ac978196fd455c9f06ab6a7dc19"} Dec 05 12:52:44.932381 master-0 kubenswrapper[29936]: I1205 12:52:44.932388 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"a369cadf0161d66f2936cdea3ded59b7","Type":"ContainerStarted","Data":"7b4eaad5db6fc15b391899a0e17cda75384bc0cc19870b1486eb1dbd75223ef8"} Dec 05 12:52:45.151779 master-0 kubenswrapper[29936]: I1205 12:52:45.151684 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:52:45.151779 master-0 kubenswrapper[29936]: I1205 12:52:45.151780 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:52:45.152092 master-0 kubenswrapper[29936]: I1205 12:52:45.152064 29936 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Dec 05 12:52:45.152145 master-0 kubenswrapper[29936]: I1205 12:52:45.152112 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager" 
probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 05 12:52:45.400630 master-0 kubenswrapper[29936]: I1205 12:52:45.400431 29936 patch_prober.go:28] interesting pod/console-c9b8d8fb9-7pxzk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.102:8443/health\": dial tcp 10.128.0.102:8443: connect: connection refused" start-of-body= Dec 05 12:52:45.400630 master-0 kubenswrapper[29936]: I1205 12:52:45.400529 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c9b8d8fb9-7pxzk" podUID="ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" containerName="console" probeResult="failure" output="Get \"https://10.128.0.102:8443/health\": dial tcp 10.128.0.102:8443: connect: connection refused" Dec 05 12:52:45.467983 master-0 kubenswrapper[29936]: I1205 12:52:45.467902 29936 patch_prober.go:28] interesting pod/console-74ffd5f75f-slrkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Dec 05 12:52:45.467983 master-0 kubenswrapper[29936]: I1205 12:52:45.467979 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-74ffd5f75f-slrkr" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Dec 05 12:52:45.942946 master-0 kubenswrapper[29936]: I1205 12:52:45.942840 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"a369cadf0161d66f2936cdea3ded59b7","Type":"ContainerStarted","Data":"c5cc3253156b953318e17b44806339236ddf359feccbcded4877b8becbe84f9b"} Dec 05 12:52:46.962834 master-0 kubenswrapper[29936]: I1205 12:52:46.962362 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"a369cadf0161d66f2936cdea3ded59b7","Type":"ContainerStarted","Data":"b677a6e19d505375aeeef4ef29910ffc9674854047723305aed4a7021bf49768"} Dec 05 12:52:46.962834 master-0 kubenswrapper[29936]: I1205 12:52:46.962779 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"a369cadf0161d66f2936cdea3ded59b7","Type":"ContainerStarted","Data":"0a5b6220d08c461c2e1cf20c6adb084d7651685ed19e953b15c64936504ee480"} Dec 05 12:52:46.963701 master-0 kubenswrapper[29936]: I1205 12:52:46.962954 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:46.963701 master-0 kubenswrapper[29936]: I1205 12:52:46.963084 29936 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2901f05d-e5c5-4b8f-85e3-e5e2c1e62076" Dec 05 12:52:46.963701 master-0 kubenswrapper[29936]: I1205 12:52:46.963123 29936 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2901f05d-e5c5-4b8f-85e3-e5e2c1e62076" Dec 05 12:52:50.005221 master-0 kubenswrapper[29936]: I1205 12:52:50.005110 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7495f49968-4tq6k" podUID="3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" containerName="console" 
containerID="cri-o://7e2752bcd14ea12d204cc4e24319d66a24f791b34f70c5994177d58a5778a8ba" gracePeriod=15 Dec 05 12:52:50.226448 master-0 kubenswrapper[29936]: I1205 12:52:50.226342 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:50.226448 master-0 kubenswrapper[29936]: I1205 12:52:50.226435 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:50.234732 master-0 kubenswrapper[29936]: I1205 12:52:50.234607 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:50.595353 master-0 kubenswrapper[29936]: I1205 12:52:50.595290 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7495f49968-4tq6k_3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5/console/0.log" Dec 05 12:52:50.595682 master-0 kubenswrapper[29936]: I1205 12:52:50.595399 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:52:50.671840 master-0 kubenswrapper[29936]: I1205 12:52:50.671779 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-config\") pod \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " Dec 05 12:52:50.671840 master-0 kubenswrapper[29936]: I1205 12:52:50.671852 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-oauth-serving-cert\") pod \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " Dec 05 12:52:50.672232 master-0 kubenswrapper[29936]: I1205 12:52:50.671898 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-serving-cert\") pod \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " Dec 05 12:52:50.672232 master-0 kubenswrapper[29936]: I1205 12:52:50.671998 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shncq\" (UniqueName: \"kubernetes.io/projected/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-kube-api-access-shncq\") pod \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " Dec 05 12:52:50.672340 master-0 kubenswrapper[29936]: I1205 12:52:50.672275 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-oauth-config\") pod \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " Dec 05 12:52:50.672455 master-0 kubenswrapper[29936]: I1205 12:52:50.672386 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-service-ca\") pod \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\" (UID: \"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5\") " Dec 05 12:52:50.673242 master-0 kubenswrapper[29936]: I1205 12:52:50.672464 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-config" (OuterVolumeSpecName: "console-config") pod "3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" (UID: "3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:52:50.673242 master-0 kubenswrapper[29936]: I1205 12:52:50.672750 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" (UID: "3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:52:50.673242 master-0 kubenswrapper[29936]: I1205 12:52:50.673060 29936 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:52:50.673242 master-0 kubenswrapper[29936]: I1205 12:52:50.673089 29936 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:52:50.673242 master-0 kubenswrapper[29936]: I1205 12:52:50.673207 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" (UID: "3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:52:50.675569 master-0 kubenswrapper[29936]: I1205 12:52:50.675533 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-kube-api-access-shncq" (OuterVolumeSpecName: "kube-api-access-shncq") pod "3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" (UID: "3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5"). InnerVolumeSpecName "kube-api-access-shncq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:52:50.675657 master-0 kubenswrapper[29936]: I1205 12:52:50.675525 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" (UID: "3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:52:50.676943 master-0 kubenswrapper[29936]: I1205 12:52:50.676860 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" (UID: "3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:52:50.775218 master-0 kubenswrapper[29936]: I1205 12:52:50.775125 29936 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:52:50.775218 master-0 kubenswrapper[29936]: I1205 12:52:50.775213 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shncq\" (UniqueName: \"kubernetes.io/projected/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-kube-api-access-shncq\") on node \"master-0\" DevicePath \"\"" Dec 05 12:52:50.775559 master-0 kubenswrapper[29936]: I1205 12:52:50.775242 29936 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:52:50.775559 master-0 kubenswrapper[29936]: I1205 12:52:50.775273 29936 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:52:51.004449 master-0 kubenswrapper[29936]: I1205 12:52:51.004380 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7495f49968-4tq6k_3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5/console/0.log" Dec 05 12:52:51.004449 master-0 kubenswrapper[29936]: I1205 12:52:51.004453 29936 generic.go:334] "Generic (PLEG): container finished" podID="3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" containerID="7e2752bcd14ea12d204cc4e24319d66a24f791b34f70c5994177d58a5778a8ba" exitCode=2 Dec 05 12:52:51.004916 master-0 kubenswrapper[29936]: I1205 12:52:51.004498 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7495f49968-4tq6k" event={"ID":"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5","Type":"ContainerDied","Data":"7e2752bcd14ea12d204cc4e24319d66a24f791b34f70c5994177d58a5778a8ba"} Dec 05 12:52:51.004916 master-0 kubenswrapper[29936]: I1205 12:52:51.004536 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7495f49968-4tq6k" event={"ID":"3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5","Type":"ContainerDied","Data":"174bcae07eb3cd4b7fdce32bd0e0bfd0c11ecf799abf2e1f44f9c05e951ecfa1"} Dec 05 12:52:51.004916 master-0 kubenswrapper[29936]: I1205 12:52:51.004558 29936 scope.go:117] "RemoveContainer" containerID="7e2752bcd14ea12d204cc4e24319d66a24f791b34f70c5994177d58a5778a8ba" Dec 05 12:52:51.004916 master-0 kubenswrapper[29936]: I1205 12:52:51.004567 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7495f49968-4tq6k" Dec 05 12:52:51.026585 master-0 kubenswrapper[29936]: I1205 12:52:51.026394 29936 scope.go:117] "RemoveContainer" containerID="7e2752bcd14ea12d204cc4e24319d66a24f791b34f70c5994177d58a5778a8ba" Dec 05 12:52:51.028198 master-0 kubenswrapper[29936]: E1205 12:52:51.028108 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e2752bcd14ea12d204cc4e24319d66a24f791b34f70c5994177d58a5778a8ba\": container with ID starting with 7e2752bcd14ea12d204cc4e24319d66a24f791b34f70c5994177d58a5778a8ba not found: ID does not exist" containerID="7e2752bcd14ea12d204cc4e24319d66a24f791b34f70c5994177d58a5778a8ba" Dec 05 12:52:51.028198 master-0 kubenswrapper[29936]: I1205 12:52:51.028172 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e2752bcd14ea12d204cc4e24319d66a24f791b34f70c5994177d58a5778a8ba"} err="failed to get container status \"7e2752bcd14ea12d204cc4e24319d66a24f791b34f70c5994177d58a5778a8ba\": rpc error: code = NotFound desc = could not find container \"7e2752bcd14ea12d204cc4e24319d66a24f791b34f70c5994177d58a5778a8ba\": container with ID starting with 7e2752bcd14ea12d204cc4e24319d66a24f791b34f70c5994177d58a5778a8ba not found: ID does not exist" Dec 05 12:52:51.979278 master-0 kubenswrapper[29936]: I1205 12:52:51.979172 29936 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:52.014868 master-0 kubenswrapper[29936]: I1205 12:52:52.014798 29936 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2901f05d-e5c5-4b8f-85e3-e5e2c1e62076" Dec 05 12:52:52.014868 master-0 kubenswrapper[29936]: I1205 12:52:52.014843 29936 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2901f05d-e5c5-4b8f-85e3-e5e2c1e62076" Dec 05 12:52:52.019077 master-0 kubenswrapper[29936]: I1205 12:52:52.019026 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:52:52.022239 master-0 kubenswrapper[29936]: I1205 12:52:52.022138 29936 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="a369cadf0161d66f2936cdea3ded59b7" podUID="4e304324-a423-4119-9715-632e46595575" Dec 05 12:52:53.028660 master-0 kubenswrapper[29936]: I1205 12:52:53.028500 29936 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2901f05d-e5c5-4b8f-85e3-e5e2c1e62076" Dec 05 12:52:53.028660 master-0 kubenswrapper[29936]: I1205 12:52:53.028609 29936 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2901f05d-e5c5-4b8f-85e3-e5e2c1e62076" Dec 05 12:52:53.206730 master-0 kubenswrapper[29936]: I1205 12:52:53.206655 29936 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="a369cadf0161d66f2936cdea3ded59b7" podUID="4e304324-a423-4119-9715-632e46595575" Dec 05 12:52:55.152427 master-0 kubenswrapper[29936]: I1205 12:52:55.152355 29936 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure 
output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Dec 05 12:52:55.153269 master-0 kubenswrapper[29936]: I1205 12:52:55.153223 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 05 12:52:55.401351 master-0 kubenswrapper[29936]: I1205 12:52:55.401288 29936 patch_prober.go:28] interesting pod/console-c9b8d8fb9-7pxzk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.102:8443/health\": dial tcp 10.128.0.102:8443: connect: connection refused" start-of-body= Dec 05 12:52:55.401351 master-0 kubenswrapper[29936]: I1205 12:52:55.401355 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c9b8d8fb9-7pxzk" podUID="ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" containerName="console" probeResult="failure" output="Get \"https://10.128.0.102:8443/health\": dial tcp 10.128.0.102:8443: connect: connection refused" Dec 05 12:52:55.468140 master-0 kubenswrapper[29936]: I1205 12:52:55.468057 29936 patch_prober.go:28] interesting pod/console-74ffd5f75f-slrkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Dec 05 12:52:55.468471 master-0 kubenswrapper[29936]: I1205 12:52:55.468142 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-74ffd5f75f-slrkr" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Dec 05 12:53:00.081989 master-0 kubenswrapper[29936]: I1205 12:53:00.081861 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Dec 05 12:53:00.091630 master-0 kubenswrapper[29936]: I1205 12:53:00.091571 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-m6vhr" Dec 05 12:53:00.140907 master-0 kubenswrapper[29936]: I1205 12:53:00.140816 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 12:53:01.099766 master-0 kubenswrapper[29936]: I1205 12:53:01.099693 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 12:53:01.483760 master-0 kubenswrapper[29936]: I1205 12:53:01.483699 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 05 12:53:01.794949 master-0 kubenswrapper[29936]: I1205 12:53:01.794762 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-hcp7n" Dec 05 12:53:01.874354 master-0 kubenswrapper[29936]: I1205 12:53:01.874115 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 12:53:01.878499 master-0 kubenswrapper[29936]: I1205 12:53:01.878423 29936 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 12:53:01.960399 master-0 kubenswrapper[29936]: I1205 12:53:01.960288 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-cz7x2" Dec 05 12:53:02.044905 master-0 kubenswrapper[29936]: I1205 12:53:02.044829 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 12:53:02.051029 master-0 kubenswrapper[29936]: I1205 12:53:02.050912 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 12:53:02.111044 master-0 kubenswrapper[29936]: I1205 12:53:02.110995 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 05 12:53:02.173135 master-0 kubenswrapper[29936]: I1205 12:53:02.173002 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 12:53:02.292084 master-0 kubenswrapper[29936]: I1205 12:53:02.292033 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-sxj7j" Dec 05 12:53:02.441415 master-0 kubenswrapper[29936]: I1205 12:53:02.441350 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Dec 05 12:53:02.555116 master-0 kubenswrapper[29936]: I1205 12:53:02.555047 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 12:53:02.617901 master-0 kubenswrapper[29936]: I1205 12:53:02.617789 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-4gpnc" Dec 05 12:53:02.656863 master-0 kubenswrapper[29936]: I1205 12:53:02.656791 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 12:53:02.668625 master-0 kubenswrapper[29936]: I1205 12:53:02.668535 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 12:53:02.786314 master-0 kubenswrapper[29936]: I1205 12:53:02.784642 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 12:53:02.884066 master-0 kubenswrapper[29936]: I1205 12:53:02.883986 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 12:53:03.060732 master-0 kubenswrapper[29936]: I1205 12:53:03.060597 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 12:53:03.169428 master-0 kubenswrapper[29936]: I1205 12:53:03.169280 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 12:53:03.294280 master-0 kubenswrapper[29936]: I1205 12:53:03.294239 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 12:53:03.579908 master-0 kubenswrapper[29936]: I1205 12:53:03.579758 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 12:53:03.744724 master-0 
kubenswrapper[29936]: I1205 12:53:03.744633 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 12:53:03.852539 master-0 kubenswrapper[29936]: I1205 12:53:03.852384 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 12:53:03.910797 master-0 kubenswrapper[29936]: I1205 12:53:03.910720 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 12:53:04.004211 master-0 kubenswrapper[29936]: I1205 12:53:04.004133 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 12:53:04.092657 master-0 kubenswrapper[29936]: I1205 12:53:04.092569 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 12:53:04.097219 master-0 kubenswrapper[29936]: I1205 12:53:04.097146 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 12:53:04.103651 master-0 kubenswrapper[29936]: I1205 12:53:04.103470 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 12:53:04.170070 master-0 kubenswrapper[29936]: I1205 12:53:04.169989 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 12:53:04.171717 master-0 kubenswrapper[29936]: I1205 12:53:04.171676 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-xzgbv" Dec 05 12:53:04.186098 master-0 kubenswrapper[29936]: I1205 12:53:04.186010 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Dec 05 12:53:04.198679 master-0 kubenswrapper[29936]: I1205 12:53:04.198626 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Dec 05 12:53:04.260641 master-0 kubenswrapper[29936]: I1205 12:53:04.260569 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 12:53:04.290623 master-0 kubenswrapper[29936]: I1205 12:53:04.290530 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 12:53:04.304575 master-0 kubenswrapper[29936]: I1205 12:53:04.304494 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 12:53:04.328685 master-0 kubenswrapper[29936]: I1205 12:53:04.328559 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 12:53:04.406599 master-0 kubenswrapper[29936]: I1205 12:53:04.406385 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 12:53:04.450544 master-0 kubenswrapper[29936]: I1205 12:53:04.450458 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-bnqtr" Dec 05 12:53:04.487659 master-0 kubenswrapper[29936]: I1205 12:53:04.487575 29936 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 12:53:04.505488 master-0 kubenswrapper[29936]: I1205 12:53:04.505438 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Dec 05 12:53:04.522934 master-0 kubenswrapper[29936]: I1205 12:53:04.522861 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 12:53:04.583512 master-0 kubenswrapper[29936]: I1205 12:53:04.583463 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 12:53:04.590165 master-0 kubenswrapper[29936]: I1205 12:53:04.590132 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-zknmp" Dec 05 12:53:04.629624 master-0 kubenswrapper[29936]: I1205 12:53:04.629556 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Dec 05 12:53:04.666915 master-0 kubenswrapper[29936]: I1205 12:53:04.666755 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 12:53:04.667149 master-0 kubenswrapper[29936]: I1205 12:53:04.667018 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-z9sgn" Dec 05 12:53:04.758388 master-0 kubenswrapper[29936]: I1205 12:53:04.758318 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Dec 05 12:53:04.797154 master-0 kubenswrapper[29936]: I1205 12:53:04.797064 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 12:53:04.802445 master-0 kubenswrapper[29936]: I1205 12:53:04.802365 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Dec 05 12:53:04.820820 master-0 kubenswrapper[29936]: I1205 12:53:04.820733 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-rbhdx" Dec 05 12:53:04.825635 master-0 kubenswrapper[29936]: I1205 12:53:04.825522 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 12:53:04.861364 master-0 kubenswrapper[29936]: I1205 12:53:04.861271 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 12:53:04.878339 master-0 kubenswrapper[29936]: I1205 12:53:04.878208 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 12:53:04.900018 master-0 kubenswrapper[29936]: I1205 12:53:04.899772 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 12:53:04.912737 master-0 kubenswrapper[29936]: I1205 12:53:04.912647 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 12:53:04.954288 master-0 kubenswrapper[29936]: I1205 12:53:04.954157 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Dec 05 12:53:05.023917 master-0 
kubenswrapper[29936]: I1205 12:53:05.023855 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-kj2kk" Dec 05 12:53:05.067171 master-0 kubenswrapper[29936]: I1205 12:53:05.067084 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 12:53:05.077738 master-0 kubenswrapper[29936]: I1205 12:53:05.077678 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Dec 05 12:53:05.152886 master-0 kubenswrapper[29936]: I1205 12:53:05.152806 29936 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Dec 05 12:53:05.153214 master-0 kubenswrapper[29936]: I1205 12:53:05.152902 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 05 12:53:05.153214 master-0 kubenswrapper[29936]: I1205 12:53:05.152989 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:53:05.153930 master-0 kubenswrapper[29936]: I1205 12:53:05.153884 29936 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"67d67d3e89e13fa99e16d8850ae9285f71eeb433f6a0cc9257e00f3e497935e9"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 05 12:53:05.154033 master-0 kubenswrapper[29936]: I1205 12:53:05.153993 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager" containerID="cri-o://67d67d3e89e13fa99e16d8850ae9285f71eeb433f6a0cc9257e00f3e497935e9" gracePeriod=30 Dec 05 12:53:05.241890 master-0 kubenswrapper[29936]: I1205 12:53:05.241808 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Dec 05 12:53:05.314800 master-0 kubenswrapper[29936]: I1205 12:53:05.314692 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 12:53:05.360059 master-0 kubenswrapper[29936]: I1205 12:53:05.359942 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 12:53:05.388110 master-0 kubenswrapper[29936]: I1205 12:53:05.387946 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Dec 05 12:53:05.403597 master-0 kubenswrapper[29936]: I1205 12:53:05.403485 29936 patch_prober.go:28] interesting pod/console-c9b8d8fb9-7pxzk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.102:8443/health\": dial tcp 10.128.0.102:8443: connect: 
connection refused" start-of-body= Dec 05 12:53:05.403597 master-0 kubenswrapper[29936]: I1205 12:53:05.403576 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-c9b8d8fb9-7pxzk" podUID="ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" containerName="console" probeResult="failure" output="Get \"https://10.128.0.102:8443/health\": dial tcp 10.128.0.102:8443: connect: connection refused" Dec 05 12:53:05.467392 master-0 kubenswrapper[29936]: I1205 12:53:05.467314 29936 patch_prober.go:28] interesting pod/console-74ffd5f75f-slrkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Dec 05 12:53:05.467782 master-0 kubenswrapper[29936]: I1205 12:53:05.467440 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-74ffd5f75f-slrkr" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Dec 05 12:53:05.576741 master-0 kubenswrapper[29936]: I1205 12:53:05.576563 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-jdkkl" Dec 05 12:53:05.594265 master-0 kubenswrapper[29936]: I1205 12:53:05.593942 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-wqbtd" Dec 05 12:53:05.666840 master-0 kubenswrapper[29936]: I1205 12:53:05.666747 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 12:53:05.705924 master-0 kubenswrapper[29936]: I1205 12:53:05.705784 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 05 12:53:05.815070 master-0 kubenswrapper[29936]: I1205 12:53:05.814895 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-br7gx" Dec 05 12:53:05.945718 master-0 kubenswrapper[29936]: I1205 12:53:05.945600 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 12:53:06.019224 master-0 kubenswrapper[29936]: I1205 12:53:06.019116 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 12:53:06.037714 master-0 kubenswrapper[29936]: I1205 12:53:06.037611 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 12:53:06.042086 master-0 kubenswrapper[29936]: I1205 12:53:06.042028 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 12:53:06.042257 master-0 kubenswrapper[29936]: I1205 12:53:06.042131 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 12:53:06.062712 master-0 kubenswrapper[29936]: I1205 12:53:06.062639 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 12:53:06.064904 master-0 kubenswrapper[29936]: I1205 12:53:06.064854 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Dec 05 12:53:06.110191 master-0 
kubenswrapper[29936]: I1205 12:53:06.109992 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 12:53:06.111316 master-0 kubenswrapper[29936]: I1205 12:53:06.111259 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 12:53:06.112621 master-0 kubenswrapper[29936]: I1205 12:53:06.112585 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 12:53:06.128531 master-0 kubenswrapper[29936]: I1205 12:53:06.128437 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Dec 05 12:53:06.159322 master-0 kubenswrapper[29936]: I1205 12:53:06.159244 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 12:53:06.295426 master-0 kubenswrapper[29936]: I1205 12:53:06.295292 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 12:53:06.325033 master-0 kubenswrapper[29936]: I1205 12:53:06.324961 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 12:53:06.338997 master-0 kubenswrapper[29936]: I1205 12:53:06.338944 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 12:53:06.398097 master-0 kubenswrapper[29936]: I1205 12:53:06.398040 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Dec 05 12:53:06.423309 master-0 kubenswrapper[29936]: I1205 12:53:06.422862 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 12:53:06.464842 master-0 kubenswrapper[29936]: I1205 12:53:06.464763 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 12:53:06.485963 master-0 kubenswrapper[29936]: I1205 12:53:06.485908 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 05 12:53:06.528538 master-0 kubenswrapper[29936]: I1205 12:53:06.528428 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 12:53:06.538738 master-0 kubenswrapper[29936]: I1205 12:53:06.538693 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-285qn" Dec 05 12:53:06.554840 master-0 kubenswrapper[29936]: I1205 12:53:06.554730 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 12:53:06.631117 master-0 kubenswrapper[29936]: I1205 12:53:06.630958 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 12:53:06.647170 master-0 kubenswrapper[29936]: I1205 12:53:06.647117 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-6t5rm" Dec 05 12:53:06.698172 master-0 kubenswrapper[29936]: I1205 12:53:06.698101 29936 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 12:53:06.701854 master-0 kubenswrapper[29936]: I1205 
12:53:06.701537 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=38.701516898 podStartE2EDuration="38.701516898s" podCreationTimestamp="2025-12-05 12:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:52:50.565557168 +0000 UTC m=+167.697636879" watchObservedRunningTime="2025-12-05 12:53:06.701516898 +0000 UTC m=+183.833596579" Dec 05 12:53:06.704742 master-0 kubenswrapper[29936]: I1205 12:53:06.704714 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7495f49968-4tq6k","openshift-kube-apiserver/kube-apiserver-master-0"] Dec 05 12:53:06.704833 master-0 kubenswrapper[29936]: I1205 12:53:06.704769 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 05 12:53:06.716238 master-0 kubenswrapper[29936]: I1205 12:53:06.715992 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 05 12:53:06.718413 master-0 kubenswrapper[29936]: I1205 12:53:06.718376 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 12:53:06.765764 master-0 kubenswrapper[29936]: I1205 12:53:06.765651 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=15.765631918 podStartE2EDuration="15.765631918s" podCreationTimestamp="2025-12-05 12:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:53:06.76353061 +0000 UTC m=+183.895610311" watchObservedRunningTime="2025-12-05 12:53:06.765631918 +0000 UTC m=+183.897711599" Dec 05 12:53:06.798161 master-0 kubenswrapper[29936]: I1205 12:53:06.798083 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 12:53:06.823444 master-0 kubenswrapper[29936]: I1205 12:53:06.823310 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 12:53:06.833488 master-0 kubenswrapper[29936]: I1205 12:53:06.833440 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 12:53:06.848349 master-0 kubenswrapper[29936]: I1205 12:53:06.848297 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 12:53:06.881386 master-0 kubenswrapper[29936]: I1205 12:53:06.881316 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 12:53:06.924493 master-0 kubenswrapper[29936]: I1205 12:53:06.924418 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 05 12:53:06.985499 master-0 kubenswrapper[29936]: I1205 12:53:06.985382 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 12:53:06.992753 master-0 kubenswrapper[29936]: I1205 12:53:06.992705 29936 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-insights"/"openshift-service-ca.crt" Dec 05 12:53:06.995660 master-0 kubenswrapper[29936]: I1205 12:53:06.995596 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Dec 05 12:53:07.067009 master-0 kubenswrapper[29936]: I1205 12:53:07.066943 29936 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 12:53:07.113798 master-0 kubenswrapper[29936]: I1205 12:53:07.113626 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Dec 05 12:53:07.190711 master-0 kubenswrapper[29936]: I1205 12:53:07.190645 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 12:53:07.198333 master-0 kubenswrapper[29936]: I1205 12:53:07.197981 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" path="/var/lib/kubelet/pods/3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5/volumes" Dec 05 12:53:07.233800 master-0 kubenswrapper[29936]: I1205 12:53:07.233701 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 12:53:07.274109 master-0 kubenswrapper[29936]: I1205 12:53:07.273993 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-mpkz5" Dec 05 12:53:07.288354 master-0 kubenswrapper[29936]: I1205 12:53:07.288248 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 12:53:07.298280 master-0 kubenswrapper[29936]: I1205 12:53:07.298199 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 12:53:07.324516 master-0 kubenswrapper[29936]: I1205 12:53:07.324465 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 12:53:07.379370 master-0 kubenswrapper[29936]: I1205 12:53:07.379249 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 12:53:07.428252 master-0 kubenswrapper[29936]: I1205 12:53:07.428087 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 12:53:07.429383 master-0 kubenswrapper[29936]: I1205 12:53:07.429327 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 05 12:53:07.471334 master-0 kubenswrapper[29936]: I1205 12:53:07.471225 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-s6pqp" Dec 05 12:53:07.480297 master-0 kubenswrapper[29936]: I1205 12:53:07.480174 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-8mxmd" Dec 05 12:53:07.494616 master-0 kubenswrapper[29936]: I1205 12:53:07.494495 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Dec 05 12:53:07.499406 master-0 kubenswrapper[29936]: I1205 12:53:07.499344 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 
12:53:07.535502 master-0 kubenswrapper[29936]: I1205 12:53:07.535429 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 12:53:07.621042 master-0 kubenswrapper[29936]: I1205 12:53:07.620935 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 12:53:07.631701 master-0 kubenswrapper[29936]: I1205 12:53:07.631510 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 12:53:07.674314 master-0 kubenswrapper[29936]: I1205 12:53:07.674231 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Dec 05 12:53:07.688303 master-0 kubenswrapper[29936]: I1205 12:53:07.688232 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 12:53:07.690610 master-0 kubenswrapper[29936]: I1205 12:53:07.690535 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 12:53:07.698895 master-0 kubenswrapper[29936]: I1205 12:53:07.698823 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 12:53:07.792162 master-0 kubenswrapper[29936]: I1205 12:53:07.792058 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-94n4t" Dec 05 12:53:07.879225 master-0 kubenswrapper[29936]: I1205 12:53:07.879149 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 12:53:07.886018 master-0 kubenswrapper[29936]: I1205 12:53:07.885870 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 12:53:07.927535 master-0 kubenswrapper[29936]: I1205 12:53:07.927445 29936 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 12:53:07.991130 master-0 kubenswrapper[29936]: I1205 12:53:07.991042 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 12:53:08.146163 master-0 kubenswrapper[29936]: I1205 12:53:08.146013 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 12:53:08.286809 master-0 kubenswrapper[29936]: I1205 12:53:08.286738 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Dec 05 12:53:08.287472 master-0 kubenswrapper[29936]: I1205 12:53:08.287417 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 12:53:08.288114 master-0 kubenswrapper[29936]: I1205 12:53:08.288084 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Dec 05 12:53:08.319008 master-0 kubenswrapper[29936]: I1205 12:53:08.318917 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 12:53:08.332497 master-0 kubenswrapper[29936]: I1205 12:53:08.332414 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 
12:53:08.356997 master-0 kubenswrapper[29936]: I1205 12:53:08.356890 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 12:53:08.377411 master-0 kubenswrapper[29936]: I1205 12:53:08.377343 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 12:53:08.396979 master-0 kubenswrapper[29936]: I1205 12:53:08.396824 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 12:53:08.430606 master-0 kubenswrapper[29936]: I1205 12:53:08.430534 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Dec 05 12:53:08.535802 master-0 kubenswrapper[29936]: I1205 12:53:08.535699 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 12:53:08.701214 master-0 kubenswrapper[29936]: I1205 12:53:08.701114 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Dec 05 12:53:08.714221 master-0 kubenswrapper[29936]: I1205 12:53:08.714115 29936 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 12:53:08.730362 master-0 kubenswrapper[29936]: I1205 12:53:08.730251 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Dec 05 12:53:08.776788 master-0 kubenswrapper[29936]: I1205 12:53:08.776686 29936 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 12:53:08.861133 master-0 kubenswrapper[29936]: I1205 12:53:08.861067 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 05 12:53:08.900643 master-0 kubenswrapper[29936]: I1205 12:53:08.900549 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 12:53:08.994270 master-0 kubenswrapper[29936]: I1205 12:53:08.993993 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 12:53:09.021506 master-0 kubenswrapper[29936]: I1205 12:53:09.021402 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 12:53:09.046492 master-0 kubenswrapper[29936]: I1205 12:53:09.046393 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 12:53:09.053803 master-0 kubenswrapper[29936]: I1205 12:53:09.053737 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"tuned-dockercfg-5khzw" Dec 05 12:53:09.061256 master-0 kubenswrapper[29936]: I1205 12:53:09.061199 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 05 12:53:09.141923 master-0 kubenswrapper[29936]: I1205 12:53:09.141851 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 12:53:09.152594 master-0 kubenswrapper[29936]: I1205 12:53:09.152512 29936 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 12:53:09.156207 master-0 kubenswrapper[29936]: I1205 12:53:09.156099 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 12:53:09.165527 master-0 kubenswrapper[29936]: I1205 12:53:09.165453 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 12:53:09.211113 master-0 kubenswrapper[29936]: I1205 12:53:09.211046 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 05 12:53:09.268315 master-0 kubenswrapper[29936]: I1205 12:53:09.268039 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-e63soeg91on8p" Dec 05 12:53:09.338501 master-0 kubenswrapper[29936]: I1205 12:53:09.338398 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 12:53:09.393052 master-0 kubenswrapper[29936]: I1205 12:53:09.392996 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 12:53:09.395875 master-0 kubenswrapper[29936]: I1205 12:53:09.395812 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Dec 05 12:53:09.487833 master-0 kubenswrapper[29936]: I1205 12:53:09.487759 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 12:53:09.498513 master-0 kubenswrapper[29936]: I1205 12:53:09.498456 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 12:53:09.556248 master-0 kubenswrapper[29936]: I1205 12:53:09.556018 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 12:53:09.625101 master-0 kubenswrapper[29936]: I1205 12:53:09.625033 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 05 12:53:09.657711 master-0 kubenswrapper[29936]: I1205 12:53:09.657421 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 12:53:09.684604 master-0 kubenswrapper[29936]: I1205 12:53:09.684518 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 12:53:09.700000 master-0 kubenswrapper[29936]: I1205 12:53:09.699930 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 12:53:09.816463 master-0 kubenswrapper[29936]: I1205 12:53:09.816294 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 12:53:09.943908 master-0 kubenswrapper[29936]: I1205 12:53:09.943847 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-thjz4" Dec 05 12:53:09.977948 master-0 kubenswrapper[29936]: I1205 12:53:09.977881 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 05 12:53:09.980703 master-0 kubenswrapper[29936]: I1205 
12:53:09.980609 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 12:53:10.004163 master-0 kubenswrapper[29936]: I1205 12:53:10.004048 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 12:53:10.033801 master-0 kubenswrapper[29936]: I1205 12:53:10.033709 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 12:53:10.041083 master-0 kubenswrapper[29936]: I1205 12:53:10.041014 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Dec 05 12:53:10.055351 master-0 kubenswrapper[29936]: I1205 12:53:10.055292 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 12:53:10.073544 master-0 kubenswrapper[29936]: I1205 12:53:10.073401 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 12:53:10.122809 master-0 kubenswrapper[29936]: I1205 12:53:10.122729 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 12:53:10.140390 master-0 kubenswrapper[29936]: I1205 12:53:10.140318 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Dec 05 12:53:10.231112 master-0 kubenswrapper[29936]: I1205 12:53:10.231050 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 12:53:10.287471 master-0 kubenswrapper[29936]: I1205 12:53:10.287392 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 12:53:10.377157 master-0 kubenswrapper[29936]: I1205 12:53:10.376984 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 05 12:53:10.400491 master-0 kubenswrapper[29936]: I1205 12:53:10.400407 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 12:53:10.436013 master-0 kubenswrapper[29936]: I1205 12:53:10.435945 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 12:53:10.461404 master-0 kubenswrapper[29936]: I1205 12:53:10.461291 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 12:53:10.461861 master-0 kubenswrapper[29936]: I1205 12:53:10.461792 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 12:53:10.500059 master-0 kubenswrapper[29936]: I1205 12:53:10.500001 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 12:53:10.592397 master-0 kubenswrapper[29936]: I1205 12:53:10.592321 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 12:53:10.593986 master-0 kubenswrapper[29936]: I1205 12:53:10.593919 29936 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 05 12:53:10.623768 master-0 kubenswrapper[29936]: I1205 12:53:10.623719 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 12:53:10.682540 master-0 kubenswrapper[29936]: I1205 12:53:10.682397 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 05 12:53:10.685226 master-0 kubenswrapper[29936]: I1205 12:53:10.685148 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 12:53:10.738594 master-0 kubenswrapper[29936]: I1205 12:53:10.738506 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 12:53:10.837937 master-0 kubenswrapper[29936]: I1205 12:53:10.837874 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Dec 05 12:53:10.852381 master-0 kubenswrapper[29936]: I1205 12:53:10.852284 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 12:53:10.952534 master-0 kubenswrapper[29936]: I1205 12:53:10.952434 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-5tl2j" Dec 05 12:53:10.962006 master-0 kubenswrapper[29936]: I1205 12:53:10.961906 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-2vcf7" Dec 05 12:53:11.044862 master-0 kubenswrapper[29936]: I1205 12:53:11.044779 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 12:53:11.229578 master-0 kubenswrapper[29936]: I1205 12:53:11.229412 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-xvwgq" Dec 05 12:53:11.231318 master-0 kubenswrapper[29936]: I1205 12:53:11.231290 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 05 12:53:11.373516 master-0 kubenswrapper[29936]: I1205 12:53:11.373434 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-jbzfz" Dec 05 12:53:11.386068 master-0 kubenswrapper[29936]: I1205 12:53:11.386017 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 12:53:11.397044 master-0 kubenswrapper[29936]: I1205 12:53:11.396941 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 12:53:11.462807 master-0 kubenswrapper[29936]: I1205 12:53:11.462549 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 12:53:11.477062 master-0 kubenswrapper[29936]: I1205 12:53:11.476996 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 12:53:11.479835 master-0 kubenswrapper[29936]: I1205 12:53:11.479721 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 12:53:11.485857 master-0 kubenswrapper[29936]: I1205 
12:53:11.485808 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 12:53:11.509292 master-0 kubenswrapper[29936]: I1205 12:53:11.509155 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 12:53:11.542135 master-0 kubenswrapper[29936]: I1205 12:53:11.542061 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Dec 05 12:53:11.550687 master-0 kubenswrapper[29936]: I1205 12:53:11.550612 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 12:53:11.604475 master-0 kubenswrapper[29936]: I1205 12:53:11.604408 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 12:53:11.655257 master-0 kubenswrapper[29936]: I1205 12:53:11.655161 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 12:53:11.664278 master-0 kubenswrapper[29936]: I1205 12:53:11.664222 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 12:53:11.734846 master-0 kubenswrapper[29936]: I1205 12:53:11.734636 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 12:53:11.775150 master-0 kubenswrapper[29936]: I1205 12:53:11.774644 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 12:53:11.794824 master-0 kubenswrapper[29936]: I1205 12:53:11.794754 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 12:53:11.847451 master-0 kubenswrapper[29936]: I1205 12:53:11.847364 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-nz5rx" Dec 05 12:53:11.849078 master-0 kubenswrapper[29936]: I1205 12:53:11.849050 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 12:53:11.878036 master-0 kubenswrapper[29936]: I1205 12:53:11.877936 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-ljblm" Dec 05 12:53:11.892850 master-0 kubenswrapper[29936]: I1205 12:53:11.892778 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 12:53:11.962308 master-0 kubenswrapper[29936]: I1205 12:53:11.962226 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-4cwgg" Dec 05 12:53:12.009097 master-0 kubenswrapper[29936]: I1205 12:53:12.008919 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-fb2xd" Dec 05 12:53:12.229834 master-0 kubenswrapper[29936]: I1205 12:53:12.229769 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 12:53:12.248803 master-0 kubenswrapper[29936]: I1205 12:53:12.248718 29936 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 12:53:12.405712 master-0 kubenswrapper[29936]: I1205 12:53:12.405534 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-tjfgr" Dec 05 12:53:12.412872 master-0 kubenswrapper[29936]: I1205 12:53:12.412820 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 12:53:12.439681 master-0 kubenswrapper[29936]: I1205 12:53:12.439613 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 12:53:12.455942 master-0 kubenswrapper[29936]: I1205 12:53:12.455879 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 12:53:12.659376 master-0 kubenswrapper[29936]: I1205 12:53:12.659210 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 12:53:12.671938 master-0 kubenswrapper[29936]: I1205 12:53:12.671858 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-j8gcn" Dec 05 12:53:12.762655 master-0 kubenswrapper[29936]: I1205 12:53:12.762612 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 12:53:12.779587 master-0 kubenswrapper[29936]: I1205 12:53:12.779556 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 12:53:12.833205 master-0 kubenswrapper[29936]: I1205 12:53:12.833132 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 12:53:13.000443 master-0 kubenswrapper[29936]: I1205 12:53:13.000365 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 12:53:13.008200 master-0 kubenswrapper[29936]: I1205 12:53:13.008147 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 12:53:13.021703 master-0 kubenswrapper[29936]: I1205 12:53:13.021667 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 12:53:13.029478 master-0 kubenswrapper[29936]: I1205 12:53:13.029449 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 12:53:13.065884 master-0 kubenswrapper[29936]: I1205 12:53:13.065838 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Dec 05 12:53:13.083563 master-0 kubenswrapper[29936]: I1205 12:53:13.083515 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 12:53:13.242737 master-0 kubenswrapper[29936]: I1205 12:53:13.242667 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Dec 05 12:53:13.243420 master-0 kubenswrapper[29936]: I1205 12:53:13.243358 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Dec 05 12:53:13.251339 master-0 kubenswrapper[29936]: I1205 
12:53:13.251225 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 12:53:13.263058 master-0 kubenswrapper[29936]: I1205 12:53:13.263001 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Dec 05 12:53:13.341723 master-0 kubenswrapper[29936]: I1205 12:53:13.341641 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 12:53:13.381651 master-0 kubenswrapper[29936]: I1205 12:53:13.381588 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 12:53:13.463022 master-0 kubenswrapper[29936]: I1205 12:53:13.462928 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 12:53:13.499820 master-0 kubenswrapper[29936]: I1205 12:53:13.499731 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 12:53:13.519241 master-0 kubenswrapper[29936]: I1205 12:53:13.519027 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-fqrhd" Dec 05 12:53:13.725064 master-0 kubenswrapper[29936]: I1205 12:53:13.724826 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 12:53:13.748090 master-0 kubenswrapper[29936]: I1205 12:53:13.747969 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 05 12:53:13.775151 master-0 kubenswrapper[29936]: I1205 12:53:13.774991 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 12:53:13.810995 master-0 kubenswrapper[29936]: I1205 12:53:13.810910 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Dec 05 12:53:13.859805 master-0 kubenswrapper[29936]: I1205 12:53:13.859718 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-bcwx4" Dec 05 12:53:13.869614 master-0 kubenswrapper[29936]: I1205 12:53:13.869528 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 12:53:13.899681 master-0 kubenswrapper[29936]: I1205 12:53:13.899585 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 12:53:13.923584 master-0 kubenswrapper[29936]: I1205 12:53:13.923511 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 12:53:13.975767 master-0 kubenswrapper[29936]: I1205 12:53:13.975689 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 12:53:14.049210 master-0 kubenswrapper[29936]: I1205 12:53:14.049032 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-w5k28" Dec 05 12:53:14.075958 master-0 kubenswrapper[29936]: I1205 12:53:14.075867 29936 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"monitoring-plugin-cert" Dec 05 12:53:14.167323 master-0 kubenswrapper[29936]: I1205 12:53:14.166922 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Dec 05 12:53:14.228677 master-0 kubenswrapper[29936]: I1205 12:53:14.228583 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Dec 05 12:53:14.267401 master-0 kubenswrapper[29936]: I1205 12:53:14.267334 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 12:53:14.282300 master-0 kubenswrapper[29936]: I1205 12:53:14.282225 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 12:53:14.352817 master-0 kubenswrapper[29936]: I1205 12:53:14.352659 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 12:53:14.411120 master-0 kubenswrapper[29936]: I1205 12:53:14.411049 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Dec 05 12:53:14.498282 master-0 kubenswrapper[29936]: I1205 12:53:14.497984 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Dec 05 12:53:14.545246 master-0 kubenswrapper[29936]: I1205 12:53:14.545143 29936 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 05 12:53:14.545893 master-0 kubenswrapper[29936]: I1205 12:53:14.545814 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="b83ccd6fa217a93a2c607d0109896ef8" containerName="startup-monitor" containerID="cri-o://0093cbbdc8caaf843b6029f230d6447ee498121846e93f7e0c4a138c3421020f" gracePeriod=5 Dec 05 12:53:14.556046 master-0 kubenswrapper[29936]: I1205 12:53:14.555977 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 12:53:14.558715 master-0 kubenswrapper[29936]: I1205 12:53:14.558668 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 12:53:14.601410 master-0 kubenswrapper[29936]: I1205 12:53:14.601344 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 12:53:14.626432 master-0 kubenswrapper[29936]: I1205 12:53:14.626233 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 12:53:14.647923 master-0 kubenswrapper[29936]: I1205 12:53:14.647749 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-s9ftm" Dec 05 12:53:14.756991 master-0 kubenswrapper[29936]: I1205 12:53:14.756922 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 12:53:14.805991 master-0 kubenswrapper[29936]: I1205 12:53:14.805945 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 12:53:15.141206 master-0 kubenswrapper[29936]: I1205 
12:53:15.141139 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 12:53:15.142468 master-0 kubenswrapper[29936]: I1205 12:53:15.142439 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-bd2pn" Dec 05 12:53:15.185102 master-0 kubenswrapper[29936]: I1205 12:53:15.185041 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 12:53:15.250706 master-0 kubenswrapper[29936]: I1205 12:53:15.250654 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 12:53:15.272323 master-0 kubenswrapper[29936]: I1205 12:53:15.272301 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 12:53:15.278731 master-0 kubenswrapper[29936]: I1205 12:53:15.278712 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-wrm9q" Dec 05 12:53:15.295613 master-0 kubenswrapper[29936]: I1205 12:53:15.295596 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 12:53:15.403706 master-0 kubenswrapper[29936]: I1205 12:53:15.403551 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-zpvbv" Dec 05 12:53:15.408167 master-0 kubenswrapper[29936]: I1205 12:53:15.408125 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:53:15.412958 master-0 kubenswrapper[29936]: I1205 12:53:15.412896 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:53:15.471407 master-0 kubenswrapper[29936]: I1205 12:53:15.471352 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:53:15.475752 master-0 kubenswrapper[29936]: I1205 12:53:15.475710 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:53:15.561204 master-0 kubenswrapper[29936]: I1205 12:53:15.556576 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 12:53:15.714690 master-0 kubenswrapper[29936]: I1205 12:53:15.714631 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 12:53:15.897814 master-0 kubenswrapper[29936]: I1205 12:53:15.897756 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 05 12:53:16.153900 master-0 kubenswrapper[29936]: I1205 12:53:16.153777 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 12:53:16.809398 master-0 kubenswrapper[29936]: I1205 12:53:16.809336 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Dec 05 12:53:17.032853 master-0 kubenswrapper[29936]: I1205 12:53:17.032807 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" 
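Annotation (illustrative, not part of the captured journal): the preceding burst of reflector.go:368 "Caches populated" entries records the kubelet warming its per-namespace Secret and ConfigMap informer caches for the pods assigned to master-0. When triaging an excerpt like this it is usually more useful to summarize that burst than to read it entry by entry. The sketch below is a minimal parser written under the assumption that the journal text has been saved to a local file; the file name kubelet.log and the top-10 cutoff are arbitrary choices, and the script is not part of any kubelet or OpenShift tooling.

```python
import re
from collections import Counter
from pathlib import Path

# Matches the reflector entries seen above, e.g.
#   reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
PATTERN = re.compile(
    r'reflector\.go:\d+\] Caches populated for \*v1\.(?P<kind>\w+) '
    r'from object-"(?P<namespace>[^"]+)"/"(?P<name>[^"]+)"'
)

def summarize(journal_text: str) -> Counter:
    """Count populated informer caches per (resource kind, namespace) pair."""
    return Counter(
        (m.group("kind"), m.group("namespace"))
        for m in PATTERN.finditer(journal_text)
    )

if __name__ == "__main__":
    # "kubelet.log" is a hypothetical local copy of this journal excerpt.
    text = Path("kubelet.log").read_text(encoding="utf-8")
    for (kind, namespace), count in summarize(text).most_common(10):
        print(f"{kind:<10} {namespace:<45} {count}")
```

Because the capture wraps several journal entries onto one physical line, the pattern is applied with finditer over the whole text rather than per line, so wrapped entries are still counted.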
Dec 05 12:53:17.847480 master-0 kubenswrapper[29936]: I1205 12:53:17.847412 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 12:53:17.981487 master-0 kubenswrapper[29936]: I1205 12:53:17.981392 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 12:53:18.087415 master-0 kubenswrapper[29936]: I1205 12:53:18.087320 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 12:53:20.138698 master-0 kubenswrapper[29936]: I1205 12:53:20.138629 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_b83ccd6fa217a93a2c607d0109896ef8/startup-monitor/0.log" Dec 05 12:53:20.139425 master-0 kubenswrapper[29936]: I1205 12:53:20.138745 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:53:20.186779 master-0 kubenswrapper[29936]: I1205 12:53:20.186558 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-var-log\") pod \"b83ccd6fa217a93a2c607d0109896ef8\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " Dec 05 12:53:20.187039 master-0 kubenswrapper[29936]: I1205 12:53:20.186702 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-var-log" (OuterVolumeSpecName: "var-log") pod "b83ccd6fa217a93a2c607d0109896ef8" (UID: "b83ccd6fa217a93a2c607d0109896ef8"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:53:20.187039 master-0 kubenswrapper[29936]: I1205 12:53:20.186902 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-resource-dir\") pod \"b83ccd6fa217a93a2c607d0109896ef8\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " Dec 05 12:53:20.187039 master-0 kubenswrapper[29936]: I1205 12:53:20.186966 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-var-lock\") pod \"b83ccd6fa217a93a2c607d0109896ef8\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " Dec 05 12:53:20.187039 master-0 kubenswrapper[29936]: I1205 12:53:20.186995 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-pod-resource-dir\") pod \"b83ccd6fa217a93a2c607d0109896ef8\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " Dec 05 12:53:20.187340 master-0 kubenswrapper[29936]: I1205 12:53:20.187078 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-var-lock" (OuterVolumeSpecName: "var-lock") pod "b83ccd6fa217a93a2c607d0109896ef8" (UID: "b83ccd6fa217a93a2c607d0109896ef8"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:53:20.187340 master-0 kubenswrapper[29936]: I1205 12:53:20.187119 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "b83ccd6fa217a93a2c607d0109896ef8" (UID: "b83ccd6fa217a93a2c607d0109896ef8"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:53:20.187624 master-0 kubenswrapper[29936]: I1205 12:53:20.187594 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-manifests\") pod \"b83ccd6fa217a93a2c607d0109896ef8\" (UID: \"b83ccd6fa217a93a2c607d0109896ef8\") " Dec 05 12:53:20.187783 master-0 kubenswrapper[29936]: I1205 12:53:20.187739 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-manifests" (OuterVolumeSpecName: "manifests") pod "b83ccd6fa217a93a2c607d0109896ef8" (UID: "b83ccd6fa217a93a2c607d0109896ef8"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:53:20.188154 master-0 kubenswrapper[29936]: I1205 12:53:20.188112 29936 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-var-log\") on node \"master-0\" DevicePath \"\"" Dec 05 12:53:20.188154 master-0 kubenswrapper[29936]: I1205 12:53:20.188140 29936 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:53:20.188154 master-0 kubenswrapper[29936]: I1205 12:53:20.188154 29936 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:53:20.188393 master-0 kubenswrapper[29936]: I1205 12:53:20.188166 29936 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-manifests\") on node \"master-0\" DevicePath \"\"" Dec 05 12:53:20.193024 master-0 kubenswrapper[29936]: I1205 12:53:20.192969 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "b83ccd6fa217a93a2c607d0109896ef8" (UID: "b83ccd6fa217a93a2c607d0109896ef8"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:53:20.259102 master-0 kubenswrapper[29936]: I1205 12:53:20.259046 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_b83ccd6fa217a93a2c607d0109896ef8/startup-monitor/0.log" Dec 05 12:53:20.259102 master-0 kubenswrapper[29936]: I1205 12:53:20.259108 29936 generic.go:334] "Generic (PLEG): container finished" podID="b83ccd6fa217a93a2c607d0109896ef8" containerID="0093cbbdc8caaf843b6029f230d6447ee498121846e93f7e0c4a138c3421020f" exitCode=137 Dec 05 12:53:20.259644 master-0 kubenswrapper[29936]: I1205 12:53:20.259158 29936 scope.go:117] "RemoveContainer" containerID="0093cbbdc8caaf843b6029f230d6447ee498121846e93f7e0c4a138c3421020f" Dec 05 12:53:20.259644 master-0 kubenswrapper[29936]: I1205 12:53:20.259217 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 05 12:53:20.283293 master-0 kubenswrapper[29936]: I1205 12:53:20.283254 29936 scope.go:117] "RemoveContainer" containerID="0093cbbdc8caaf843b6029f230d6447ee498121846e93f7e0c4a138c3421020f" Dec 05 12:53:20.283865 master-0 kubenswrapper[29936]: E1205 12:53:20.283814 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0093cbbdc8caaf843b6029f230d6447ee498121846e93f7e0c4a138c3421020f\": container with ID starting with 0093cbbdc8caaf843b6029f230d6447ee498121846e93f7e0c4a138c3421020f not found: ID does not exist" containerID="0093cbbdc8caaf843b6029f230d6447ee498121846e93f7e0c4a138c3421020f" Dec 05 12:53:20.283947 master-0 kubenswrapper[29936]: I1205 12:53:20.283858 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0093cbbdc8caaf843b6029f230d6447ee498121846e93f7e0c4a138c3421020f"} err="failed to get container status \"0093cbbdc8caaf843b6029f230d6447ee498121846e93f7e0c4a138c3421020f\": rpc error: code = NotFound desc = could not find container \"0093cbbdc8caaf843b6029f230d6447ee498121846e93f7e0c4a138c3421020f\": container with ID starting with 0093cbbdc8caaf843b6029f230d6447ee498121846e93f7e0c4a138c3421020f not found: ID does not exist" Dec 05 12:53:20.289740 master-0 kubenswrapper[29936]: I1205 12:53:20.289709 29936 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/b83ccd6fa217a93a2c607d0109896ef8-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:53:21.203358 master-0 kubenswrapper[29936]: I1205 12:53:21.202987 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b83ccd6fa217a93a2c607d0109896ef8" path="/var/lib/kubelet/pods/b83ccd6fa217a93a2c607d0109896ef8/volumes" Dec 05 12:53:21.203992 master-0 kubenswrapper[29936]: I1205 12:53:21.203448 29936 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="" Dec 05 12:53:21.575161 master-0 kubenswrapper[29936]: I1205 12:53:21.575010 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 05 12:53:21.575161 master-0 kubenswrapper[29936]: I1205 12:53:21.575070 29936 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="695841ad-4c26-4ef3-b627-4c6340262a46" Dec 05 12:53:21.579859 master-0 
kubenswrapper[29936]: I1205 12:53:21.579791 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 05 12:53:21.579859 master-0 kubenswrapper[29936]: I1205 12:53:21.579848 29936 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="695841ad-4c26-4ef3-b627-4c6340262a46" Dec 05 12:53:36.398673 master-0 kubenswrapper[29936]: I1205 12:53:36.398606 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_610dc2015b38bc32879d55a7d39b2587/kube-controller-manager/1.log" Dec 05 12:53:36.402052 master-0 kubenswrapper[29936]: I1205 12:53:36.401977 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_610dc2015b38bc32879d55a7d39b2587/kube-controller-manager/0.log" Dec 05 12:53:36.402262 master-0 kubenswrapper[29936]: I1205 12:53:36.402087 29936 generic.go:334] "Generic (PLEG): container finished" podID="610dc2015b38bc32879d55a7d39b2587" containerID="67d67d3e89e13fa99e16d8850ae9285f71eeb433f6a0cc9257e00f3e497935e9" exitCode=137 Dec 05 12:53:36.402262 master-0 kubenswrapper[29936]: I1205 12:53:36.402157 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"610dc2015b38bc32879d55a7d39b2587","Type":"ContainerDied","Data":"67d67d3e89e13fa99e16d8850ae9285f71eeb433f6a0cc9257e00f3e497935e9"} Dec 05 12:53:36.402492 master-0 kubenswrapper[29936]: I1205 12:53:36.402279 29936 scope.go:117] "RemoveContainer" containerID="36ca02e8be7a0b8aad017a2fba35ee2e24e24ec30949f922f7c15439af96ed15" Dec 05 12:53:37.411626 master-0 kubenswrapper[29936]: I1205 12:53:37.411554 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_610dc2015b38bc32879d55a7d39b2587/kube-controller-manager/1.log" Dec 05 12:53:37.412533 master-0 kubenswrapper[29936]: I1205 12:53:37.412484 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"610dc2015b38bc32879d55a7d39b2587","Type":"ContainerStarted","Data":"e8917c3711bbe1adfa1dc4fa6befd9275e69d1180a7505f4e499700e3290a159"} Dec 05 12:53:39.346216 master-0 kubenswrapper[29936]: I1205 12:53:39.346115 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 12:53:45.152055 master-0 kubenswrapper[29936]: I1205 12:53:45.151937 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:53:45.153094 master-0 kubenswrapper[29936]: I1205 12:53:45.152431 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:53:45.158153 master-0 kubenswrapper[29936]: I1205 12:53:45.158106 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:53:54.266535 master-0 kubenswrapper[29936]: E1205 12:53:54.266452 29936 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-e63soeg91on8p: secret "metrics-server-e63soeg91on8p" not found Dec 05 12:53:54.267590 
master-0 kubenswrapper[29936]: E1205 12:53:54.266557 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:53:54.766532797 +0000 UTC m=+231.898612478 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : secret "metrics-server-e63soeg91on8p" not found Dec 05 12:53:54.775070 master-0 kubenswrapper[29936]: E1205 12:53:54.774979 29936 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-e63soeg91on8p: secret "metrics-server-e63soeg91on8p" not found Dec 05 12:53:54.775408 master-0 kubenswrapper[29936]: E1205 12:53:54.775106 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:53:55.775083678 +0000 UTC m=+232.907163359 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : secret "metrics-server-e63soeg91on8p" not found Dec 05 12:53:55.157651 master-0 kubenswrapper[29936]: I1205 12:53:55.157483 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:53:55.552914 master-0 kubenswrapper[29936]: I1205 12:53:55.550382 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 05 12:53:55.557034 master-0 kubenswrapper[29936]: E1205 12:53:55.554029 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f117368-9d0a-4c16-8d03-ffc83d250dd1" containerName="installer" Dec 05 12:53:55.557353 master-0 kubenswrapper[29936]: I1205 12:53:55.557319 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f117368-9d0a-4c16-8d03-ffc83d250dd1" containerName="installer" Dec 05 12:53:55.557506 master-0 kubenswrapper[29936]: E1205 12:53:55.557490 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" containerName="console" Dec 05 12:53:55.557592 master-0 kubenswrapper[29936]: I1205 12:53:55.557576 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" containerName="console" Dec 05 12:53:55.557706 master-0 kubenswrapper[29936]: E1205 12:53:55.557691 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83ccd6fa217a93a2c607d0109896ef8" containerName="startup-monitor" Dec 05 12:53:55.557789 master-0 kubenswrapper[29936]: I1205 12:53:55.557778 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83ccd6fa217a93a2c607d0109896ef8" containerName="startup-monitor" Dec 05 12:53:55.558444 master-0 kubenswrapper[29936]: I1205 12:53:55.558206 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f117368-9d0a-4c16-8d03-ffc83d250dd1" containerName="installer" Dec 05 12:53:55.558565 master-0 kubenswrapper[29936]: I1205 12:53:55.558548 29936 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b83ccd6fa217a93a2c607d0109896ef8" containerName="startup-monitor" Dec 05 12:53:55.558657 master-0 kubenswrapper[29936]: I1205 12:53:55.558643 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ec0c161-e3a9-4b81-a9e7-deba8be2b5f5" containerName="console" Dec 05 12:53:55.562003 master-0 kubenswrapper[29936]: I1205 12:53:55.561747 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.576243 master-0 kubenswrapper[29936]: I1205 12:53:55.565969 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5686c796dc-z46lg"] Dec 05 12:53:55.576243 master-0 kubenswrapper[29936]: I1205 12:53:55.568830 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.579452 master-0 kubenswrapper[29936]: I1205 12:53:55.579396 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Dec 05 12:53:55.583513 master-0 kubenswrapper[29936]: I1205 12:53:55.583475 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-46ue55kh1j5mf" Dec 05 12:53:55.584062 master-0 kubenswrapper[29936]: I1205 12:53:55.584045 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Dec 05 12:53:55.591209 master-0 kubenswrapper[29936]: I1205 12:53:55.590352 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Dec 05 12:53:55.591209 master-0 kubenswrapper[29936]: I1205 12:53:55.590756 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Dec 05 12:53:55.591209 master-0 kubenswrapper[29936]: I1205 12:53:55.590983 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Dec 05 12:53:55.591556 master-0 kubenswrapper[29936]: I1205 12:53:55.591404 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-thanos-querier-tls\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.591556 master-0 kubenswrapper[29936]: I1205 12:53:55.591432 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.591556 master-0 kubenswrapper[29936]: I1205 12:53:55.591462 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/34b86c49-87d9-4167-899e-d070aff1dc10-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.591556 master-0 kubenswrapper[29936]: I1205 12:53:55.591478 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-web-config\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.591556 master-0 kubenswrapper[29936]: I1205 12:53:55.591495 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.591556 master-0 kubenswrapper[29936]: I1205 12:53:55.591514 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.591556 master-0 kubenswrapper[29936]: I1205 12:53:55.591529 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42f3c259-d745-4060-9286-ac8999a49163-metrics-client-ca\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.591905 master-0 kubenswrapper[29936]: I1205 12:53:55.591866 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Dec 05 12:53:55.592162 master-0 kubenswrapper[29936]: I1205 12:53:55.592135 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Dec 05 12:53:55.592445 master-0 kubenswrapper[29936]: I1205 12:53:55.592399 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Dec 05 12:53:55.597212 master-0 kubenswrapper[29936]: I1205 12:53:55.591548 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.597212 master-0 kubenswrapper[29936]: I1205 12:53:55.592938 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/34b86c49-87d9-4167-899e-d070aff1dc10-config-out\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.597212 master-0 kubenswrapper[29936]: I1205 12:53:55.592972 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " 
pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.597212 master-0 kubenswrapper[29936]: I1205 12:53:55.593007 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.597212 master-0 kubenswrapper[29936]: I1205 12:53:55.593032 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.597212 master-0 kubenswrapper[29936]: I1205 12:53:55.593061 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.597212 master-0 kubenswrapper[29936]: I1205 12:53:55.593082 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-config\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.597212 master-0 kubenswrapper[29936]: I1205 12:53:55.593139 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn48d\" (UniqueName: \"kubernetes.io/projected/42f3c259-d745-4060-9286-ac8999a49163-kube-api-access-vn48d\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.597212 master-0 kubenswrapper[29936]: I1205 12:53:55.593159 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34b86c49-87d9-4167-899e-d070aff1dc10-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.604324 master-0 kubenswrapper[29936]: I1205 12:53:55.598064 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.604324 master-0 kubenswrapper[29936]: I1205 12:53:55.598242 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/34b86c49-87d9-4167-899e-d070aff1dc10-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.604324 master-0 
kubenswrapper[29936]: I1205 12:53:55.598309 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p48gk\" (UniqueName: \"kubernetes.io/projected/34b86c49-87d9-4167-899e-d070aff1dc10-kube-api-access-p48gk\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.604324 master-0 kubenswrapper[29936]: I1205 12:53:55.598406 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34b86c49-87d9-4167-899e-d070aff1dc10-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.604324 master-0 kubenswrapper[29936]: I1205 12:53:55.598440 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.604324 master-0 kubenswrapper[29936]: I1205 12:53:55.598465 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34b86c49-87d9-4167-899e-d070aff1dc10-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.604324 master-0 kubenswrapper[29936]: I1205 12:53:55.598507 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.604324 master-0 kubenswrapper[29936]: I1205 12:53:55.598543 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-grpc-tls\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.604324 master-0 kubenswrapper[29936]: I1205 12:53:55.598570 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34b86c49-87d9-4167-899e-d070aff1dc10-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.604324 master-0 kubenswrapper[29936]: I1205 12:53:55.598616 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/34b86c49-87d9-4167-899e-d070aff1dc10-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.604324 
master-0 kubenswrapper[29936]: I1205 12:53:55.600554 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Dec 05 12:53:55.604324 master-0 kubenswrapper[29936]: I1205 12:53:55.600716 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Dec 05 12:53:55.604324 master-0 kubenswrapper[29936]: I1205 12:53:55.600839 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Dec 05 12:53:55.605458 master-0 kubenswrapper[29936]: I1205 12:53:55.605413 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-33c1fv6k6k7io" Dec 05 12:53:55.605891 master-0 kubenswrapper[29936]: I1205 12:53:55.605873 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Dec 05 12:53:55.613205 master-0 kubenswrapper[29936]: I1205 12:53:55.607765 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Dec 05 12:53:55.618218 master-0 kubenswrapper[29936]: I1205 12:53:55.618000 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Dec 05 12:53:55.626320 master-0 kubenswrapper[29936]: I1205 12:53:55.623280 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-58cd4759dd-kpl86"] Dec 05 12:53:55.626320 master-0 kubenswrapper[29936]: I1205 12:53:55.625375 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.643234 master-0 kubenswrapper[29936]: I1205 12:53:55.635222 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-b6876557c-zz5pl"] Dec 05 12:53:55.643234 master-0 kubenswrapper[29936]: I1205 12:53:55.636451 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.649214 master-0 kubenswrapper[29936]: I1205 12:53:55.648852 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Dec 05 12:53:55.691842 master-0 kubenswrapper[29936]: I1205 12:53:55.687863 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Dec 05 12:53:55.691842 master-0 kubenswrapper[29936]: I1205 12:53:55.688531 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Dec 05 12:53:55.691842 master-0 kubenswrapper[29936]: I1205 12:53:55.689102 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Dec 05 12:53:55.692904 master-0 kubenswrapper[29936]: I1205 12:53:55.692561 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Dec 05 12:53:55.708143 master-0 kubenswrapper[29936]: I1205 12:53:55.702934 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-861e0jpjd3v15" Dec 05 12:53:55.731109 master-0 kubenswrapper[29936]: I1205 12:53:55.727476 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-6b44d597f9-szx4x"] Dec 05 12:53:55.765575 master-0 kubenswrapper[29936]: I1205 12:53:55.765496 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Dec 05 12:53:55.765974 master-0 kubenswrapper[29936]: I1205 12:53:55.765918 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6b44d597f9-szx4x" Dec 05 12:53:55.768334 master-0 kubenswrapper[29936]: I1205 12:53:55.768215 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-56c9b9fa8d9gs" Dec 05 12:53:55.768334 master-0 kubenswrapper[29936]: I1205 12:53:55.768267 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn48d\" (UniqueName: \"kubernetes.io/projected/42f3c259-d745-4060-9286-ac8999a49163-kube-api-access-vn48d\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.768460 master-0 kubenswrapper[29936]: I1205 12:53:55.768342 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34b86c49-87d9-4167-899e-d070aff1dc10-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.768460 master-0 kubenswrapper[29936]: I1205 12:53:55.768384 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.768460 master-0 kubenswrapper[29936]: I1205 12:53:55.768433 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/34b86c49-87d9-4167-899e-d070aff1dc10-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.768619 master-0 kubenswrapper[29936]: I1205 12:53:55.768498 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p48gk\" (UniqueName: \"kubernetes.io/projected/34b86c49-87d9-4167-899e-d070aff1dc10-kube-api-access-p48gk\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.768619 master-0 kubenswrapper[29936]: I1205 12:53:55.768572 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34b86c49-87d9-4167-899e-d070aff1dc10-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.768619 master-0 kubenswrapper[29936]: I1205 12:53:55.768600 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34b86c49-87d9-4167-899e-d070aff1dc10-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.769244 master-0 kubenswrapper[29936]: I1205 12:53:55.768627 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.769244 master-0 kubenswrapper[29936]: I1205 12:53:55.768673 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.769244 master-0 kubenswrapper[29936]: I1205 12:53:55.768698 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-grpc-tls\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.769244 master-0 kubenswrapper[29936]: I1205 12:53:55.768725 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34b86c49-87d9-4167-899e-d070aff1dc10-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.769244 master-0 kubenswrapper[29936]: I1205 12:53:55.768771 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/34b86c49-87d9-4167-899e-d070aff1dc10-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.769244 master-0 kubenswrapper[29936]: I1205 12:53:55.768808 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-thanos-querier-tls\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.769244 master-0 kubenswrapper[29936]: I1205 12:53:55.768843 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.771264 master-0 kubenswrapper[29936]: I1205 12:53:55.771219 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/34b86c49-87d9-4167-899e-d070aff1dc10-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.771336 master-0 kubenswrapper[29936]: I1205 12:53:55.771280 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-web-config\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.771336 master-0 
kubenswrapper[29936]: I1205 12:53:55.771317 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.771427 master-0 kubenswrapper[29936]: I1205 12:53:55.771348 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.771427 master-0 kubenswrapper[29936]: I1205 12:53:55.771377 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42f3c259-d745-4060-9286-ac8999a49163-metrics-client-ca\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.771427 master-0 kubenswrapper[29936]: I1205 12:53:55.771417 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/34b86c49-87d9-4167-899e-d070aff1dc10-config-out\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.771522 master-0 kubenswrapper[29936]: I1205 12:53:55.771446 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.771522 master-0 kubenswrapper[29936]: I1205 12:53:55.771482 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.771584 master-0 kubenswrapper[29936]: I1205 12:53:55.771524 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.771584 master-0 kubenswrapper[29936]: I1205 12:53:55.771556 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.771643 master-0 kubenswrapper[29936]: I1205 12:53:55.771596 29936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.771643 master-0 kubenswrapper[29936]: I1205 12:53:55.771627 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-config\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.772447 master-0 kubenswrapper[29936]: I1205 12:53:55.772393 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74ffd5f75f-slrkr"] Dec 05 12:53:55.772859 master-0 kubenswrapper[29936]: I1205 12:53:55.772803 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/34b86c49-87d9-4167-899e-d070aff1dc10-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.773620 master-0 kubenswrapper[29936]: E1205 12:53:55.773541 29936 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-tls: secret "prometheus-k8s-tls" not found Dec 05 12:53:55.773620 master-0 kubenswrapper[29936]: E1205 12:53:55.773613 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls podName:34b86c49-87d9-4167-899e-d070aff1dc10 nodeName:}" failed. No retries permitted until 2025-12-05 12:53:56.273591004 +0000 UTC m=+233.405670895 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-tls" (UniqueName: "kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls") pod "prometheus-k8s-0" (UID: "34b86c49-87d9-4167-899e-d070aff1dc10") : secret "prometheus-k8s-tls" not found Dec 05 12:53:55.775668 master-0 kubenswrapper[29936]: I1205 12:53:55.775421 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42f3c259-d745-4060-9286-ac8999a49163-metrics-client-ca\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.778067 master-0 kubenswrapper[29936]: I1205 12:53:55.778033 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-thanos-querier-tls\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.778164 master-0 kubenswrapper[29936]: E1205 12:53:55.778138 29936 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-thanos-sidecar-tls: secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 05 12:53:55.778233 master-0 kubenswrapper[29936]: E1205 12:53:55.778211 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls podName:34b86c49-87d9-4167-899e-d070aff1dc10 nodeName:}" failed. 
No retries permitted until 2025-12-05 12:53:56.278171611 +0000 UTC m=+233.410251292 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-thanos-sidecar-tls" (UniqueName: "kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls") pod "prometheus-k8s-0" (UID: "34b86c49-87d9-4167-899e-d070aff1dc10") : secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 05 12:53:55.778296 master-0 kubenswrapper[29936]: I1205 12:53:55.778241 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Dec 05 12:53:55.778915 master-0 kubenswrapper[29936]: I1205 12:53:55.778620 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 05 12:53:55.780352 master-0 kubenswrapper[29936]: I1205 12:53:55.779356 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34b86c49-87d9-4167-899e-d070aff1dc10-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.780352 master-0 kubenswrapper[29936]: I1205 12:53:55.780147 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.781240 master-0 kubenswrapper[29936]: I1205 12:53:55.781149 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34b86c49-87d9-4167-899e-d070aff1dc10-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.782599 master-0 kubenswrapper[29936]: I1205 12:53:55.781881 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/34b86c49-87d9-4167-899e-d070aff1dc10-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.783298 master-0 kubenswrapper[29936]: I1205 12:53:55.783221 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-config\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.783817 master-0 kubenswrapper[29936]: I1205 12:53:55.783776 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34b86c49-87d9-4167-899e-d070aff1dc10-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.784038 master-0 kubenswrapper[29936]: I1205 12:53:55.783900 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.784590 master-0 kubenswrapper[29936]: I1205 12:53:55.784028 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.784590 master-0 kubenswrapper[29936]: I1205 12:53:55.784099 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-web-config\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.784760 master-0 kubenswrapper[29936]: I1205 12:53:55.784728 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/34b86c49-87d9-4167-899e-d070aff1dc10-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.785000 master-0 kubenswrapper[29936]: I1205 12:53:55.784937 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.785257 master-0 kubenswrapper[29936]: I1205 12:53:55.785220 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34b86c49-87d9-4167-899e-d070aff1dc10-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.785541 master-0 kubenswrapper[29936]: I1205 12:53:55.785514 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.785795 master-0 kubenswrapper[29936]: I1205 12:53:55.785766 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.787230 master-0 kubenswrapper[29936]: I1205 12:53:55.787188 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5686c796dc-z46lg"] Dec 05 12:53:55.787832 master-0 kubenswrapper[29936]: I1205 12:53:55.787787 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.787892 master-0 kubenswrapper[29936]: I1205 12:53:55.787793 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/34b86c49-87d9-4167-899e-d070aff1dc10-config-out\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.788004 master-0 kubenswrapper[29936]: I1205 12:53:55.787945 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/42f3c259-d745-4060-9286-ac8999a49163-secret-grpc-tls\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.788214 master-0 kubenswrapper[29936]: I1205 12:53:55.788149 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.793854 master-0 kubenswrapper[29936]: I1205 12:53:55.793626 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.793854 master-0 kubenswrapper[29936]: I1205 12:53:55.793740 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6b44d597f9-szx4x"] Dec 05 12:53:55.798464 master-0 kubenswrapper[29936]: I1205 12:53:55.798416 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-b6876557c-zz5pl"] Dec 05 12:53:55.803684 master-0 kubenswrapper[29936]: I1205 12:53:55.803571 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/monitoring-plugin-56b6c94668-k88cs"] Dec 05 12:53:55.803944 master-0 kubenswrapper[29936]: I1205 12:53:55.803855 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/monitoring-plugin-56b6c94668-k88cs" podUID="a2c37759-9414-4ca7-8bd4-20c4f689189b" containerName="monitoring-plugin" containerID="cri-o://d7f23bbeaa1c82e3247b30df8f006c095fc00a421e05c2e364c2c95154bd5ea5" gracePeriod=30 Dec 05 12:53:55.853366 master-0 kubenswrapper[29936]: I1205 12:53:55.853285 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-58cd4759dd-kpl86"] Dec 05 12:53:55.859276 master-0 kubenswrapper[29936]: I1205 12:53:55.858712 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p48gk\" (UniqueName: \"kubernetes.io/projected/34b86c49-87d9-4167-899e-d070aff1dc10-kube-api-access-p48gk\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:55.859276 master-0 kubenswrapper[29936]: I1205 12:53:55.858795 29936 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-54c5748c8c-kqs7s"] Dec 05 12:53:55.859276 master-0 kubenswrapper[29936]: I1205 12:53:55.859042 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" podUID="a5338041-f213-46ef-9d81-248567ba958d" containerName="metrics-server" containerID="cri-o://8f4c10e53fa9bdea151c26cb8da907a4175dbcda2ac105b3ac1ba5c0a0254853" gracePeriod=170 Dec 05 12:53:55.874706 master-0 kubenswrapper[29936]: I1205 12:53:55.874615 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-federate-client-tls\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.874706 master-0 kubenswrapper[29936]: I1205 12:53:55.874700 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d02714eb-260c-4004-9feb-3a6c524756aa-metrics-client-ca\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.875065 master-0 kubenswrapper[29936]: I1205 12:53:55.874728 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5qhq\" (UniqueName: \"kubernetes.io/projected/d02714eb-260c-4004-9feb-3a6c524756aa-kube-api-access-m5qhq\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.875065 master-0 kubenswrapper[29936]: I1205 12:53:55.874754 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rszwv\" (UniqueName: \"kubernetes.io/projected/d872dad8-8c57-4627-963d-37cae007bc41-kube-api-access-rszwv\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.875065 master-0 kubenswrapper[29936]: I1205 12:53:55.874895 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3771a3be-b579-428a-8ef3-1513cce5292b-monitoring-plugin-cert\") pod \"monitoring-plugin-6b44d597f9-szx4x\" (UID: \"3771a3be-b579-428a-8ef3-1513cce5292b\") " pod="openshift-monitoring/monitoring-plugin-6b44d597f9-szx4x" Dec 05 12:53:55.875065 master-0 kubenswrapper[29936]: I1205 12:53:55.874925 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.875065 master-0 kubenswrapper[29936]: I1205 12:53:55.874955 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-trusted-ca-bundle\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.875065 master-0 kubenswrapper[29936]: I1205 12:53:55.875006 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d872dad8-8c57-4627-963d-37cae007bc41-metrics-server-audit-profiles\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.875065 master-0 kubenswrapper[29936]: I1205 12:53:55.875040 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d872dad8-8c57-4627-963d-37cae007bc41-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.875317 master-0 kubenswrapper[29936]: I1205 12:53:55.875092 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d872dad8-8c57-4627-963d-37cae007bc41-client-ca-bundle\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.875317 master-0 kubenswrapper[29936]: I1205 12:53:55.875129 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d02714eb-260c-4004-9feb-3a6c524756aa-serving-certs-ca-bundle\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.875317 master-0 kubenswrapper[29936]: I1205 12:53:55.875164 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d872dad8-8c57-4627-963d-37cae007bc41-audit-log\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.875317 master-0 kubenswrapper[29936]: I1205 12:53:55.875252 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d872dad8-8c57-4627-963d-37cae007bc41-secret-metrics-server-tls\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.875317 master-0 kubenswrapper[29936]: I1205 12:53:55.875283 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-secret-telemeter-client\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.875493 master-0 kubenswrapper[29936]: I1205 12:53:55.875331 29936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d872dad8-8c57-4627-963d-37cae007bc41-secret-metrics-client-certs\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.875493 master-0 kubenswrapper[29936]: I1205 12:53:55.875364 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.880320 master-0 kubenswrapper[29936]: E1205 12:53:55.876469 29936 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-e63soeg91on8p: secret "metrics-server-e63soeg91on8p" not found Dec 05 12:53:55.880320 master-0 kubenswrapper[29936]: E1205 12:53:55.876532 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:53:57.876511363 +0000 UTC m=+235.008591044 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : secret "metrics-server-e63soeg91on8p" not found Dec 05 12:53:55.890052 master-0 kubenswrapper[29936]: I1205 12:53:55.889286 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn48d\" (UniqueName: \"kubernetes.io/projected/42f3c259-d745-4060-9286-ac8999a49163-kube-api-access-vn48d\") pod \"thanos-querier-5686c796dc-z46lg\" (UID: \"42f3c259-d745-4060-9286-ac8999a49163\") " pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:55.976987 master-0 kubenswrapper[29936]: I1205 12:53:55.976908 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d872dad8-8c57-4627-963d-37cae007bc41-secret-metrics-server-tls\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.977125 master-0 kubenswrapper[29936]: I1205 12:53:55.977027 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-secret-telemeter-client\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.977125 master-0 kubenswrapper[29936]: I1205 12:53:55.977093 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d872dad8-8c57-4627-963d-37cae007bc41-secret-metrics-client-certs\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.977277 master-0 kubenswrapper[29936]: I1205 12:53:55.977128 29936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.977474 master-0 kubenswrapper[29936]: I1205 12:53:55.977404 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-federate-client-tls\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.977540 master-0 kubenswrapper[29936]: I1205 12:53:55.977505 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d02714eb-260c-4004-9feb-3a6c524756aa-metrics-client-ca\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.977596 master-0 kubenswrapper[29936]: I1205 12:53:55.977543 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5qhq\" (UniqueName: \"kubernetes.io/projected/d02714eb-260c-4004-9feb-3a6c524756aa-kube-api-access-m5qhq\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.977596 master-0 kubenswrapper[29936]: E1205 12:53:55.977557 29936 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Dec 05 12:53:55.977680 master-0 kubenswrapper[29936]: E1205 12:53:55.977626 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls podName:d02714eb-260c-4004-9feb-3a6c524756aa nodeName:}" failed. No retries permitted until 2025-12-05 12:53:56.477603992 +0000 UTC m=+233.609683673 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls") pod "telemeter-client-58cd4759dd-kpl86" (UID: "d02714eb-260c-4004-9feb-3a6c524756aa") : secret "telemeter-client-tls" not found Dec 05 12:53:55.977874 master-0 kubenswrapper[29936]: I1205 12:53:55.977822 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rszwv\" (UniqueName: \"kubernetes.io/projected/d872dad8-8c57-4627-963d-37cae007bc41-kube-api-access-rszwv\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.978018 master-0 kubenswrapper[29936]: I1205 12:53:55.977989 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3771a3be-b579-428a-8ef3-1513cce5292b-monitoring-plugin-cert\") pod \"monitoring-plugin-6b44d597f9-szx4x\" (UID: \"3771a3be-b579-428a-8ef3-1513cce5292b\") " pod="openshift-monitoring/monitoring-plugin-6b44d597f9-szx4x" Dec 05 12:53:55.978071 master-0 kubenswrapper[29936]: I1205 12:53:55.978022 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.978071 master-0 kubenswrapper[29936]: I1205 12:53:55.978047 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-trusted-ca-bundle\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.978173 master-0 kubenswrapper[29936]: I1205 12:53:55.978115 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d872dad8-8c57-4627-963d-37cae007bc41-metrics-server-audit-profiles\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.979917 master-0 kubenswrapper[29936]: I1205 12:53:55.978761 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d872dad8-8c57-4627-963d-37cae007bc41-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.979917 master-0 kubenswrapper[29936]: I1205 12:53:55.979228 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d872dad8-8c57-4627-963d-37cae007bc41-client-ca-bundle\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.979917 master-0 kubenswrapper[29936]: I1205 12:53:55.979263 29936 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d02714eb-260c-4004-9feb-3a6c524756aa-metrics-client-ca\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.979917 master-0 kubenswrapper[29936]: I1205 12:53:55.979815 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-trusted-ca-bundle\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.980959 master-0 kubenswrapper[29936]: I1205 12:53:55.979934 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d872dad8-8c57-4627-963d-37cae007bc41-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.980959 master-0 kubenswrapper[29936]: I1205 12:53:55.980213 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d872dad8-8c57-4627-963d-37cae007bc41-metrics-server-audit-profiles\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.980959 master-0 kubenswrapper[29936]: I1205 12:53:55.980685 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d02714eb-260c-4004-9feb-3a6c524756aa-serving-certs-ca-bundle\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.981260 master-0 kubenswrapper[29936]: I1205 12:53:55.979294 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d02714eb-260c-4004-9feb-3a6c524756aa-serving-certs-ca-bundle\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.981358 master-0 kubenswrapper[29936]: I1205 12:53:55.981329 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d872dad8-8c57-4627-963d-37cae007bc41-audit-log\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.981765 master-0 kubenswrapper[29936]: I1205 12:53:55.981709 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.981950 master-0 kubenswrapper[29936]: I1205 12:53:55.981909 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" 
(UniqueName: \"kubernetes.io/empty-dir/d872dad8-8c57-4627-963d-37cae007bc41-audit-log\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.982421 master-0 kubenswrapper[29936]: I1205 12:53:55.982369 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-federate-client-tls\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.982601 master-0 kubenswrapper[29936]: I1205 12:53:55.982556 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d872dad8-8c57-4627-963d-37cae007bc41-secret-metrics-client-certs\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.983025 master-0 kubenswrapper[29936]: I1205 12:53:55.982967 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3771a3be-b579-428a-8ef3-1513cce5292b-monitoring-plugin-cert\") pod \"monitoring-plugin-6b44d597f9-szx4x\" (UID: \"3771a3be-b579-428a-8ef3-1513cce5292b\") " pod="openshift-monitoring/monitoring-plugin-6b44d597f9-szx4x" Dec 05 12:53:55.983492 master-0 kubenswrapper[29936]: I1205 12:53:55.983446 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d872dad8-8c57-4627-963d-37cae007bc41-secret-metrics-server-tls\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:55.985244 master-0 kubenswrapper[29936]: I1205 12:53:55.985195 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-secret-telemeter-client\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:55.988491 master-0 kubenswrapper[29936]: I1205 12:53:55.988442 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d872dad8-8c57-4627-963d-37cae007bc41-client-ca-bundle\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:56.011370 master-0 kubenswrapper[29936]: I1205 12:53:56.011299 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6b44d597f9-szx4x" Dec 05 12:53:56.080237 master-0 kubenswrapper[29936]: I1205 12:53:56.080000 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:53:56.278317 master-0 kubenswrapper[29936]: I1205 12:53:56.278262 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5qhq\" (UniqueName: \"kubernetes.io/projected/d02714eb-260c-4004-9feb-3a6c524756aa-kube-api-access-m5qhq\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:56.287954 master-0 kubenswrapper[29936]: I1205 12:53:56.287199 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rszwv\" (UniqueName: \"kubernetes.io/projected/d872dad8-8c57-4627-963d-37cae007bc41-kube-api-access-rszwv\") pod \"metrics-server-b6876557c-zz5pl\" (UID: \"d872dad8-8c57-4627-963d-37cae007bc41\") " pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:56.287954 master-0 kubenswrapper[29936]: I1205 12:53:56.287709 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:56.287954 master-0 kubenswrapper[29936]: I1205 12:53:56.287829 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:56.291722 master-0 kubenswrapper[29936]: E1205 12:53:56.289624 29936 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-tls: secret "prometheus-k8s-tls" not found Dec 05 12:53:56.291722 master-0 kubenswrapper[29936]: E1205 12:53:56.289723 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls podName:34b86c49-87d9-4167-899e-d070aff1dc10 nodeName:}" failed. No retries permitted until 2025-12-05 12:53:57.289693304 +0000 UTC m=+234.421772985 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-tls" (UniqueName: "kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls") pod "prometheus-k8s-0" (UID: "34b86c49-87d9-4167-899e-d070aff1dc10") : secret "prometheus-k8s-tls" not found Dec 05 12:53:56.291722 master-0 kubenswrapper[29936]: E1205 12:53:56.290578 29936 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-thanos-sidecar-tls: secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 05 12:53:56.292347 master-0 kubenswrapper[29936]: E1205 12:53:56.292256 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls podName:34b86c49-87d9-4167-899e-d070aff1dc10 nodeName:}" failed. No retries permitted until 2025-12-05 12:53:57.292222615 +0000 UTC m=+234.424302296 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-thanos-sidecar-tls" (UniqueName: "kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls") pod "prometheus-k8s-0" (UID: "34b86c49-87d9-4167-899e-d070aff1dc10") : secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 05 12:53:56.425406 master-0 kubenswrapper[29936]: I1205 12:53:56.424437 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 05 12:53:56.427696 master-0 kubenswrapper[29936]: I1205 12:53:56.427506 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.432990 master-0 kubenswrapper[29936]: I1205 12:53:56.431504 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Dec 05 12:53:56.432990 master-0 kubenswrapper[29936]: I1205 12:53:56.431521 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Dec 05 12:53:56.432990 master-0 kubenswrapper[29936]: I1205 12:53:56.431537 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Dec 05 12:53:56.432990 master-0 kubenswrapper[29936]: I1205 12:53:56.432612 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Dec 05 12:53:56.432990 master-0 kubenswrapper[29936]: I1205 12:53:56.432973 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Dec 05 12:53:56.433354 master-0 kubenswrapper[29936]: I1205 12:53:56.433098 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Dec 05 12:53:56.435859 master-0 kubenswrapper[29936]: I1205 12:53:56.433399 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Dec 05 12:53:56.441838 master-0 kubenswrapper[29936]: I1205 12:53:56.440817 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Dec 05 12:53:56.454476 master-0 kubenswrapper[29936]: I1205 12:53:56.453520 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 05 12:53:56.498380 master-0 kubenswrapper[29936]: I1205 12:53:56.498307 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fbcv\" (UniqueName: \"kubernetes.io/projected/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-kube-api-access-4fbcv\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.498380 master-0 kubenswrapper[29936]: I1205 12:53:56.498388 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.498835 master-0 kubenswrapper[29936]: I1205 12:53:56.498637 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.498899 master-0 kubenswrapper[29936]: I1205 12:53:56.498871 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:56.498951 master-0 kubenswrapper[29936]: I1205 12:53:56.498899 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-config-out\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.498951 master-0 kubenswrapper[29936]: I1205 12:53:56.498929 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-web-config\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.498951 master-0 kubenswrapper[29936]: I1205 12:53:56.498951 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.499091 master-0 kubenswrapper[29936]: I1205 12:53:56.498976 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-config-volume\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.499091 master-0 kubenswrapper[29936]: I1205 12:53:56.498997 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.499091 master-0 kubenswrapper[29936]: I1205 12:53:56.499023 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.499091 master-0 kubenswrapper[29936]: I1205 12:53:56.499069 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-alertmanager-main-db\") pod 
\"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.499091 master-0 kubenswrapper[29936]: I1205 12:53:56.499095 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.499334 master-0 kubenswrapper[29936]: I1205 12:53:56.499122 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.499411 master-0 kubenswrapper[29936]: E1205 12:53:56.499381 29936 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Dec 05 12:53:56.499464 master-0 kubenswrapper[29936]: E1205 12:53:56.499441 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls podName:d02714eb-260c-4004-9feb-3a6c524756aa nodeName:}" failed. No retries permitted until 2025-12-05 12:53:57.499426302 +0000 UTC m=+234.631505973 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls") pod "telemeter-client-58cd4759dd-kpl86" (UID: "d02714eb-260c-4004-9feb-3a6c524756aa") : secret "telemeter-client-tls" not found Dec 05 12:53:56.523828 master-0 kubenswrapper[29936]: I1205 12:53:56.523759 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6b44d597f9-szx4x"] Dec 05 12:53:56.532100 master-0 kubenswrapper[29936]: W1205 12:53:56.532020 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3771a3be_b579_428a_8ef3_1513cce5292b.slice/crio-2ee1be2a42e47743bd8056afe76112d3d4599e36e27aa4f64fabbb33262f8a91 WatchSource:0}: Error finding container 2ee1be2a42e47743bd8056afe76112d3d4599e36e27aa4f64fabbb33262f8a91: Status 404 returned error can't find the container with id 2ee1be2a42e47743bd8056afe76112d3d4599e36e27aa4f64fabbb33262f8a91 Dec 05 12:53:56.574558 master-0 kubenswrapper[29936]: I1205 12:53:56.574476 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:53:56.600355 master-0 kubenswrapper[29936]: I1205 12:53:56.600273 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.600355 master-0 kubenswrapper[29936]: I1205 12:53:56.600354 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-config-out\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.600743 master-0 kubenswrapper[29936]: I1205 12:53:56.600604 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-web-config\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.600743 master-0 kubenswrapper[29936]: I1205 12:53:56.600720 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.600819 master-0 kubenswrapper[29936]: I1205 12:53:56.600782 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-config-volume\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.600851 master-0 kubenswrapper[29936]: I1205 12:53:56.600838 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.600956 master-0 kubenswrapper[29936]: I1205 12:53:56.600926 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.601077 master-0 kubenswrapper[29936]: I1205 12:53:56.601054 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.601634 master-0 kubenswrapper[29936]: I1205 12:53:56.601580 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.601634 master-0 kubenswrapper[29936]: I1205 12:53:56.601620 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.601789 master-0 kubenswrapper[29936]: I1205 12:53:56.601759 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.602039 master-0 kubenswrapper[29936]: I1205 12:53:56.602007 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fbcv\" (UniqueName: \"kubernetes.io/projected/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-kube-api-access-4fbcv\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.602121 master-0 kubenswrapper[29936]: I1205 12:53:56.602101 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.602197 master-0 kubenswrapper[29936]: E1205 12:53:56.602129 29936 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Dec 05 12:53:56.602261 master-0 kubenswrapper[29936]: E1205 12:53:56.602219 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls podName:b97e29e1-c1cf-4f1f-a530-094bcb24ab4c nodeName:}" failed. No retries permitted until 2025-12-05 12:53:57.102169667 +0000 UTC m=+234.234249348 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "b97e29e1-c1cf-4f1f-a530-094bcb24ab4c") : secret "alertmanager-main-tls" not found Dec 05 12:53:56.602663 master-0 kubenswrapper[29936]: I1205 12:53:56.602616 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.603380 master-0 kubenswrapper[29936]: I1205 12:53:56.603343 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.604920 master-0 kubenswrapper[29936]: I1205 12:53:56.604865 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-config-out\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.605352 master-0 kubenswrapper[29936]: I1205 12:53:56.605276 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.605733 master-0 kubenswrapper[29936]: I1205 12:53:56.605641 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.605733 master-0 kubenswrapper[29936]: I1205 12:53:56.605697 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.606154 master-0 kubenswrapper[29936]: I1205 12:53:56.606090 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-config-volume\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.606826 master-0 kubenswrapper[29936]: I1205 12:53:56.606751 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-web-config\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.610838 master-0 kubenswrapper[29936]: I1205 
12:53:56.610770 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.622050 master-0 kubenswrapper[29936]: I1205 12:53:56.621095 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6b44d597f9-szx4x" event={"ID":"3771a3be-b579-428a-8ef3-1513cce5292b","Type":"ContainerStarted","Data":"2ee1be2a42e47743bd8056afe76112d3d4599e36e27aa4f64fabbb33262f8a91"} Dec 05 12:53:56.623439 master-0 kubenswrapper[29936]: I1205 12:53:56.623415 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-56b6c94668-k88cs_a2c37759-9414-4ca7-8bd4-20c4f689189b/monitoring-plugin/0.log" Dec 05 12:53:56.623562 master-0 kubenswrapper[29936]: I1205 12:53:56.623456 29936 generic.go:334] "Generic (PLEG): container finished" podID="a2c37759-9414-4ca7-8bd4-20c4f689189b" containerID="d7f23bbeaa1c82e3247b30df8f006c095fc00a421e05c2e364c2c95154bd5ea5" exitCode=2 Dec 05 12:53:56.623562 master-0 kubenswrapper[29936]: I1205 12:53:56.623483 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-56b6c94668-k88cs" event={"ID":"a2c37759-9414-4ca7-8bd4-20c4f689189b","Type":"ContainerDied","Data":"d7f23bbeaa1c82e3247b30df8f006c095fc00a421e05c2e364c2c95154bd5ea5"} Dec 05 12:53:56.637689 master-0 kubenswrapper[29936]: I1205 12:53:56.637193 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fbcv\" (UniqueName: \"kubernetes.io/projected/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-kube-api-access-4fbcv\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:56.654050 master-0 kubenswrapper[29936]: I1205 12:53:56.653987 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-56b6c94668-k88cs_a2c37759-9414-4ca7-8bd4-20c4f689189b/monitoring-plugin/0.log" Dec 05 12:53:56.654311 master-0 kubenswrapper[29936]: I1205 12:53:56.654098 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-56b6c94668-k88cs" Dec 05 12:53:56.697823 master-0 kubenswrapper[29936]: I1205 12:53:56.697655 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5686c796dc-z46lg"] Dec 05 12:53:56.703021 master-0 kubenswrapper[29936]: I1205 12:53:56.702973 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a2c37759-9414-4ca7-8bd4-20c4f689189b-monitoring-plugin-cert\") pod \"a2c37759-9414-4ca7-8bd4-20c4f689189b\" (UID: \"a2c37759-9414-4ca7-8bd4-20c4f689189b\") " Dec 05 12:53:56.703021 master-0 kubenswrapper[29936]: W1205 12:53:56.702982 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42f3c259_d745_4060_9286_ac8999a49163.slice/crio-1d8d773821c8a94695aaed1b5190824601dcda90cee4420b1251133ba9b33048 WatchSource:0}: Error finding container 1d8d773821c8a94695aaed1b5190824601dcda90cee4420b1251133ba9b33048: Status 404 returned error can't find the container with id 1d8d773821c8a94695aaed1b5190824601dcda90cee4420b1251133ba9b33048 Dec 05 12:53:56.713881 master-0 kubenswrapper[29936]: I1205 12:53:56.713707 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2c37759-9414-4ca7-8bd4-20c4f689189b-monitoring-plugin-cert" (OuterVolumeSpecName: "monitoring-plugin-cert") pod "a2c37759-9414-4ca7-8bd4-20c4f689189b" (UID: "a2c37759-9414-4ca7-8bd4-20c4f689189b"). InnerVolumeSpecName "monitoring-plugin-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:53:56.813288 master-0 kubenswrapper[29936]: I1205 12:53:56.813186 29936 reconciler_common.go:293] "Volume detached for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a2c37759-9414-4ca7-8bd4-20c4f689189b-monitoring-plugin-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:53:57.039491 master-0 kubenswrapper[29936]: I1205 12:53:57.039386 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-b6876557c-zz5pl"] Dec 05 12:53:57.041909 master-0 kubenswrapper[29936]: W1205 12:53:57.041639 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd872dad8_8c57_4627_963d_37cae007bc41.slice/crio-cfa1c2c87822a5d0e5c1efd1f3c8d76faa8100ef60e3ce52905ab2825aaf050d WatchSource:0}: Error finding container cfa1c2c87822a5d0e5c1efd1f3c8d76faa8100ef60e3ce52905ab2825aaf050d: Status 404 returned error can't find the container with id cfa1c2c87822a5d0e5c1efd1f3c8d76faa8100ef60e3ce52905ab2825aaf050d Dec 05 12:53:57.120160 master-0 kubenswrapper[29936]: I1205 12:53:57.120083 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:57.120458 master-0 kubenswrapper[29936]: E1205 12:53:57.120271 29936 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Dec 05 12:53:57.120507 master-0 kubenswrapper[29936]: E1205 12:53:57.120459 29936 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls podName:b97e29e1-c1cf-4f1f-a530-094bcb24ab4c nodeName:}" failed. No retries permitted until 2025-12-05 12:53:58.120439468 +0000 UTC m=+235.252519139 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "b97e29e1-c1cf-4f1f-a530-094bcb24ab4c") : secret "alertmanager-main-tls" not found Dec 05 12:53:57.324459 master-0 kubenswrapper[29936]: I1205 12:53:57.324282 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:57.324907 master-0 kubenswrapper[29936]: E1205 12:53:57.324794 29936 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-thanos-sidecar-tls: secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 05 12:53:57.324976 master-0 kubenswrapper[29936]: E1205 12:53:57.324939 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls podName:34b86c49-87d9-4167-899e-d070aff1dc10 nodeName:}" failed. No retries permitted until 2025-12-05 12:53:59.324916599 +0000 UTC m=+236.456996280 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-thanos-sidecar-tls" (UniqueName: "kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls") pod "prometheus-k8s-0" (UID: "34b86c49-87d9-4167-899e-d070aff1dc10") : secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 05 12:53:57.325692 master-0 kubenswrapper[29936]: I1205 12:53:57.325611 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:57.326067 master-0 kubenswrapper[29936]: E1205 12:53:57.326039 29936 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-tls: secret "prometheus-k8s-tls" not found Dec 05 12:53:57.326141 master-0 kubenswrapper[29936]: E1205 12:53:57.326122 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls podName:34b86c49-87d9-4167-899e-d070aff1dc10 nodeName:}" failed. No retries permitted until 2025-12-05 12:53:59.326097662 +0000 UTC m=+236.458177343 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-tls" (UniqueName: "kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls") pod "prometheus-k8s-0" (UID: "34b86c49-87d9-4167-899e-d070aff1dc10") : secret "prometheus-k8s-tls" not found Dec 05 12:53:57.530630 master-0 kubenswrapper[29936]: I1205 12:53:57.530133 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:57.530630 master-0 kubenswrapper[29936]: E1205 12:53:57.530381 29936 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Dec 05 12:53:57.530630 master-0 kubenswrapper[29936]: E1205 12:53:57.530487 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls podName:d02714eb-260c-4004-9feb-3a6c524756aa nodeName:}" failed. No retries permitted until 2025-12-05 12:53:59.530464081 +0000 UTC m=+236.662543952 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls") pod "telemeter-client-58cd4759dd-kpl86" (UID: "d02714eb-260c-4004-9feb-3a6c524756aa") : secret "telemeter-client-tls" not found Dec 05 12:53:57.641854 master-0 kubenswrapper[29936]: I1205 12:53:57.639480 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-56b6c94668-k88cs_a2c37759-9414-4ca7-8bd4-20c4f689189b/monitoring-plugin/0.log" Dec 05 12:53:57.641854 master-0 kubenswrapper[29936]: I1205 12:53:57.639597 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-56b6c94668-k88cs" event={"ID":"a2c37759-9414-4ca7-8bd4-20c4f689189b","Type":"ContainerDied","Data":"9a52cbd6fbe23825c1470655da3ab68674cd3776b092896841a68e428064c1ff"} Dec 05 12:53:57.641854 master-0 kubenswrapper[29936]: I1205 12:53:57.639646 29936 scope.go:117] "RemoveContainer" containerID="d7f23bbeaa1c82e3247b30df8f006c095fc00a421e05c2e364c2c95154bd5ea5" Dec 05 12:53:57.641854 master-0 kubenswrapper[29936]: I1205 12:53:57.639790 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-56b6c94668-k88cs" Dec 05 12:53:57.642933 master-0 kubenswrapper[29936]: I1205 12:53:57.642784 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6b44d597f9-szx4x" event={"ID":"3771a3be-b579-428a-8ef3-1513cce5292b","Type":"ContainerStarted","Data":"3c3c1f76b34e3370c0732f758799a57371679b78b4d3e8083a3f8ad077f6cb29"} Dec 05 12:53:57.644290 master-0 kubenswrapper[29936]: I1205 12:53:57.643179 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-6b44d597f9-szx4x" Dec 05 12:53:57.648737 master-0 kubenswrapper[29936]: I1205 12:53:57.648672 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" event={"ID":"d872dad8-8c57-4627-963d-37cae007bc41","Type":"ContainerStarted","Data":"104ecaeba7cdd4d2e8c4d49fc045142af6c4f09a52fd5a6b61c9676d4469a5c0"} Dec 05 12:53:57.648737 master-0 kubenswrapper[29936]: I1205 12:53:57.648722 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" event={"ID":"d872dad8-8c57-4627-963d-37cae007bc41","Type":"ContainerStarted","Data":"cfa1c2c87822a5d0e5c1efd1f3c8d76faa8100ef60e3ce52905ab2825aaf050d"} Dec 05 12:53:57.650301 master-0 kubenswrapper[29936]: I1205 12:53:57.649849 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" event={"ID":"42f3c259-d745-4060-9286-ac8999a49163","Type":"ContainerStarted","Data":"1d8d773821c8a94695aaed1b5190824601dcda90cee4420b1251133ba9b33048"} Dec 05 12:53:57.652980 master-0 kubenswrapper[29936]: I1205 12:53:57.652925 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-6b44d597f9-szx4x" Dec 05 12:53:57.674403 master-0 kubenswrapper[29936]: I1205 12:53:57.673892 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-6b44d597f9-szx4x" podStartSLOduration=2.673868836 podStartE2EDuration="2.673868836s" podCreationTimestamp="2025-12-05 12:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:53:57.668323352 +0000 UTC m=+234.800403043" watchObservedRunningTime="2025-12-05 12:53:57.673868836 +0000 UTC m=+234.805948517" Dec 05 12:53:57.691049 master-0 kubenswrapper[29936]: I1205 12:53:57.690975 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/monitoring-plugin-56b6c94668-k88cs"] Dec 05 12:53:57.700154 master-0 kubenswrapper[29936]: I1205 12:53:57.700065 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/monitoring-plugin-56b6c94668-k88cs"] Dec 05 12:53:57.716224 master-0 kubenswrapper[29936]: I1205 12:53:57.716107 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" podStartSLOduration=2.716081889 podStartE2EDuration="2.716081889s" podCreationTimestamp="2025-12-05 12:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:53:57.710937136 +0000 UTC m=+234.843016827" watchObservedRunningTime="2025-12-05 12:53:57.716081889 +0000 UTC m=+234.848161570" Dec 05 12:53:57.947570 master-0 kubenswrapper[29936]: E1205 12:53:57.941508 29936 
secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-e63soeg91on8p: secret "metrics-server-e63soeg91on8p" not found Dec 05 12:53:57.947570 master-0 kubenswrapper[29936]: E1205 12:53:57.941594 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:54:01.941576845 +0000 UTC m=+239.073656526 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : secret "metrics-server-e63soeg91on8p" not found Dec 05 12:53:58.144539 master-0 kubenswrapper[29936]: I1205 12:53:58.144485 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:53:58.144754 master-0 kubenswrapper[29936]: E1205 12:53:58.144694 29936 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Dec 05 12:53:58.144831 master-0 kubenswrapper[29936]: E1205 12:53:58.144789 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls podName:b97e29e1-c1cf-4f1f-a530-094bcb24ab4c nodeName:}" failed. No retries permitted until 2025-12-05 12:54:00.144765381 +0000 UTC m=+237.276845062 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "b97e29e1-c1cf-4f1f-a530-094bcb24ab4c") : secret "alertmanager-main-tls" not found Dec 05 12:53:59.196726 master-0 kubenswrapper[29936]: I1205 12:53:59.196612 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2c37759-9414-4ca7-8bd4-20c4f689189b" path="/var/lib/kubelet/pods/a2c37759-9414-4ca7-8bd4-20c4f689189b/volumes" Dec 05 12:53:59.376603 master-0 kubenswrapper[29936]: I1205 12:53:59.376453 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:59.377129 master-0 kubenswrapper[29936]: E1205 12:53:59.376737 29936 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-thanos-sidecar-tls: secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 05 12:53:59.377129 master-0 kubenswrapper[29936]: I1205 12:53:59.376808 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:53:59.377129 master-0 kubenswrapper[29936]: E1205 12:53:59.376838 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls podName:34b86c49-87d9-4167-899e-d070aff1dc10 nodeName:}" failed. No retries permitted until 2025-12-05 12:54:03.376813455 +0000 UTC m=+240.508893136 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-thanos-sidecar-tls" (UniqueName: "kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls") pod "prometheus-k8s-0" (UID: "34b86c49-87d9-4167-899e-d070aff1dc10") : secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 05 12:53:59.377129 master-0 kubenswrapper[29936]: E1205 12:53:59.377010 29936 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-tls: secret "prometheus-k8s-tls" not found Dec 05 12:53:59.377129 master-0 kubenswrapper[29936]: E1205 12:53:59.377042 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls podName:34b86c49-87d9-4167-899e-d070aff1dc10 nodeName:}" failed. No retries permitted until 2025-12-05 12:54:03.377030981 +0000 UTC m=+240.509110652 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-tls" (UniqueName: "kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls") pod "prometheus-k8s-0" (UID: "34b86c49-87d9-4167-899e-d070aff1dc10") : secret "prometheus-k8s-tls" not found Dec 05 12:53:59.581886 master-0 kubenswrapper[29936]: I1205 12:53:59.581814 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:53:59.582169 master-0 kubenswrapper[29936]: E1205 12:53:59.582105 29936 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Dec 05 12:53:59.582299 master-0 kubenswrapper[29936]: E1205 12:53:59.582275 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls podName:d02714eb-260c-4004-9feb-3a6c524756aa nodeName:}" failed. No retries permitted until 2025-12-05 12:54:03.582238314 +0000 UTC m=+240.714318155 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls") pod "telemeter-client-58cd4759dd-kpl86" (UID: "d02714eb-260c-4004-9feb-3a6c524756aa") : secret "telemeter-client-tls" not found Dec 05 12:53:59.677728 master-0 kubenswrapper[29936]: I1205 12:53:59.677680 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" event={"ID":"42f3c259-d745-4060-9286-ac8999a49163","Type":"ContainerStarted","Data":"b8672428ee5cd00e7c9d82f372210ee574e7b4a5efa1ae710bc6274abe00ab86"} Dec 05 12:53:59.677942 master-0 kubenswrapper[29936]: I1205 12:53:59.677741 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" event={"ID":"42f3c259-d745-4060-9286-ac8999a49163","Type":"ContainerStarted","Data":"7e2489a2fcd49856c055e8c59fe8c6dc5a5a76d4dacdff9bfa57f99980a68482"} Dec 05 12:54:00.194211 master-0 kubenswrapper[29936]: I1205 12:54:00.194107 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:54:00.194524 master-0 kubenswrapper[29936]: E1205 12:54:00.194358 29936 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Dec 05 12:54:00.194524 master-0 kubenswrapper[29936]: E1205 12:54:00.194450 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls podName:b97e29e1-c1cf-4f1f-a530-094bcb24ab4c nodeName:}" failed. No retries permitted until 2025-12-05 12:54:04.194420484 +0000 UTC m=+241.326500175 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "b97e29e1-c1cf-4f1f-a530-094bcb24ab4c") : secret "alertmanager-main-tls" not found Dec 05 12:54:00.703469 master-0 kubenswrapper[29936]: I1205 12:54:00.703388 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" event={"ID":"42f3c259-d745-4060-9286-ac8999a49163","Type":"ContainerStarted","Data":"a74d93ec5ad9420d85324556448337e3c38e04f50751771ef545d59dfaddba6a"} Dec 05 12:54:02.029916 master-0 kubenswrapper[29936]: E1205 12:54:02.029838 29936 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-e63soeg91on8p: secret "metrics-server-e63soeg91on8p" not found Dec 05 12:54:02.030451 master-0 kubenswrapper[29936]: E1205 12:54:02.029990 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:54:10.029959038 +0000 UTC m=+247.162038739 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : secret "metrics-server-e63soeg91on8p" not found Dec 05 12:54:02.721561 master-0 kubenswrapper[29936]: I1205 12:54:02.721501 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" event={"ID":"42f3c259-d745-4060-9286-ac8999a49163","Type":"ContainerStarted","Data":"95e6af8ecadc7765294b104bee9d9f319922cd6486a6b8a63d7046f8bb818eac"} Dec 05 12:54:02.721561 master-0 kubenswrapper[29936]: I1205 12:54:02.721564 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" event={"ID":"42f3c259-d745-4060-9286-ac8999a49163","Type":"ContainerStarted","Data":"0d43dca588377c7ae90c09aa99a699e8dee64d9c2ca875d47ae1ba3d13c70a1a"} Dec 05 12:54:03.457240 master-0 kubenswrapper[29936]: I1205 12:54:03.457129 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:54:03.458411 master-0 kubenswrapper[29936]: E1205 12:54:03.457457 29936 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-thanos-sidecar-tls: secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 05 12:54:03.458411 master-0 kubenswrapper[29936]: E1205 12:54:03.457706 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls podName:34b86c49-87d9-4167-899e-d070aff1dc10 nodeName:}" failed. No retries permitted until 2025-12-05 12:54:11.457669149 +0000 UTC m=+248.589748840 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-thanos-sidecar-tls" (UniqueName: "kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls") pod "prometheus-k8s-0" (UID: "34b86c49-87d9-4167-899e-d070aff1dc10") : secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 05 12:54:03.458411 master-0 kubenswrapper[29936]: I1205 12:54:03.457770 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:54:03.458411 master-0 kubenswrapper[29936]: E1205 12:54:03.458103 29936 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-tls: secret "prometheus-k8s-tls" not found Dec 05 12:54:03.458411 master-0 kubenswrapper[29936]: E1205 12:54:03.458291 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls podName:34b86c49-87d9-4167-899e-d070aff1dc10 nodeName:}" failed. No retries permitted until 2025-12-05 12:54:11.458251885 +0000 UTC m=+248.590331606 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-tls" (UniqueName: "kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls") pod "prometheus-k8s-0" (UID: "34b86c49-87d9-4167-899e-d070aff1dc10") : secret "prometheus-k8s-tls" not found Dec 05 12:54:03.662343 master-0 kubenswrapper[29936]: I1205 12:54:03.662082 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:54:03.662708 master-0 kubenswrapper[29936]: E1205 12:54:03.662526 29936 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Dec 05 12:54:03.662708 master-0 kubenswrapper[29936]: E1205 12:54:03.662684 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls podName:d02714eb-260c-4004-9feb-3a6c524756aa nodeName:}" failed. No retries permitted until 2025-12-05 12:54:11.662645794 +0000 UTC m=+248.794725635 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls") pod "telemeter-client-58cd4759dd-kpl86" (UID: "d02714eb-260c-4004-9feb-3a6c524756aa") : secret "telemeter-client-tls" not found Dec 05 12:54:03.736081 master-0 kubenswrapper[29936]: I1205 12:54:03.735949 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" event={"ID":"42f3c259-d745-4060-9286-ac8999a49163","Type":"ContainerStarted","Data":"a6c04a4d690d751c6427e7d6c7623ee16f5497d8628794fc0199cfa8c2f46170"} Dec 05 12:54:04.273859 master-0 kubenswrapper[29936]: I1205 12:54:04.273759 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:54:04.275141 master-0 kubenswrapper[29936]: E1205 12:54:04.275075 29936 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Dec 05 12:54:04.275301 master-0 kubenswrapper[29936]: E1205 12:54:04.275264 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls podName:b97e29e1-c1cf-4f1f-a530-094bcb24ab4c nodeName:}" failed. No retries permitted until 2025-12-05 12:54:12.275234246 +0000 UTC m=+249.407314167 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "b97e29e1-c1cf-4f1f-a530-094bcb24ab4c") : secret "alertmanager-main-tls" not found Dec 05 12:54:04.744408 master-0 kubenswrapper[29936]: I1205 12:54:04.744329 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:54:04.757566 master-0 kubenswrapper[29936]: I1205 12:54:04.757510 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" Dec 05 12:54:04.794230 master-0 kubenswrapper[29936]: I1205 12:54:04.794091 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5686c796dc-z46lg" podStartSLOduration=5.15001493 podStartE2EDuration="9.794057533s" podCreationTimestamp="2025-12-05 12:53:55 +0000 UTC" firstStartedPulling="2025-12-05 12:53:56.711364451 +0000 UTC m=+233.843444132" lastFinishedPulling="2025-12-05 12:54:01.355407054 +0000 UTC m=+238.487486735" observedRunningTime="2025-12-05 12:54:04.790953607 +0000 UTC m=+241.923033298" watchObservedRunningTime="2025-12-05 12:54:04.794057533 +0000 UTC m=+241.926137234" Dec 05 12:54:10.090275 master-0 kubenswrapper[29936]: E1205 12:54:10.090135 29936 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-e63soeg91on8p: secret "metrics-server-e63soeg91on8p" not found Dec 05 12:54:10.090275 master-0 kubenswrapper[29936]: E1205 12:54:10.090286 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. 
No retries permitted until 2025-12-05 12:54:26.090266077 +0000 UTC m=+263.222345759 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : secret "metrics-server-e63soeg91on8p" not found Dec 05 12:54:11.518138 master-0 kubenswrapper[29936]: I1205 12:54:11.518014 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:54:11.519142 master-0 kubenswrapper[29936]: I1205 12:54:11.518177 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:54:11.519142 master-0 kubenswrapper[29936]: E1205 12:54:11.518452 29936 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-tls: secret "prometheus-k8s-tls" not found Dec 05 12:54:11.519142 master-0 kubenswrapper[29936]: E1205 12:54:11.518521 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls podName:34b86c49-87d9-4167-899e-d070aff1dc10 nodeName:}" failed. No retries permitted until 2025-12-05 12:54:27.518499844 +0000 UTC m=+264.650579535 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-tls" (UniqueName: "kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls") pod "prometheus-k8s-0" (UID: "34b86c49-87d9-4167-899e-d070aff1dc10") : secret "prometheus-k8s-tls" not found Dec 05 12:54:11.519142 master-0 kubenswrapper[29936]: E1205 12:54:11.518943 29936 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-thanos-sidecar-tls: secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 05 12:54:11.519436 master-0 kubenswrapper[29936]: E1205 12:54:11.519141 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls podName:34b86c49-87d9-4167-899e-d070aff1dc10 nodeName:}" failed. No retries permitted until 2025-12-05 12:54:27.51910771 +0000 UTC m=+264.651187431 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-thanos-sidecar-tls" (UniqueName: "kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls") pod "prometheus-k8s-0" (UID: "34b86c49-87d9-4167-899e-d070aff1dc10") : secret "prometheus-k8s-thanos-sidecar-tls" not found Dec 05 12:54:11.722804 master-0 kubenswrapper[29936]: I1205 12:54:11.722686 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:54:11.723241 master-0 kubenswrapper[29936]: E1205 12:54:11.722958 29936 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Dec 05 12:54:11.723241 master-0 kubenswrapper[29936]: E1205 12:54:11.723126 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls podName:d02714eb-260c-4004-9feb-3a6c524756aa nodeName:}" failed. No retries permitted until 2025-12-05 12:54:27.723086509 +0000 UTC m=+264.855166220 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls") pod "telemeter-client-58cd4759dd-kpl86" (UID: "d02714eb-260c-4004-9feb-3a6c524756aa") : secret "telemeter-client-tls" not found Dec 05 12:54:12.333414 master-0 kubenswrapper[29936]: I1205 12:54:12.333308 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:54:12.333851 master-0 kubenswrapper[29936]: E1205 12:54:12.333567 29936 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Dec 05 12:54:12.333851 master-0 kubenswrapper[29936]: E1205 12:54:12.333734 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls podName:b97e29e1-c1cf-4f1f-a530-094bcb24ab4c nodeName:}" failed. No retries permitted until 2025-12-05 12:54:28.333704746 +0000 UTC m=+265.465784637 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "b97e29e1-c1cf-4f1f-a530-094bcb24ab4c") : secret "alertmanager-main-tls" not found Dec 05 12:54:16.575165 master-0 kubenswrapper[29936]: I1205 12:54:16.574947 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:54:16.575165 master-0 kubenswrapper[29936]: I1205 12:54:16.575070 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:54:20.816368 master-0 kubenswrapper[29936]: I1205 12:54:20.816243 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-74ffd5f75f-slrkr" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" containerID="cri-o://04a45b1cc49f4f9500c41ee8cc68ff20313db4f4ee99928b970dc5be7392d5dd" gracePeriod=15 Dec 05 12:54:21.328472 master-0 kubenswrapper[29936]: I1205 12:54:21.328416 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74ffd5f75f-slrkr_5f5f6985-a4f8-467b-8277-4ea20bfc4570/console/0.log" Dec 05 12:54:21.328787 master-0 kubenswrapper[29936]: I1205 12:54:21.328772 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:54:21.420277 master-0 kubenswrapper[29936]: I1205 12:54:21.420097 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-oauth-serving-cert\") pod \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " Dec 05 12:54:21.420277 master-0 kubenswrapper[29936]: I1205 12:54:21.420216 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-oauth-config\") pod \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " Dec 05 12:54:21.420277 master-0 kubenswrapper[29936]: I1205 12:54:21.420281 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-trusted-ca-bundle\") pod \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " Dec 05 12:54:21.420618 master-0 kubenswrapper[29936]: I1205 12:54:21.420336 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-config\") pod \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " Dec 05 12:54:21.420618 master-0 kubenswrapper[29936]: I1205 12:54:21.420369 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4tpx\" (UniqueName: \"kubernetes.io/projected/5f5f6985-a4f8-467b-8277-4ea20bfc4570-kube-api-access-q4tpx\") pod \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " Dec 05 12:54:21.420618 master-0 kubenswrapper[29936]: I1205 12:54:21.420431 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-service-ca\") pod \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " Dec 05 12:54:21.420618 master-0 kubenswrapper[29936]: I1205 12:54:21.420490 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-serving-cert\") pod \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\" (UID: \"5f5f6985-a4f8-467b-8277-4ea20bfc4570\") " Dec 05 12:54:21.420843 master-0 kubenswrapper[29936]: I1205 12:54:21.420798 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5f5f6985-a4f8-467b-8277-4ea20bfc4570" (UID: "5f5f6985-a4f8-467b-8277-4ea20bfc4570"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:54:21.421022 master-0 kubenswrapper[29936]: I1205 12:54:21.420998 29936 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:54:21.421277 master-0 kubenswrapper[29936]: I1205 12:54:21.421107 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-config" (OuterVolumeSpecName: "console-config") pod "5f5f6985-a4f8-467b-8277-4ea20bfc4570" (UID: "5f5f6985-a4f8-467b-8277-4ea20bfc4570"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:54:21.421585 master-0 kubenswrapper[29936]: I1205 12:54:21.421531 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-service-ca" (OuterVolumeSpecName: "service-ca") pod "5f5f6985-a4f8-467b-8277-4ea20bfc4570" (UID: "5f5f6985-a4f8-467b-8277-4ea20bfc4570"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:54:21.421652 master-0 kubenswrapper[29936]: I1205 12:54:21.421544 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5f5f6985-a4f8-467b-8277-4ea20bfc4570" (UID: "5f5f6985-a4f8-467b-8277-4ea20bfc4570"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:54:21.425274 master-0 kubenswrapper[29936]: I1205 12:54:21.423705 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5f5f6985-a4f8-467b-8277-4ea20bfc4570" (UID: "5f5f6985-a4f8-467b-8277-4ea20bfc4570"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:54:21.425274 master-0 kubenswrapper[29936]: I1205 12:54:21.424307 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5f5f6985-a4f8-467b-8277-4ea20bfc4570" (UID: "5f5f6985-a4f8-467b-8277-4ea20bfc4570"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:54:21.425873 master-0 kubenswrapper[29936]: I1205 12:54:21.425396 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f5f6985-a4f8-467b-8277-4ea20bfc4570-kube-api-access-q4tpx" (OuterVolumeSpecName: "kube-api-access-q4tpx") pod "5f5f6985-a4f8-467b-8277-4ea20bfc4570" (UID: "5f5f6985-a4f8-467b-8277-4ea20bfc4570"). InnerVolumeSpecName "kube-api-access-q4tpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:54:21.524509 master-0 kubenswrapper[29936]: I1205 12:54:21.524414 29936 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:54:21.524509 master-0 kubenswrapper[29936]: I1205 12:54:21.524489 29936 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:54:21.524509 master-0 kubenswrapper[29936]: I1205 12:54:21.524506 29936 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:54:21.524509 master-0 kubenswrapper[29936]: I1205 12:54:21.524522 29936 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 12:54:21.524859 master-0 kubenswrapper[29936]: I1205 12:54:21.524541 29936 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f5f6985-a4f8-467b-8277-4ea20bfc4570-console-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:54:21.524859 master-0 kubenswrapper[29936]: I1205 12:54:21.524559 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4tpx\" (UniqueName: \"kubernetes.io/projected/5f5f6985-a4f8-467b-8277-4ea20bfc4570-kube-api-access-q4tpx\") on node \"master-0\" DevicePath \"\"" Dec 05 12:54:21.885092 master-0 kubenswrapper[29936]: I1205 12:54:21.885042 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74ffd5f75f-slrkr_5f5f6985-a4f8-467b-8277-4ea20bfc4570/console/0.log" Dec 05 12:54:21.885914 master-0 kubenswrapper[29936]: I1205 12:54:21.885794 29936 generic.go:334] "Generic (PLEG): container finished" podID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerID="04a45b1cc49f4f9500c41ee8cc68ff20313db4f4ee99928b970dc5be7392d5dd" exitCode=2 Dec 05 12:54:21.885914 master-0 kubenswrapper[29936]: I1205 12:54:21.885861 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74ffd5f75f-slrkr" event={"ID":"5f5f6985-a4f8-467b-8277-4ea20bfc4570","Type":"ContainerDied","Data":"04a45b1cc49f4f9500c41ee8cc68ff20313db4f4ee99928b970dc5be7392d5dd"} Dec 05 12:54:21.885914 master-0 kubenswrapper[29936]: I1205 12:54:21.885840 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74ffd5f75f-slrkr" Dec 05 12:54:21.885914 master-0 kubenswrapper[29936]: I1205 12:54:21.885912 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74ffd5f75f-slrkr" event={"ID":"5f5f6985-a4f8-467b-8277-4ea20bfc4570","Type":"ContainerDied","Data":"806da7cc8b3f2ccc3bc34f5676a5083409af02a71f48c7bc097f3c69fbe20029"} Dec 05 12:54:21.886105 master-0 kubenswrapper[29936]: I1205 12:54:21.885934 29936 scope.go:117] "RemoveContainer" containerID="04a45b1cc49f4f9500c41ee8cc68ff20313db4f4ee99928b970dc5be7392d5dd" Dec 05 12:54:21.918246 master-0 kubenswrapper[29936]: I1205 12:54:21.918190 29936 scope.go:117] "RemoveContainer" containerID="04a45b1cc49f4f9500c41ee8cc68ff20313db4f4ee99928b970dc5be7392d5dd" Dec 05 12:54:21.919060 master-0 kubenswrapper[29936]: E1205 12:54:21.918936 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04a45b1cc49f4f9500c41ee8cc68ff20313db4f4ee99928b970dc5be7392d5dd\": container with ID starting with 04a45b1cc49f4f9500c41ee8cc68ff20313db4f4ee99928b970dc5be7392d5dd not found: ID does not exist" containerID="04a45b1cc49f4f9500c41ee8cc68ff20313db4f4ee99928b970dc5be7392d5dd" Dec 05 12:54:21.919060 master-0 kubenswrapper[29936]: I1205 12:54:21.919013 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04a45b1cc49f4f9500c41ee8cc68ff20313db4f4ee99928b970dc5be7392d5dd"} err="failed to get container status \"04a45b1cc49f4f9500c41ee8cc68ff20313db4f4ee99928b970dc5be7392d5dd\": rpc error: code = NotFound desc = could not find container \"04a45b1cc49f4f9500c41ee8cc68ff20313db4f4ee99928b970dc5be7392d5dd\": container with ID starting with 04a45b1cc49f4f9500c41ee8cc68ff20313db4f4ee99928b970dc5be7392d5dd not found: ID does not exist" Dec 05 12:54:22.489023 master-0 kubenswrapper[29936]: I1205 12:54:22.488913 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74ffd5f75f-slrkr"] Dec 05 12:54:22.495328 master-0 kubenswrapper[29936]: I1205 12:54:22.495246 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74ffd5f75f-slrkr"] Dec 05 12:54:23.194780 master-0 kubenswrapper[29936]: I1205 12:54:23.194718 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" path="/var/lib/kubelet/pods/5f5f6985-a4f8-467b-8277-4ea20bfc4570/volumes" Dec 05 12:54:26.105058 master-0 kubenswrapper[29936]: E1205 12:54:26.104935 29936 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-e63soeg91on8p: secret "metrics-server-e63soeg91on8p" not found Dec 05 12:54:26.105058 master-0 kubenswrapper[29936]: E1205 12:54:26.105101 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:54:58.105071559 +0000 UTC m=+295.237151240 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : secret "metrics-server-e63soeg91on8p" not found Dec 05 12:54:27.533154 master-0 kubenswrapper[29936]: I1205 12:54:27.533051 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:54:27.533821 master-0 kubenswrapper[29936]: I1205 12:54:27.533381 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:54:27.536697 master-0 kubenswrapper[29936]: I1205 12:54:27.536642 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:54:27.537450 master-0 kubenswrapper[29936]: I1205 12:54:27.537410 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/34b86c49-87d9-4167-899e-d070aff1dc10-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"34b86c49-87d9-4167-899e-d070aff1dc10\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:54:27.738286 master-0 kubenswrapper[29936]: I1205 12:54:27.738153 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:54:27.739219 master-0 kubenswrapper[29936]: I1205 12:54:27.739101 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:54:27.742567 master-0 kubenswrapper[29936]: I1205 12:54:27.742526 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d02714eb-260c-4004-9feb-3a6c524756aa-telemeter-client-tls\") pod \"telemeter-client-58cd4759dd-kpl86\" (UID: \"d02714eb-260c-4004-9feb-3a6c524756aa\") " pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:54:27.763591 master-0 kubenswrapper[29936]: I1205 12:54:27.763518 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" Dec 05 12:54:28.200841 master-0 kubenswrapper[29936]: W1205 12:54:28.200656 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34b86c49_87d9_4167_899e_d070aff1dc10.slice/crio-ca2a324c6c4e9e1f838a04cf1f07ca0b5c134f1f355b6aa0310783cb91c4c04b WatchSource:0}: Error finding container ca2a324c6c4e9e1f838a04cf1f07ca0b5c134f1f355b6aa0310783cb91c4c04b: Status 404 returned error can't find the container with id ca2a324c6c4e9e1f838a04cf1f07ca0b5c134f1f355b6aa0310783cb91c4c04b Dec 05 12:54:28.209419 master-0 kubenswrapper[29936]: I1205 12:54:28.203154 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 05 12:54:28.241136 master-0 kubenswrapper[29936]: I1205 12:54:28.241074 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-58cd4759dd-kpl86"] Dec 05 12:54:28.243868 master-0 kubenswrapper[29936]: W1205 12:54:28.243804 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd02714eb_260c_4004_9feb_3a6c524756aa.slice/crio-53f2a1f35ef01d219b42a29d2540ff11962f28fb12386d873fd777136fe22d79 WatchSource:0}: Error finding container 53f2a1f35ef01d219b42a29d2540ff11962f28fb12386d873fd777136fe22d79: Status 404 returned error can't find the container with id 53f2a1f35ef01d219b42a29d2540ff11962f28fb12386d873fd777136fe22d79 Dec 05 12:54:28.348908 master-0 kubenswrapper[29936]: I1205 12:54:28.348835 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:54:28.351950 master-0 kubenswrapper[29936]: I1205 12:54:28.351919 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b97e29e1-c1cf-4f1f-a530-094bcb24ab4c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c\") " pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:54:28.563117 master-0 kubenswrapper[29936]: I1205 12:54:28.562937 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 05 12:54:28.833680 master-0 kubenswrapper[29936]: I1205 12:54:28.833528 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 05 12:54:28.958367 master-0 kubenswrapper[29936]: I1205 12:54:28.958281 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c","Type":"ContainerStarted","Data":"67170118a00e5f4d4e5f01cafc5726686198ce0dda3db0b9c7a07964f184b66f"} Dec 05 12:54:28.960416 master-0 kubenswrapper[29936]: I1205 12:54:28.960328 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"34b86c49-87d9-4167-899e-d070aff1dc10","Type":"ContainerStarted","Data":"ca2a324c6c4e9e1f838a04cf1f07ca0b5c134f1f355b6aa0310783cb91c4c04b"} Dec 05 12:54:28.961691 master-0 kubenswrapper[29936]: I1205 12:54:28.961647 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" event={"ID":"d02714eb-260c-4004-9feb-3a6c524756aa","Type":"ContainerStarted","Data":"53f2a1f35ef01d219b42a29d2540ff11962f28fb12386d873fd777136fe22d79"} Dec 05 12:54:29.975783 master-0 kubenswrapper[29936]: I1205 12:54:29.975710 29936 generic.go:334] "Generic (PLEG): container finished" podID="b97e29e1-c1cf-4f1f-a530-094bcb24ab4c" containerID="0f2508841536a91d79ab873f7fa4a38d7ad8f641ec6a21ffe0ad2227c1ae5f20" exitCode=0 Dec 05 12:54:29.976404 master-0 kubenswrapper[29936]: I1205 12:54:29.975815 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c","Type":"ContainerDied","Data":"0f2508841536a91d79ab873f7fa4a38d7ad8f641ec6a21ffe0ad2227c1ae5f20"} Dec 05 12:54:29.980165 master-0 kubenswrapper[29936]: I1205 12:54:29.980077 29936 generic.go:334] "Generic (PLEG): container finished" podID="34b86c49-87d9-4167-899e-d070aff1dc10" containerID="328728209a2b81485730bbc28ebf7782cc489e65c968b5a280653e80f1f13272" exitCode=0 Dec 05 12:54:29.980305 master-0 kubenswrapper[29936]: I1205 12:54:29.980165 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"34b86c49-87d9-4167-899e-d070aff1dc10","Type":"ContainerDied","Data":"328728209a2b81485730bbc28ebf7782cc489e65c968b5a280653e80f1f13272"} Dec 05 12:54:33.005278 master-0 kubenswrapper[29936]: I1205 12:54:33.005163 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" event={"ID":"d02714eb-260c-4004-9feb-3a6c524756aa","Type":"ContainerStarted","Data":"abb3191c5ba7e7e21f0dc9f5a569bb644e7cbc905ddb946d162b478dff65dddd"} Dec 05 12:54:34.017964 master-0 kubenswrapper[29936]: I1205 12:54:34.017879 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c","Type":"ContainerStarted","Data":"78bb1fecdccdf781946c400c3303671f7ead7a41f4f937a48b9543bd3cf30a01"} Dec 05 12:54:34.017964 master-0 kubenswrapper[29936]: I1205 12:54:34.017945 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c","Type":"ContainerStarted","Data":"3d09226b1ca233ca3a7b46e895803203963fe7f6496d06f6b11fad934617b00b"} Dec 05 12:54:34.021235 master-0 kubenswrapper[29936]: I1205 12:54:34.021171 29936 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" event={"ID":"d02714eb-260c-4004-9feb-3a6c524756aa","Type":"ContainerStarted","Data":"106a59b26474259531a9cb3d3aed0e5a80557368c8403efa6f116094195f585f"} Dec 05 12:54:34.021235 master-0 kubenswrapper[29936]: I1205 12:54:34.021231 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" event={"ID":"d02714eb-260c-4004-9feb-3a6c524756aa","Type":"ContainerStarted","Data":"e5faa1bf0be5ea7bdd027045f71104ccdd91dc5c9c3dfc03f88c82255e423ec1"} Dec 05 12:54:34.430210 master-0 kubenswrapper[29936]: I1205 12:54:34.429914 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-58cd4759dd-kpl86" podStartSLOduration=35.74652693 podStartE2EDuration="39.429874428s" podCreationTimestamp="2025-12-05 12:53:55 +0000 UTC" firstStartedPulling="2025-12-05 12:54:28.247106549 +0000 UTC m=+265.379186230" lastFinishedPulling="2025-12-05 12:54:31.930454057 +0000 UTC m=+269.062533728" observedRunningTime="2025-12-05 12:54:34.422680578 +0000 UTC m=+271.554760279" watchObservedRunningTime="2025-12-05 12:54:34.429874428 +0000 UTC m=+271.561954129" Dec 05 12:54:35.031456 master-0 kubenswrapper[29936]: I1205 12:54:35.031390 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c","Type":"ContainerStarted","Data":"342a26e0ed0dc57db756a41aefde6bc070724fe544bbc9b8a7d52fd24d9c1d71"} Dec 05 12:54:35.033004 master-0 kubenswrapper[29936]: I1205 12:54:35.032907 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"34b86c49-87d9-4167-899e-d070aff1dc10","Type":"ContainerStarted","Data":"0021636e95730ac978888c6f3e57be9232abc9f866e4b7d13b906033c49285a6"} Dec 05 12:54:36.046416 master-0 kubenswrapper[29936]: I1205 12:54:36.046218 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c","Type":"ContainerStarted","Data":"c360a4e4155b2282ed210e4a8704e33c72741f64e2a646d4442ccc5e6d40237b"} Dec 05 12:54:36.049315 master-0 kubenswrapper[29936]: I1205 12:54:36.049281 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"34b86c49-87d9-4167-899e-d070aff1dc10","Type":"ContainerStarted","Data":"89b54fa07ff3cd16f06bec1a18e267b67cb86ce92baa0f3793050d431b6f796f"} Dec 05 12:54:36.581844 master-0 kubenswrapper[29936]: I1205 12:54:36.581783 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:54:36.592306 master-0 kubenswrapper[29936]: I1205 12:54:36.592140 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-b6876557c-zz5pl" Dec 05 12:54:36.708850 master-0 kubenswrapper[29936]: I1205 12:54:36.708763 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-76b95955f5-vk4wx"] Dec 05 12:54:36.709908 master-0 kubenswrapper[29936]: E1205 12:54:36.709242 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c37759-9414-4ca7-8bd4-20c4f689189b" containerName="monitoring-plugin" Dec 05 12:54:36.709908 master-0 kubenswrapper[29936]: I1205 12:54:36.709273 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c37759-9414-4ca7-8bd4-20c4f689189b" 
containerName="monitoring-plugin" Dec 05 12:54:36.709908 master-0 kubenswrapper[29936]: E1205 12:54:36.709297 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" Dec 05 12:54:36.709908 master-0 kubenswrapper[29936]: I1205 12:54:36.709307 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" Dec 05 12:54:36.709908 master-0 kubenswrapper[29936]: I1205 12:54:36.709556 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2c37759-9414-4ca7-8bd4-20c4f689189b" containerName="monitoring-plugin" Dec 05 12:54:36.709908 master-0 kubenswrapper[29936]: I1205 12:54:36.709600 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f5f6985-a4f8-467b-8277-4ea20bfc4570" containerName="console" Dec 05 12:54:36.711119 master-0 kubenswrapper[29936]: I1205 12:54:36.711080 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:36.750900 master-0 kubenswrapper[29936]: I1205 12:54:36.750437 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76b95955f5-vk4wx"] Dec 05 12:54:36.918241 master-0 kubenswrapper[29936]: I1205 12:54:36.915752 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svl7z\" (UniqueName: \"kubernetes.io/projected/9ba34a06-4873-4b03-a94b-055afbf70898-kube-api-access-svl7z\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:36.918241 master-0 kubenswrapper[29936]: I1205 12:54:36.915831 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-oauth-serving-cert\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:36.918241 master-0 kubenswrapper[29936]: I1205 12:54:36.915857 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-trusted-ca-bundle\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:36.918241 master-0 kubenswrapper[29936]: I1205 12:54:36.915873 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-service-ca\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:36.918241 master-0 kubenswrapper[29936]: I1205 12:54:36.915898 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ba34a06-4873-4b03-a94b-055afbf70898-console-serving-cert\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:36.918241 master-0 kubenswrapper[29936]: I1205 12:54:36.915918 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ba34a06-4873-4b03-a94b-055afbf70898-console-oauth-config\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:36.918241 master-0 kubenswrapper[29936]: I1205 12:54:36.915941 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-console-config\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:37.019219 master-0 kubenswrapper[29936]: I1205 12:54:37.019099 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-trusted-ca-bundle\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:37.019219 master-0 kubenswrapper[29936]: I1205 12:54:37.019195 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-service-ca\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:37.019219 master-0 kubenswrapper[29936]: I1205 12:54:37.019235 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ba34a06-4873-4b03-a94b-055afbf70898-console-serving-cert\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:37.019604 master-0 kubenswrapper[29936]: I1205 12:54:37.019263 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ba34a06-4873-4b03-a94b-055afbf70898-console-oauth-config\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:37.019604 master-0 kubenswrapper[29936]: I1205 12:54:37.019305 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-console-config\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:37.019604 master-0 kubenswrapper[29936]: I1205 12:54:37.019409 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svl7z\" (UniqueName: \"kubernetes.io/projected/9ba34a06-4873-4b03-a94b-055afbf70898-kube-api-access-svl7z\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:37.019604 master-0 kubenswrapper[29936]: I1205 12:54:37.019441 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-oauth-serving-cert\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" 
Dec 05 12:54:37.020643 master-0 kubenswrapper[29936]: I1205 12:54:37.020569 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-oauth-serving-cert\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:37.025214 master-0 kubenswrapper[29936]: I1205 12:54:37.021596 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-service-ca\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:37.025214 master-0 kubenswrapper[29936]: I1205 12:54:37.022117 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-console-config\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:37.025214 master-0 kubenswrapper[29936]: I1205 12:54:37.023030 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-trusted-ca-bundle\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:37.025214 master-0 kubenswrapper[29936]: I1205 12:54:37.024929 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ba34a06-4873-4b03-a94b-055afbf70898-console-oauth-config\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:37.026959 master-0 kubenswrapper[29936]: I1205 12:54:37.026888 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ba34a06-4873-4b03-a94b-055afbf70898-console-serving-cert\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:37.047246 master-0 kubenswrapper[29936]: I1205 12:54:37.045206 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svl7z\" (UniqueName: \"kubernetes.io/projected/9ba34a06-4873-4b03-a94b-055afbf70898-kube-api-access-svl7z\") pod \"console-76b95955f5-vk4wx\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:37.086230 master-0 kubenswrapper[29936]: I1205 12:54:37.086115 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c","Type":"ContainerStarted","Data":"2d11e5b3dd4f5ef2043199fde0f27ed14439ba91cb4aafcf991415bcbdb7b887"} Dec 05 12:54:37.086230 master-0 kubenswrapper[29936]: I1205 12:54:37.086225 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b97e29e1-c1cf-4f1f-a530-094bcb24ab4c","Type":"ContainerStarted","Data":"1e509719b2dbc6e4f3886b89454126e095174050bbfa34acbf5ba3c426c1aac4"} Dec 05 12:54:37.093212 master-0 kubenswrapper[29936]: I1205 12:54:37.092321 29936 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"34b86c49-87d9-4167-899e-d070aff1dc10","Type":"ContainerStarted","Data":"032b031e2f8e8097ba6a97886f9d181e9ec8fdf3c315cbb322ec2aac42f1f106"} Dec 05 12:54:37.093212 master-0 kubenswrapper[29936]: I1205 12:54:37.092399 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"34b86c49-87d9-4167-899e-d070aff1dc10","Type":"ContainerStarted","Data":"d25facd92577882249632d5dd364765343fea87549d70b88add73f3e1c5df771"} Dec 05 12:54:37.093212 master-0 kubenswrapper[29936]: I1205 12:54:37.092415 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"34b86c49-87d9-4167-899e-d070aff1dc10","Type":"ContainerStarted","Data":"9cef7e4707cf5829f12ad9ff31a32451de0c174f1ba5ba38349e830ef53ec915"} Dec 05 12:54:37.346207 master-0 kubenswrapper[29936]: I1205 12:54:37.345432 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:38.045504 master-0 kubenswrapper[29936]: I1205 12:54:38.045405 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=37.956656529 podStartE2EDuration="42.045385782s" podCreationTimestamp="2025-12-05 12:53:56 +0000 UTC" firstStartedPulling="2025-12-05 12:54:28.844801857 +0000 UTC m=+265.976881538" lastFinishedPulling="2025-12-05 12:54:32.93353111 +0000 UTC m=+270.065610791" observedRunningTime="2025-12-05 12:54:37.214056322 +0000 UTC m=+274.346136003" watchObservedRunningTime="2025-12-05 12:54:38.045385782 +0000 UTC m=+275.177465463" Dec 05 12:54:38.051338 master-0 kubenswrapper[29936]: I1205 12:54:38.051279 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76b95955f5-vk4wx"] Dec 05 12:54:38.055678 master-0 kubenswrapper[29936]: W1205 12:54:38.055591 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ba34a06_4873_4b03_a94b_055afbf70898.slice/crio-4782731ff97a7d89f1b63c757e6d2b309aca86030a5cd604c9336032a54442b7 WatchSource:0}: Error finding container 4782731ff97a7d89f1b63c757e6d2b309aca86030a5cd604c9336032a54442b7: Status 404 returned error can't find the container with id 4782731ff97a7d89f1b63c757e6d2b309aca86030a5cd604c9336032a54442b7 Dec 05 12:54:38.110589 master-0 kubenswrapper[29936]: I1205 12:54:38.110514 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b95955f5-vk4wx" event={"ID":"9ba34a06-4873-4b03-a94b-055afbf70898","Type":"ContainerStarted","Data":"4782731ff97a7d89f1b63c757e6d2b309aca86030a5cd604c9336032a54442b7"} Dec 05 12:54:38.116502 master-0 kubenswrapper[29936]: I1205 12:54:38.116424 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"34b86c49-87d9-4167-899e-d070aff1dc10","Type":"ContainerStarted","Data":"ec622bda6665a76419dcef7120b5787189daeb4ccb11636da0c22b0885a0aa18"} Dec 05 12:54:38.162831 master-0 kubenswrapper[29936]: I1205 12:54:38.162597 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=36.60308541 podStartE2EDuration="43.162574448s" podCreationTimestamp="2025-12-05 12:53:55 +0000 UTC" firstStartedPulling="2025-12-05 12:54:28.204249618 +0000 UTC m=+265.336329299" 
lastFinishedPulling="2025-12-05 12:54:34.763738656 +0000 UTC m=+271.895818337" observedRunningTime="2025-12-05 12:54:38.161149018 +0000 UTC m=+275.293228709" watchObservedRunningTime="2025-12-05 12:54:38.162574448 +0000 UTC m=+275.294654129" Dec 05 12:54:39.131532 master-0 kubenswrapper[29936]: I1205 12:54:39.131456 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b95955f5-vk4wx" event={"ID":"9ba34a06-4873-4b03-a94b-055afbf70898","Type":"ContainerStarted","Data":"f1b2eb8267907aa0d29e8a51321bba3dc60dff9552814df59e9790e9100b0169"} Dec 05 12:54:39.165933 master-0 kubenswrapper[29936]: I1205 12:54:39.164821 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76b95955f5-vk4wx" podStartSLOduration=3.164778276 podStartE2EDuration="3.164778276s" podCreationTimestamp="2025-12-05 12:54:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:54:39.15810636 +0000 UTC m=+276.290186041" watchObservedRunningTime="2025-12-05 12:54:39.164778276 +0000 UTC m=+276.296857947" Dec 05 12:54:42.745235 master-0 kubenswrapper[29936]: I1205 12:54:42.745113 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:54:47.347114 master-0 kubenswrapper[29936]: I1205 12:54:47.347001 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:47.347818 master-0 kubenswrapper[29936]: I1205 12:54:47.347138 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:47.355418 master-0 kubenswrapper[29936]: I1205 12:54:47.355322 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:48.226007 master-0 kubenswrapper[29936]: I1205 12:54:48.225898 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:54:48.334365 master-0 kubenswrapper[29936]: I1205 12:54:48.334288 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c9b8d8fb9-7pxzk"] Dec 05 12:54:58.137441 master-0 kubenswrapper[29936]: E1205 12:54:58.137384 29936 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-e63soeg91on8p: secret "metrics-server-e63soeg91on8p" not found Dec 05 12:54:58.139016 master-0 kubenswrapper[29936]: E1205 12:54:58.138938 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:56:02.138243368 +0000 UTC m=+359.270323089 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : secret "metrics-server-e63soeg91on8p" not found Dec 05 12:55:03.195529 master-0 kubenswrapper[29936]: I1205 12:55:03.195466 29936 kubelet.go:1505] "Image garbage collection succeeded" Dec 05 12:55:03.380728 master-0 kubenswrapper[29936]: I1205 12:55:03.380631 29936 scope.go:117] "RemoveContainer" containerID="ce5bd605cc76993bca2c497ff38423a9bcba04863edec782efc7ee32483a630a" Dec 05 12:55:03.403506 master-0 kubenswrapper[29936]: I1205 12:55:03.403456 29936 scope.go:117] "RemoveContainer" containerID="b24c1b8d78045ff86297a6b78ba71b900f89c5e046061babf21a495bd9bf95d3" Dec 05 12:55:03.430468 master-0 kubenswrapper[29936]: I1205 12:55:03.430400 29936 scope.go:117] "RemoveContainer" containerID="3287f56a58ec6df79eb961042eccb67f5309daab6cc145e4e1caa74cca9833e8" Dec 05 12:55:03.460299 master-0 kubenswrapper[29936]: I1205 12:55:03.460152 29936 scope.go:117] "RemoveContainer" containerID="2c505d1745e5c41c810aeede53577e7297a75c5a2221af8e371f406e5004dcbf" Dec 05 12:55:03.478854 master-0 kubenswrapper[29936]: I1205 12:55:03.478798 29936 scope.go:117] "RemoveContainer" containerID="ba110a7b76ad288df7047b8cf5908c2bd3487d9f6a715466f139c0f2eb3f27da" Dec 05 12:55:13.379621 master-0 kubenswrapper[29936]: I1205 12:55:13.379509 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-c9b8d8fb9-7pxzk" podUID="ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" containerName="console" containerID="cri-o://a7751470f31b897af34fb3b43f1cd31ed859bb0e58cce9aac5f1c2e057881026" gracePeriod=15 Dec 05 12:55:13.835021 master-0 kubenswrapper[29936]: I1205 12:55:13.834956 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c9b8d8fb9-7pxzk_ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73/console/0.log" Dec 05 12:55:13.835490 master-0 kubenswrapper[29936]: I1205 12:55:13.835475 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:55:13.962461 master-0 kubenswrapper[29936]: I1205 12:55:13.962371 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-service-ca\") pod \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " Dec 05 12:55:13.962881 master-0 kubenswrapper[29936]: I1205 12:55:13.962516 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-oauth-serving-cert\") pod \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " Dec 05 12:55:13.962881 master-0 kubenswrapper[29936]: I1205 12:55:13.962608 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-config\") pod \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " Dec 05 12:55:13.962881 master-0 kubenswrapper[29936]: I1205 12:55:13.962692 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-oauth-config\") pod \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " Dec 05 12:55:13.962881 master-0 kubenswrapper[29936]: I1205 12:55:13.962850 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-trusted-ca-bundle\") pod \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " Dec 05 12:55:13.963153 master-0 kubenswrapper[29936]: I1205 12:55:13.962914 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9wrr\" (UniqueName: \"kubernetes.io/projected/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-kube-api-access-t9wrr\") pod \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " Dec 05 12:55:13.963153 master-0 kubenswrapper[29936]: I1205 12:55:13.962953 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-serving-cert\") pod \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\" (UID: \"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73\") " Dec 05 12:55:13.964164 master-0 kubenswrapper[29936]: I1205 12:55:13.964055 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" (UID: "ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:55:13.964164 master-0 kubenswrapper[29936]: I1205 12:55:13.964129 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" (UID: "ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73"). 
InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:55:13.964720 master-0 kubenswrapper[29936]: I1205 12:55:13.964615 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-service-ca" (OuterVolumeSpecName: "service-ca") pod "ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" (UID: "ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:55:13.964981 master-0 kubenswrapper[29936]: I1205 12:55:13.964948 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-config" (OuterVolumeSpecName: "console-config") pod "ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" (UID: "ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:55:13.968446 master-0 kubenswrapper[29936]: I1205 12:55:13.968375 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-kube-api-access-t9wrr" (OuterVolumeSpecName: "kube-api-access-t9wrr") pod "ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" (UID: "ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73"). InnerVolumeSpecName "kube-api-access-t9wrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:55:13.969580 master-0 kubenswrapper[29936]: I1205 12:55:13.969528 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" (UID: "ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:55:13.969782 master-0 kubenswrapper[29936]: I1205 12:55:13.969743 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" (UID: "ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:55:14.065790 master-0 kubenswrapper[29936]: I1205 12:55:14.065571 29936 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:55:14.065790 master-0 kubenswrapper[29936]: I1205 12:55:14.065666 29936 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 12:55:14.065790 master-0 kubenswrapper[29936]: I1205 12:55:14.065690 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9wrr\" (UniqueName: \"kubernetes.io/projected/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-kube-api-access-t9wrr\") on node \"master-0\" DevicePath \"\"" Dec 05 12:55:14.065790 master-0 kubenswrapper[29936]: I1205 12:55:14.065716 29936 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:55:14.065790 master-0 kubenswrapper[29936]: I1205 12:55:14.065744 29936 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:55:14.065790 master-0 kubenswrapper[29936]: I1205 12:55:14.065770 29936 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:55:14.065790 master-0 kubenswrapper[29936]: I1205 12:55:14.065792 29936 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73-console-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:55:14.458810 master-0 kubenswrapper[29936]: I1205 12:55:14.458726 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c9b8d8fb9-7pxzk_ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73/console/0.log" Dec 05 12:55:14.458810 master-0 kubenswrapper[29936]: I1205 12:55:14.458814 29936 generic.go:334] "Generic (PLEG): container finished" podID="ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" containerID="a7751470f31b897af34fb3b43f1cd31ed859bb0e58cce9aac5f1c2e057881026" exitCode=2 Dec 05 12:55:14.460089 master-0 kubenswrapper[29936]: I1205 12:55:14.458865 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c9b8d8fb9-7pxzk" event={"ID":"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73","Type":"ContainerDied","Data":"a7751470f31b897af34fb3b43f1cd31ed859bb0e58cce9aac5f1c2e057881026"} Dec 05 12:55:14.460089 master-0 kubenswrapper[29936]: I1205 12:55:14.458927 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c9b8d8fb9-7pxzk" event={"ID":"ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73","Type":"ContainerDied","Data":"d415ce7aad1b9f977544ee6fda9f28aacf11572f4cd6fe849747fe5679c38b34"} Dec 05 12:55:14.460089 master-0 kubenswrapper[29936]: I1205 12:55:14.458982 29936 scope.go:117] "RemoveContainer" containerID="a7751470f31b897af34fb3b43f1cd31ed859bb0e58cce9aac5f1c2e057881026" Dec 05 12:55:14.460089 master-0 kubenswrapper[29936]: I1205 12:55:14.458998 29936 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c9b8d8fb9-7pxzk" Dec 05 12:55:14.483931 master-0 kubenswrapper[29936]: I1205 12:55:14.483860 29936 scope.go:117] "RemoveContainer" containerID="a7751470f31b897af34fb3b43f1cd31ed859bb0e58cce9aac5f1c2e057881026" Dec 05 12:55:14.484555 master-0 kubenswrapper[29936]: E1205 12:55:14.484498 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7751470f31b897af34fb3b43f1cd31ed859bb0e58cce9aac5f1c2e057881026\": container with ID starting with a7751470f31b897af34fb3b43f1cd31ed859bb0e58cce9aac5f1c2e057881026 not found: ID does not exist" containerID="a7751470f31b897af34fb3b43f1cd31ed859bb0e58cce9aac5f1c2e057881026" Dec 05 12:55:14.484623 master-0 kubenswrapper[29936]: I1205 12:55:14.484554 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7751470f31b897af34fb3b43f1cd31ed859bb0e58cce9aac5f1c2e057881026"} err="failed to get container status \"a7751470f31b897af34fb3b43f1cd31ed859bb0e58cce9aac5f1c2e057881026\": rpc error: code = NotFound desc = could not find container \"a7751470f31b897af34fb3b43f1cd31ed859bb0e58cce9aac5f1c2e057881026\": container with ID starting with a7751470f31b897af34fb3b43f1cd31ed859bb0e58cce9aac5f1c2e057881026 not found: ID does not exist" Dec 05 12:55:14.512610 master-0 kubenswrapper[29936]: I1205 12:55:14.512520 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c9b8d8fb9-7pxzk"] Dec 05 12:55:14.522296 master-0 kubenswrapper[29936]: I1205 12:55:14.522209 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c9b8d8fb9-7pxzk"] Dec 05 12:55:15.197023 master-0 kubenswrapper[29936]: I1205 12:55:15.196923 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" path="/var/lib/kubelet/pods/ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73/volumes" Dec 05 12:55:27.739953 master-0 kubenswrapper[29936]: I1205 12:55:27.739837 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:55:27.783162 master-0 kubenswrapper[29936]: I1205 12:55:27.783069 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:55:28.675289 master-0 kubenswrapper[29936]: I1205 12:55:28.675161 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Dec 05 12:55:37.373942 master-0 kubenswrapper[29936]: I1205 12:55:37.373861 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b44b57fdf-jcmhk"] Dec 05 12:55:37.374787 master-0 kubenswrapper[29936]: E1205 12:55:37.374267 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" containerName="console" Dec 05 12:55:37.374787 master-0 kubenswrapper[29936]: I1205 12:55:37.374287 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" containerName="console" Dec 05 12:55:37.374787 master-0 kubenswrapper[29936]: I1205 12:55:37.374449 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed3f0eab-f26c-4b16-9d01-a6fd7e4bce73" containerName="console" Dec 05 12:55:37.375085 master-0 kubenswrapper[29936]: I1205 12:55:37.375053 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.402583 master-0 kubenswrapper[29936]: I1205 12:55:37.401716 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b44b57fdf-jcmhk"] Dec 05 12:55:37.437519 master-0 kubenswrapper[29936]: I1205 12:55:37.435488 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-oauth-config\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.437519 master-0 kubenswrapper[29936]: I1205 12:55:37.435652 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-oauth-serving-cert\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.437519 master-0 kubenswrapper[29936]: I1205 12:55:37.435728 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-service-ca\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.437519 master-0 kubenswrapper[29936]: I1205 12:55:37.435779 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-config\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.437519 master-0 kubenswrapper[29936]: I1205 12:55:37.435889 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67pcc\" (UniqueName: \"kubernetes.io/projected/49ebc188-493c-4edf-8c0f-3854429fbbdc-kube-api-access-67pcc\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.437519 master-0 kubenswrapper[29936]: I1205 12:55:37.435982 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-trusted-ca-bundle\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.437519 master-0 kubenswrapper[29936]: I1205 12:55:37.436014 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-serving-cert\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.538036 master-0 kubenswrapper[29936]: I1205 12:55:37.537962 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-service-ca\") pod \"console-6b44b57fdf-jcmhk\" (UID: 
\"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.538347 master-0 kubenswrapper[29936]: I1205 12:55:37.538058 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-config\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.538347 master-0 kubenswrapper[29936]: I1205 12:55:37.538107 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67pcc\" (UniqueName: \"kubernetes.io/projected/49ebc188-493c-4edf-8c0f-3854429fbbdc-kube-api-access-67pcc\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.538347 master-0 kubenswrapper[29936]: I1205 12:55:37.538162 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-trusted-ca-bundle\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.538347 master-0 kubenswrapper[29936]: I1205 12:55:37.538211 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-serving-cert\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.538347 master-0 kubenswrapper[29936]: I1205 12:55:37.538260 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-oauth-config\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.538347 master-0 kubenswrapper[29936]: I1205 12:55:37.538315 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-oauth-serving-cert\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.539440 master-0 kubenswrapper[29936]: I1205 12:55:37.539405 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-oauth-serving-cert\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.539524 master-0 kubenswrapper[29936]: I1205 12:55:37.539408 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-service-ca\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.539672 master-0 kubenswrapper[29936]: I1205 12:55:37.539630 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-config\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.542251 master-0 kubenswrapper[29936]: I1205 12:55:37.541080 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-trusted-ca-bundle\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.546194 master-0 kubenswrapper[29936]: I1205 12:55:37.544017 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-oauth-config\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.546194 master-0 kubenswrapper[29936]: I1205 12:55:37.545017 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-serving-cert\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.557330 master-0 kubenswrapper[29936]: I1205 12:55:37.557264 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67pcc\" (UniqueName: \"kubernetes.io/projected/49ebc188-493c-4edf-8c0f-3854429fbbdc-kube-api-access-67pcc\") pod \"console-6b44b57fdf-jcmhk\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.705023 master-0 kubenswrapper[29936]: I1205 12:55:37.704927 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:37.930491 master-0 kubenswrapper[29936]: I1205 12:55:37.928702 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b44b57fdf-jcmhk"] Dec 05 12:55:37.968842 master-0 kubenswrapper[29936]: I1205 12:55:37.968662 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c4c944c57-cbgzt"] Dec 05 12:55:37.970603 master-0 kubenswrapper[29936]: I1205 12:55:37.970568 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:37.981830 master-0 kubenswrapper[29936]: I1205 12:55:37.981504 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c4c944c57-cbgzt"] Dec 05 12:55:38.047671 master-0 kubenswrapper[29936]: I1205 12:55:38.047597 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-service-ca\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.047958 master-0 kubenswrapper[29936]: I1205 12:55:38.047822 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c779b4b-b368-4573-9502-17ea8fc60aac-console-serving-cert\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.048248 master-0 kubenswrapper[29936]: I1205 12:55:38.048194 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-console-config\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.048248 master-0 kubenswrapper[29936]: I1205 12:55:38.048234 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-oauth-serving-cert\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.048493 master-0 kubenswrapper[29936]: I1205 12:55:38.048416 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rf24\" (UniqueName: \"kubernetes.io/projected/5c779b4b-b368-4573-9502-17ea8fc60aac-kube-api-access-6rf24\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.048912 master-0 kubenswrapper[29936]: I1205 12:55:38.048621 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-trusted-ca-bundle\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.048912 master-0 kubenswrapper[29936]: I1205 12:55:38.048695 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c779b4b-b368-4573-9502-17ea8fc60aac-console-oauth-config\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.150050 master-0 kubenswrapper[29936]: I1205 12:55:38.149974 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c779b4b-b368-4573-9502-17ea8fc60aac-console-oauth-config\") pod 
\"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.150359 master-0 kubenswrapper[29936]: I1205 12:55:38.150233 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-service-ca\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.150359 master-0 kubenswrapper[29936]: I1205 12:55:38.150277 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c779b4b-b368-4573-9502-17ea8fc60aac-console-serving-cert\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.150470 master-0 kubenswrapper[29936]: I1205 12:55:38.150362 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-console-config\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.150470 master-0 kubenswrapper[29936]: I1205 12:55:38.150427 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-oauth-serving-cert\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.150470 master-0 kubenswrapper[29936]: I1205 12:55:38.150462 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rf24\" (UniqueName: \"kubernetes.io/projected/5c779b4b-b368-4573-9502-17ea8fc60aac-kube-api-access-6rf24\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.150771 master-0 kubenswrapper[29936]: I1205 12:55:38.150729 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-trusted-ca-bundle\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.151461 master-0 kubenswrapper[29936]: I1205 12:55:38.151423 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-service-ca\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.151687 master-0 kubenswrapper[29936]: I1205 12:55:38.151636 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-console-config\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.151762 master-0 kubenswrapper[29936]: I1205 12:55:38.151660 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-oauth-serving-cert\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.153697 master-0 kubenswrapper[29936]: I1205 12:55:38.153642 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c779b4b-b368-4573-9502-17ea8fc60aac-console-oauth-config\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.154809 master-0 kubenswrapper[29936]: I1205 12:55:38.154746 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-trusted-ca-bundle\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.154976 master-0 kubenswrapper[29936]: I1205 12:55:38.154929 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c779b4b-b368-4573-9502-17ea8fc60aac-console-serving-cert\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.170742 master-0 kubenswrapper[29936]: I1205 12:55:38.170685 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rf24\" (UniqueName: \"kubernetes.io/projected/5c779b4b-b368-4573-9502-17ea8fc60aac-kube-api-access-6rf24\") pod \"console-5c4c944c57-cbgzt\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.172441 master-0 kubenswrapper[29936]: I1205 12:55:38.172376 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b44b57fdf-jcmhk"] Dec 05 12:55:38.177089 master-0 kubenswrapper[29936]: W1205 12:55:38.177043 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49ebc188_493c_4edf_8c0f_3854429fbbdc.slice/crio-11d6715764b699aac40d6ac360abb7b5556eb04b2cc12afc57c509c737f24243 WatchSource:0}: Error finding container 11d6715764b699aac40d6ac360abb7b5556eb04b2cc12afc57c509c737f24243: Status 404 returned error can't find the container with id 11d6715764b699aac40d6ac360abb7b5556eb04b2cc12afc57c509c737f24243 Dec 05 12:55:38.299672 master-0 kubenswrapper[29936]: I1205 12:55:38.299614 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:38.705038 master-0 kubenswrapper[29936]: I1205 12:55:38.704972 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b44b57fdf-jcmhk" event={"ID":"49ebc188-493c-4edf-8c0f-3854429fbbdc","Type":"ContainerStarted","Data":"e5bfa5a807bebcb8438604664026e8022d5219ac9d6da1932b5fd96c5d59c16f"} Dec 05 12:55:38.705038 master-0 kubenswrapper[29936]: I1205 12:55:38.705040 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b44b57fdf-jcmhk" event={"ID":"49ebc188-493c-4edf-8c0f-3854429fbbdc","Type":"ContainerStarted","Data":"11d6715764b699aac40d6ac360abb7b5556eb04b2cc12afc57c509c737f24243"} Dec 05 12:55:38.741858 master-0 kubenswrapper[29936]: I1205 12:55:38.740774 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b44b57fdf-jcmhk" podStartSLOduration=1.740739154 podStartE2EDuration="1.740739154s" podCreationTimestamp="2025-12-05 12:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:55:38.728839202 +0000 UTC m=+335.860918893" watchObservedRunningTime="2025-12-05 12:55:38.740739154 +0000 UTC m=+335.872818875" Dec 05 12:55:38.850699 master-0 kubenswrapper[29936]: I1205 12:55:38.850624 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c4c944c57-cbgzt"] Dec 05 12:55:39.716222 master-0 kubenswrapper[29936]: I1205 12:55:39.716109 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c4c944c57-cbgzt" event={"ID":"5c779b4b-b368-4573-9502-17ea8fc60aac","Type":"ContainerStarted","Data":"2b70f89cbf491ef83a5dfaec4f3b7f047c49a3a50041133350f4e7c8b8132f0e"} Dec 05 12:55:39.716897 master-0 kubenswrapper[29936]: I1205 12:55:39.716253 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c4c944c57-cbgzt" event={"ID":"5c779b4b-b368-4573-9502-17ea8fc60aac","Type":"ContainerStarted","Data":"e2fb0775164b3ff160de6c0f50b7f54076b8ce3a94bff44fa5df6930dfa34de1"} Dec 05 12:55:39.749607 master-0 kubenswrapper[29936]: I1205 12:55:39.749511 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c4c944c57-cbgzt" podStartSLOduration=2.749484063 podStartE2EDuration="2.749484063s" podCreationTimestamp="2025-12-05 12:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:55:39.741772959 +0000 UTC m=+336.873852660" watchObservedRunningTime="2025-12-05 12:55:39.749484063 +0000 UTC m=+336.881563744" Dec 05 12:55:43.781571 master-0 kubenswrapper[29936]: I1205 12:55:43.781493 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"] Dec 05 12:55:43.785609 master-0 kubenswrapper[29936]: I1205 12:55:43.785558 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Dec 05 12:55:43.791261 master-0 kubenswrapper[29936]: I1205 12:55:43.788811 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 05 12:55:43.791261 master-0 kubenswrapper[29936]: I1205 12:55:43.789782 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kkk9n" Dec 05 12:55:43.792343 master-0 kubenswrapper[29936]: I1205 12:55:43.792293 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"] Dec 05 12:55:43.877050 master-0 kubenswrapper[29936]: I1205 12:55:43.876963 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-var-lock\") pod \"installer-5-master-0\" (UID: \"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Dec 05 12:55:43.877357 master-0 kubenswrapper[29936]: I1205 12:55:43.877094 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-kube-api-access\") pod \"installer-5-master-0\" (UID: \"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Dec 05 12:55:43.877645 master-0 kubenswrapper[29936]: I1205 12:55:43.877552 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Dec 05 12:55:43.979300 master-0 kubenswrapper[29936]: I1205 12:55:43.979157 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Dec 05 12:55:43.979300 master-0 kubenswrapper[29936]: I1205 12:55:43.979266 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Dec 05 12:55:43.979656 master-0 kubenswrapper[29936]: I1205 12:55:43.979337 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-var-lock\") pod \"installer-5-master-0\" (UID: \"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Dec 05 12:55:43.979656 master-0 kubenswrapper[29936]: I1205 12:55:43.979420 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-kube-api-access\") pod \"installer-5-master-0\" (UID: \"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb\") " 
pod="openshift-kube-controller-manager/installer-5-master-0" Dec 05 12:55:43.979656 master-0 kubenswrapper[29936]: I1205 12:55:43.979494 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-var-lock\") pod \"installer-5-master-0\" (UID: \"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Dec 05 12:55:43.996490 master-0 kubenswrapper[29936]: I1205 12:55:43.996439 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-kube-api-access\") pod \"installer-5-master-0\" (UID: \"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb\") " pod="openshift-kube-controller-manager/installer-5-master-0" Dec 05 12:55:44.115784 master-0 kubenswrapper[29936]: I1205 12:55:44.115629 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Dec 05 12:55:44.633598 master-0 kubenswrapper[29936]: I1205 12:55:44.633520 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"] Dec 05 12:55:44.755787 master-0 kubenswrapper[29936]: I1205 12:55:44.755725 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb","Type":"ContainerStarted","Data":"8b9f0205682424862e6b6dc02fa7919cdbaffe2375b186b7aea8ee2f9ac432ba"} Dec 05 12:55:45.767287 master-0 kubenswrapper[29936]: I1205 12:55:45.767106 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb","Type":"ContainerStarted","Data":"518b499d058096d1c8b316aa26dc8e71f11cac93ae5fecc6eb7f81c99cbcc957"} Dec 05 12:55:46.053058 master-0 kubenswrapper[29936]: I1205 12:55:46.052820 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-5-master-0" podStartSLOduration=3.052791312 podStartE2EDuration="3.052791312s" podCreationTimestamp="2025-12-05 12:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:55:46.048500942 +0000 UTC m=+343.180580633" watchObservedRunningTime="2025-12-05 12:55:46.052791312 +0000 UTC m=+343.184870993" Dec 05 12:55:47.705747 master-0 kubenswrapper[29936]: I1205 12:55:47.705649 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:55:48.300593 master-0 kubenswrapper[29936]: I1205 12:55:48.300516 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:48.300851 master-0 kubenswrapper[29936]: I1205 12:55:48.300806 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:48.305793 master-0 kubenswrapper[29936]: I1205 12:55:48.305711 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 12:55:48.795570 master-0 kubenswrapper[29936]: I1205 12:55:48.795458 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 
12:55:52.467837 master-0 kubenswrapper[29936]: I1205 12:55:52.467627 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76b95955f5-vk4wx"] Dec 05 12:56:02.200340 master-0 kubenswrapper[29936]: E1205 12:56:02.200251 29936 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-e63soeg91on8p: secret "metrics-server-e63soeg91on8p" not found Dec 05 12:56:02.201557 master-0 kubenswrapper[29936]: E1205 12:56:02.200430 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle podName:a5338041-f213-46ef-9d81-248567ba958d nodeName:}" failed. No retries permitted until 2025-12-05 12:58:04.200388475 +0000 UTC m=+481.332468186 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle") pod "metrics-server-54c5748c8c-kqs7s" (UID: "a5338041-f213-46ef-9d81-248567ba958d") : secret "metrics-server-e63soeg91on8p" not found Dec 05 12:56:03.548021 master-0 kubenswrapper[29936]: I1205 12:56:03.547929 29936 scope.go:117] "RemoveContainer" containerID="8d14f1413c8e8a2ef6cd9ab523725814ba9ff7a6021dd1c6a68ef759cfabfdf3" Dec 05 12:56:03.574613 master-0 kubenswrapper[29936]: I1205 12:56:03.574535 29936 scope.go:117] "RemoveContainer" containerID="91dbe5959251acff62db45931eb5a5e1e4e7af9bb363ef308eee803d4237a389" Dec 05 12:56:03.593365 master-0 kubenswrapper[29936]: I1205 12:56:03.593305 29936 scope.go:117] "RemoveContainer" containerID="0a16bc5dbf4947d3592d7a160d069d5ae407c8eecca6478799c03089401c073c" Dec 05 12:56:03.612589 master-0 kubenswrapper[29936]: I1205 12:56:03.612534 29936 scope.go:117] "RemoveContainer" containerID="1b3283d0fac22ca78f337b1d5e3afe8d01431a02a7bb6f2fb90c61b14196aefb" Dec 05 12:56:03.745876 master-0 kubenswrapper[29936]: I1205 12:56:03.745767 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6b44b57fdf-jcmhk" podUID="49ebc188-493c-4edf-8c0f-3854429fbbdc" containerName="console" containerID="cri-o://e5bfa5a807bebcb8438604664026e8022d5219ac9d6da1932b5fd96c5d59c16f" gracePeriod=15 Dec 05 12:56:03.913099 master-0 kubenswrapper[29936]: I1205 12:56:03.913039 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b44b57fdf-jcmhk_49ebc188-493c-4edf-8c0f-3854429fbbdc/console/0.log" Dec 05 12:56:03.913099 master-0 kubenswrapper[29936]: I1205 12:56:03.913101 29936 generic.go:334] "Generic (PLEG): container finished" podID="49ebc188-493c-4edf-8c0f-3854429fbbdc" containerID="e5bfa5a807bebcb8438604664026e8022d5219ac9d6da1932b5fd96c5d59c16f" exitCode=2 Dec 05 12:56:03.913497 master-0 kubenswrapper[29936]: I1205 12:56:03.913145 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b44b57fdf-jcmhk" event={"ID":"49ebc188-493c-4edf-8c0f-3854429fbbdc","Type":"ContainerDied","Data":"e5bfa5a807bebcb8438604664026e8022d5219ac9d6da1932b5fd96c5d59c16f"} Dec 05 12:56:04.198550 master-0 kubenswrapper[29936]: I1205 12:56:04.198470 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b44b57fdf-jcmhk_49ebc188-493c-4edf-8c0f-3854429fbbdc/console/0.log" Dec 05 12:56:04.198841 master-0 kubenswrapper[29936]: I1205 12:56:04.198589 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:56:04.344963 master-0 kubenswrapper[29936]: I1205 12:56:04.344701 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-config\") pod \"49ebc188-493c-4edf-8c0f-3854429fbbdc\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " Dec 05 12:56:04.344963 master-0 kubenswrapper[29936]: I1205 12:56:04.344803 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-service-ca\") pod \"49ebc188-493c-4edf-8c0f-3854429fbbdc\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " Dec 05 12:56:04.345400 master-0 kubenswrapper[29936]: I1205 12:56:04.345037 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-oauth-config\") pod \"49ebc188-493c-4edf-8c0f-3854429fbbdc\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " Dec 05 12:56:04.345400 master-0 kubenswrapper[29936]: I1205 12:56:04.345129 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67pcc\" (UniqueName: \"kubernetes.io/projected/49ebc188-493c-4edf-8c0f-3854429fbbdc-kube-api-access-67pcc\") pod \"49ebc188-493c-4edf-8c0f-3854429fbbdc\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " Dec 05 12:56:04.345400 master-0 kubenswrapper[29936]: I1205 12:56:04.345208 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-serving-cert\") pod \"49ebc188-493c-4edf-8c0f-3854429fbbdc\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " Dec 05 12:56:04.345400 master-0 kubenswrapper[29936]: I1205 12:56:04.345236 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-trusted-ca-bundle\") pod \"49ebc188-493c-4edf-8c0f-3854429fbbdc\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " Dec 05 12:56:04.345400 master-0 kubenswrapper[29936]: I1205 12:56:04.345275 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-oauth-serving-cert\") pod \"49ebc188-493c-4edf-8c0f-3854429fbbdc\" (UID: \"49ebc188-493c-4edf-8c0f-3854429fbbdc\") " Dec 05 12:56:04.345400 master-0 kubenswrapper[29936]: I1205 12:56:04.345350 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-config" (OuterVolumeSpecName: "console-config") pod "49ebc188-493c-4edf-8c0f-3854429fbbdc" (UID: "49ebc188-493c-4edf-8c0f-3854429fbbdc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:56:04.345400 master-0 kubenswrapper[29936]: I1205 12:56:04.345345 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-service-ca" (OuterVolumeSpecName: "service-ca") pod "49ebc188-493c-4edf-8c0f-3854429fbbdc" (UID: "49ebc188-493c-4edf-8c0f-3854429fbbdc"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:56:04.347050 master-0 kubenswrapper[29936]: I1205 12:56:04.345718 29936 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:04.347050 master-0 kubenswrapper[29936]: I1205 12:56:04.345741 29936 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:04.347050 master-0 kubenswrapper[29936]: I1205 12:56:04.345899 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "49ebc188-493c-4edf-8c0f-3854429fbbdc" (UID: "49ebc188-493c-4edf-8c0f-3854429fbbdc"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:56:04.347050 master-0 kubenswrapper[29936]: I1205 12:56:04.346006 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "49ebc188-493c-4edf-8c0f-3854429fbbdc" (UID: "49ebc188-493c-4edf-8c0f-3854429fbbdc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:56:04.349033 master-0 kubenswrapper[29936]: I1205 12:56:04.348994 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "49ebc188-493c-4edf-8c0f-3854429fbbdc" (UID: "49ebc188-493c-4edf-8c0f-3854429fbbdc"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:56:04.349167 master-0 kubenswrapper[29936]: I1205 12:56:04.349118 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ebc188-493c-4edf-8c0f-3854429fbbdc-kube-api-access-67pcc" (OuterVolumeSpecName: "kube-api-access-67pcc") pod "49ebc188-493c-4edf-8c0f-3854429fbbdc" (UID: "49ebc188-493c-4edf-8c0f-3854429fbbdc"). InnerVolumeSpecName "kube-api-access-67pcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:56:04.349311 master-0 kubenswrapper[29936]: I1205 12:56:04.349272 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "49ebc188-493c-4edf-8c0f-3854429fbbdc" (UID: "49ebc188-493c-4edf-8c0f-3854429fbbdc"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:56:04.447427 master-0 kubenswrapper[29936]: I1205 12:56:04.447343 29936 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:04.447427 master-0 kubenswrapper[29936]: I1205 12:56:04.447404 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67pcc\" (UniqueName: \"kubernetes.io/projected/49ebc188-493c-4edf-8c0f-3854429fbbdc-kube-api-access-67pcc\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:04.447427 master-0 kubenswrapper[29936]: I1205 12:56:04.447418 29936 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49ebc188-493c-4edf-8c0f-3854429fbbdc-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:04.447427 master-0 kubenswrapper[29936]: I1205 12:56:04.447432 29936 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:04.447427 master-0 kubenswrapper[29936]: I1205 12:56:04.447445 29936 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49ebc188-493c-4edf-8c0f-3854429fbbdc-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:04.925322 master-0 kubenswrapper[29936]: I1205 12:56:04.925106 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b44b57fdf-jcmhk_49ebc188-493c-4edf-8c0f-3854429fbbdc/console/0.log" Dec 05 12:56:04.925322 master-0 kubenswrapper[29936]: I1205 12:56:04.925232 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b44b57fdf-jcmhk" event={"ID":"49ebc188-493c-4edf-8c0f-3854429fbbdc","Type":"ContainerDied","Data":"11d6715764b699aac40d6ac360abb7b5556eb04b2cc12afc57c509c737f24243"} Dec 05 12:56:04.925322 master-0 kubenswrapper[29936]: I1205 12:56:04.925256 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b44b57fdf-jcmhk" Dec 05 12:56:04.925322 master-0 kubenswrapper[29936]: I1205 12:56:04.925290 29936 scope.go:117] "RemoveContainer" containerID="e5bfa5a807bebcb8438604664026e8022d5219ac9d6da1932b5fd96c5d59c16f" Dec 05 12:56:04.983947 master-0 kubenswrapper[29936]: I1205 12:56:04.983569 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b44b57fdf-jcmhk"] Dec 05 12:56:05.001005 master-0 kubenswrapper[29936]: I1205 12:56:04.999926 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b44b57fdf-jcmhk"] Dec 05 12:56:05.196935 master-0 kubenswrapper[29936]: I1205 12:56:05.196845 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ebc188-493c-4edf-8c0f-3854429fbbdc" path="/var/lib/kubelet/pods/49ebc188-493c-4edf-8c0f-3854429fbbdc/volumes" Dec 05 12:56:17.860522 master-0 kubenswrapper[29936]: I1205 12:56:17.860404 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-76b95955f5-vk4wx" podUID="9ba34a06-4873-4b03-a94b-055afbf70898" containerName="console" containerID="cri-o://f1b2eb8267907aa0d29e8a51321bba3dc60dff9552814df59e9790e9100b0169" gracePeriod=15 Dec 05 12:56:18.041948 master-0 kubenswrapper[29936]: I1205 12:56:18.041869 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76b95955f5-vk4wx_9ba34a06-4873-4b03-a94b-055afbf70898/console/0.log" Dec 05 12:56:18.041948 master-0 kubenswrapper[29936]: I1205 12:56:18.041943 29936 generic.go:334] "Generic (PLEG): container finished" podID="9ba34a06-4873-4b03-a94b-055afbf70898" containerID="f1b2eb8267907aa0d29e8a51321bba3dc60dff9552814df59e9790e9100b0169" exitCode=2 Dec 05 12:56:18.042300 master-0 kubenswrapper[29936]: I1205 12:56:18.041992 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b95955f5-vk4wx" event={"ID":"9ba34a06-4873-4b03-a94b-055afbf70898","Type":"ContainerDied","Data":"f1b2eb8267907aa0d29e8a51321bba3dc60dff9552814df59e9790e9100b0169"} Dec 05 12:56:18.350481 master-0 kubenswrapper[29936]: I1205 12:56:18.350408 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76b95955f5-vk4wx_9ba34a06-4873-4b03-a94b-055afbf70898/console/0.log" Dec 05 12:56:18.350877 master-0 kubenswrapper[29936]: I1205 12:56:18.350538 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:56:18.452689 master-0 kubenswrapper[29936]: I1205 12:56:18.452611 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-console-config\") pod \"9ba34a06-4873-4b03-a94b-055afbf70898\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " Dec 05 12:56:18.453331 master-0 kubenswrapper[29936]: I1205 12:56:18.453297 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-console-config" (OuterVolumeSpecName: "console-config") pod "9ba34a06-4873-4b03-a94b-055afbf70898" (UID: "9ba34a06-4873-4b03-a94b-055afbf70898"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:56:18.455268 master-0 kubenswrapper[29936]: I1205 12:56:18.455149 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ba34a06-4873-4b03-a94b-055afbf70898-console-oauth-config\") pod \"9ba34a06-4873-4b03-a94b-055afbf70898\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " Dec 05 12:56:18.455399 master-0 kubenswrapper[29936]: I1205 12:56:18.455361 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-trusted-ca-bundle\") pod \"9ba34a06-4873-4b03-a94b-055afbf70898\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " Dec 05 12:56:18.455523 master-0 kubenswrapper[29936]: I1205 12:56:18.455420 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-oauth-serving-cert\") pod \"9ba34a06-4873-4b03-a94b-055afbf70898\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " Dec 05 12:56:18.455593 master-0 kubenswrapper[29936]: I1205 12:56:18.455544 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svl7z\" (UniqueName: \"kubernetes.io/projected/9ba34a06-4873-4b03-a94b-055afbf70898-kube-api-access-svl7z\") pod \"9ba34a06-4873-4b03-a94b-055afbf70898\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " Dec 05 12:56:18.455770 master-0 kubenswrapper[29936]: I1205 12:56:18.455651 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ba34a06-4873-4b03-a94b-055afbf70898-console-serving-cert\") pod \"9ba34a06-4873-4b03-a94b-055afbf70898\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " Dec 05 12:56:18.455854 master-0 kubenswrapper[29936]: I1205 12:56:18.455790 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-service-ca\") pod \"9ba34a06-4873-4b03-a94b-055afbf70898\" (UID: \"9ba34a06-4873-4b03-a94b-055afbf70898\") " Dec 05 12:56:18.456293 master-0 kubenswrapper[29936]: I1205 12:56:18.456242 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9ba34a06-4873-4b03-a94b-055afbf70898" (UID: "9ba34a06-4873-4b03-a94b-055afbf70898"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:56:18.456589 master-0 kubenswrapper[29936]: I1205 12:56:18.456511 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9ba34a06-4873-4b03-a94b-055afbf70898" (UID: "9ba34a06-4873-4b03-a94b-055afbf70898"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:56:18.456646 master-0 kubenswrapper[29936]: I1205 12:56:18.456579 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-service-ca" (OuterVolumeSpecName: "service-ca") pod "9ba34a06-4873-4b03-a94b-055afbf70898" (UID: "9ba34a06-4873-4b03-a94b-055afbf70898"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:56:18.457095 master-0 kubenswrapper[29936]: I1205 12:56:18.457031 29936 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-console-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:18.457095 master-0 kubenswrapper[29936]: I1205 12:56:18.457090 29936 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:18.457399 master-0 kubenswrapper[29936]: I1205 12:56:18.457125 29936 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:18.457399 master-0 kubenswrapper[29936]: I1205 12:56:18.457152 29936 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ba34a06-4873-4b03-a94b-055afbf70898-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:18.461752 master-0 kubenswrapper[29936]: I1205 12:56:18.461685 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ba34a06-4873-4b03-a94b-055afbf70898-kube-api-access-svl7z" (OuterVolumeSpecName: "kube-api-access-svl7z") pod "9ba34a06-4873-4b03-a94b-055afbf70898" (UID: "9ba34a06-4873-4b03-a94b-055afbf70898"). InnerVolumeSpecName "kube-api-access-svl7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:56:18.461911 master-0 kubenswrapper[29936]: I1205 12:56:18.461691 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba34a06-4873-4b03-a94b-055afbf70898-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9ba34a06-4873-4b03-a94b-055afbf70898" (UID: "9ba34a06-4873-4b03-a94b-055afbf70898"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:56:18.462635 master-0 kubenswrapper[29936]: I1205 12:56:18.462558 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba34a06-4873-4b03-a94b-055afbf70898-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9ba34a06-4873-4b03-a94b-055afbf70898" (UID: "9ba34a06-4873-4b03-a94b-055afbf70898"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:56:18.560487 master-0 kubenswrapper[29936]: I1205 12:56:18.560298 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svl7z\" (UniqueName: \"kubernetes.io/projected/9ba34a06-4873-4b03-a94b-055afbf70898-kube-api-access-svl7z\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:18.560487 master-0 kubenswrapper[29936]: I1205 12:56:18.560406 29936 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ba34a06-4873-4b03-a94b-055afbf70898-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:18.560487 master-0 kubenswrapper[29936]: I1205 12:56:18.560426 29936 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ba34a06-4873-4b03-a94b-055afbf70898-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:18.713008 master-0 kubenswrapper[29936]: I1205 12:56:18.712926 29936 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 05 12:56:18.713364 master-0 kubenswrapper[29936]: I1205 12:56:18.713319 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="cluster-policy-controller" containerID="cri-o://613fa64c416308b42c0d2958d3f3712126e52a447f52c90eaabf9bb657dccfd4" gracePeriod=30 Dec 05 12:56:18.713439 master-0 kubenswrapper[29936]: I1205 12:56:18.713355 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager" containerID="cri-o://e8917c3711bbe1adfa1dc4fa6befd9275e69d1180a7505f4e499700e3290a159" gracePeriod=30 Dec 05 12:56:18.713498 master-0 kubenswrapper[29936]: I1205 12:56:18.713428 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://46a38d7a2db34bb7213e7c44dd4da9930d6a2962fb71de116eced4ef1aa3810e" gracePeriod=30 Dec 05 12:56:18.713615 master-0 kubenswrapper[29936]: I1205 12:56:18.713462 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://9e00fa2595fad4ad014a23b8074a3240a2449d47373074fecd9654f334a13fb7" gracePeriod=30 Dec 05 12:56:18.714932 master-0 kubenswrapper[29936]: I1205 12:56:18.714885 29936 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 05 12:56:18.715288 master-0 kubenswrapper[29936]: E1205 12:56:18.715254 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager" Dec 05 12:56:18.715288 master-0 kubenswrapper[29936]: I1205 12:56:18.715279 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager" Dec 05 12:56:18.715408 master-0 kubenswrapper[29936]: E1205 12:56:18.715318 29936 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager-cert-syncer" Dec 05 12:56:18.715408 master-0 kubenswrapper[29936]: I1205 12:56:18.715328 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager-cert-syncer" Dec 05 12:56:18.715408 master-0 kubenswrapper[29936]: E1205 12:56:18.715348 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="cluster-policy-controller" Dec 05 12:56:18.715408 master-0 kubenswrapper[29936]: I1205 12:56:18.715356 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="cluster-policy-controller" Dec 05 12:56:18.715408 master-0 kubenswrapper[29936]: E1205 12:56:18.715366 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager" Dec 05 12:56:18.715408 master-0 kubenswrapper[29936]: I1205 12:56:18.715372 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager" Dec 05 12:56:18.715408 master-0 kubenswrapper[29936]: E1205 12:56:18.715380 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager-recovery-controller" Dec 05 12:56:18.715408 master-0 kubenswrapper[29936]: I1205 12:56:18.715386 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager-recovery-controller" Dec 05 12:56:18.715408 master-0 kubenswrapper[29936]: E1205 12:56:18.715400 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ebc188-493c-4edf-8c0f-3854429fbbdc" containerName="console" Dec 05 12:56:18.715408 master-0 kubenswrapper[29936]: I1205 12:56:18.715412 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ebc188-493c-4edf-8c0f-3854429fbbdc" containerName="console" Dec 05 12:56:18.715900 master-0 kubenswrapper[29936]: E1205 12:56:18.715434 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba34a06-4873-4b03-a94b-055afbf70898" containerName="console" Dec 05 12:56:18.715900 master-0 kubenswrapper[29936]: I1205 12:56:18.715442 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba34a06-4873-4b03-a94b-055afbf70898" containerName="console" Dec 05 12:56:18.715900 master-0 kubenswrapper[29936]: I1205 12:56:18.715569 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager-recovery-controller" Dec 05 12:56:18.715900 master-0 kubenswrapper[29936]: I1205 12:56:18.715611 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ebc188-493c-4edf-8c0f-3854429fbbdc" containerName="console" Dec 05 12:56:18.715900 master-0 kubenswrapper[29936]: I1205 12:56:18.715629 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager" Dec 05 12:56:18.715900 master-0 kubenswrapper[29936]: I1205 12:56:18.715639 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager" Dec 05 12:56:18.715900 master-0 kubenswrapper[29936]: I1205 12:56:18.715648 29936 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager" Dec 05 12:56:18.715900 master-0 kubenswrapper[29936]: I1205 12:56:18.715655 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager-cert-syncer" Dec 05 12:56:18.715900 master-0 kubenswrapper[29936]: I1205 12:56:18.715664 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="cluster-policy-controller" Dec 05 12:56:18.715900 master-0 kubenswrapper[29936]: I1205 12:56:18.715675 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba34a06-4873-4b03-a94b-055afbf70898" containerName="console" Dec 05 12:56:18.715900 master-0 kubenswrapper[29936]: E1205 12:56:18.715796 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager" Dec 05 12:56:18.715900 master-0 kubenswrapper[29936]: I1205 12:56:18.715803 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="610dc2015b38bc32879d55a7d39b2587" containerName="kube-controller-manager" Dec 05 12:56:18.865228 master-0 kubenswrapper[29936]: I1205 12:56:18.865136 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f516c058086fe449b55cd324bd8e0223-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"f516c058086fe449b55cd324bd8e0223\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:18.865804 master-0 kubenswrapper[29936]: I1205 12:56:18.865322 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f516c058086fe449b55cd324bd8e0223-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"f516c058086fe449b55cd324bd8e0223\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:18.866757 master-0 kubenswrapper[29936]: I1205 12:56:18.866712 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_610dc2015b38bc32879d55a7d39b2587/kube-controller-manager/1.log" Dec 05 12:56:18.867993 master-0 kubenswrapper[29936]: I1205 12:56:18.867952 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_610dc2015b38bc32879d55a7d39b2587/kube-controller-manager-cert-syncer/0.log" Dec 05 12:56:18.868588 master-0 kubenswrapper[29936]: I1205 12:56:18.868552 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:18.873137 master-0 kubenswrapper[29936]: I1205 12:56:18.873050 29936 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="610dc2015b38bc32879d55a7d39b2587" podUID="f516c058086fe449b55cd324bd8e0223" Dec 05 12:56:18.966477 master-0 kubenswrapper[29936]: I1205 12:56:18.966372 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/610dc2015b38bc32879d55a7d39b2587-cert-dir\") pod \"610dc2015b38bc32879d55a7d39b2587\" (UID: \"610dc2015b38bc32879d55a7d39b2587\") " Dec 05 12:56:18.966477 master-0 kubenswrapper[29936]: I1205 12:56:18.966465 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/610dc2015b38bc32879d55a7d39b2587-resource-dir\") pod \"610dc2015b38bc32879d55a7d39b2587\" (UID: \"610dc2015b38bc32879d55a7d39b2587\") " Dec 05 12:56:18.966980 master-0 kubenswrapper[29936]: I1205 12:56:18.966938 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f516c058086fe449b55cd324bd8e0223-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"f516c058086fe449b55cd324bd8e0223\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:18.967093 master-0 kubenswrapper[29936]: I1205 12:56:18.967057 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f516c058086fe449b55cd324bd8e0223-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"f516c058086fe449b55cd324bd8e0223\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:18.967293 master-0 kubenswrapper[29936]: I1205 12:56:18.967259 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f516c058086fe449b55cd324bd8e0223-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"f516c058086fe449b55cd324bd8e0223\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:18.967422 master-0 kubenswrapper[29936]: I1205 12:56:18.967393 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/610dc2015b38bc32879d55a7d39b2587-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "610dc2015b38bc32879d55a7d39b2587" (UID: "610dc2015b38bc32879d55a7d39b2587"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:56:18.967461 master-0 kubenswrapper[29936]: I1205 12:56:18.967445 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f516c058086fe449b55cd324bd8e0223-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"f516c058086fe449b55cd324bd8e0223\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:18.967514 master-0 kubenswrapper[29936]: I1205 12:56:18.967451 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/610dc2015b38bc32879d55a7d39b2587-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "610dc2015b38bc32879d55a7d39b2587" (UID: "610dc2015b38bc32879d55a7d39b2587"). 
InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:56:19.051924 master-0 kubenswrapper[29936]: I1205 12:56:19.051841 29936 generic.go:334] "Generic (PLEG): container finished" podID="312e2ad9-ddf1-42f4-8460-fbb9b4099dfb" containerID="518b499d058096d1c8b316aa26dc8e71f11cac93ae5fecc6eb7f81c99cbcc957" exitCode=0 Dec 05 12:56:19.052306 master-0 kubenswrapper[29936]: I1205 12:56:19.051950 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb","Type":"ContainerDied","Data":"518b499d058096d1c8b316aa26dc8e71f11cac93ae5fecc6eb7f81c99cbcc957"} Dec 05 12:56:19.055154 master-0 kubenswrapper[29936]: I1205 12:56:19.055056 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_610dc2015b38bc32879d55a7d39b2587/kube-controller-manager/1.log" Dec 05 12:56:19.056488 master-0 kubenswrapper[29936]: I1205 12:56:19.056434 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_610dc2015b38bc32879d55a7d39b2587/kube-controller-manager-cert-syncer/0.log" Dec 05 12:56:19.057078 master-0 kubenswrapper[29936]: I1205 12:56:19.057010 29936 generic.go:334] "Generic (PLEG): container finished" podID="610dc2015b38bc32879d55a7d39b2587" containerID="e8917c3711bbe1adfa1dc4fa6befd9275e69d1180a7505f4e499700e3290a159" exitCode=0 Dec 05 12:56:19.057143 master-0 kubenswrapper[29936]: I1205 12:56:19.057125 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:19.057207 master-0 kubenswrapper[29936]: I1205 12:56:19.057141 29936 generic.go:334] "Generic (PLEG): container finished" podID="610dc2015b38bc32879d55a7d39b2587" containerID="46a38d7a2db34bb7213e7c44dd4da9930d6a2962fb71de116eced4ef1aa3810e" exitCode=0 Dec 05 12:56:19.057309 master-0 kubenswrapper[29936]: I1205 12:56:19.057254 29936 generic.go:334] "Generic (PLEG): container finished" podID="610dc2015b38bc32879d55a7d39b2587" containerID="9e00fa2595fad4ad014a23b8074a3240a2449d47373074fecd9654f334a13fb7" exitCode=2 Dec 05 12:56:19.057478 master-0 kubenswrapper[29936]: I1205 12:56:19.057428 29936 generic.go:334] "Generic (PLEG): container finished" podID="610dc2015b38bc32879d55a7d39b2587" containerID="613fa64c416308b42c0d2958d3f3712126e52a447f52c90eaabf9bb657dccfd4" exitCode=0 Dec 05 12:56:19.057603 master-0 kubenswrapper[29936]: I1205 12:56:19.057116 29936 scope.go:117] "RemoveContainer" containerID="67d67d3e89e13fa99e16d8850ae9285f71eeb433f6a0cc9257e00f3e497935e9" Dec 05 12:56:19.057760 master-0 kubenswrapper[29936]: I1205 12:56:19.057698 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60d3aaefe0051812ad1090dac5fefb06749299f2a086d05d22b6029b515dfaaa" Dec 05 12:56:19.060291 master-0 kubenswrapper[29936]: I1205 12:56:19.060249 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76b95955f5-vk4wx_9ba34a06-4873-4b03-a94b-055afbf70898/console/0.log" Dec 05 12:56:19.060359 master-0 kubenswrapper[29936]: I1205 12:56:19.060332 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76b95955f5-vk4wx" event={"ID":"9ba34a06-4873-4b03-a94b-055afbf70898","Type":"ContainerDied","Data":"4782731ff97a7d89f1b63c757e6d2b309aca86030a5cd604c9336032a54442b7"} Dec 05 
12:56:19.060413 master-0 kubenswrapper[29936]: I1205 12:56:19.060383 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76b95955f5-vk4wx" Dec 05 12:56:19.069048 master-0 kubenswrapper[29936]: I1205 12:56:19.068952 29936 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/610dc2015b38bc32879d55a7d39b2587-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:19.069048 master-0 kubenswrapper[29936]: I1205 12:56:19.068998 29936 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/610dc2015b38bc32879d55a7d39b2587-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:19.085729 master-0 kubenswrapper[29936]: I1205 12:56:19.082560 29936 scope.go:117] "RemoveContainer" containerID="f1b2eb8267907aa0d29e8a51321bba3dc60dff9552814df59e9790e9100b0169" Dec 05 12:56:19.086985 master-0 kubenswrapper[29936]: I1205 12:56:19.086928 29936 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="610dc2015b38bc32879d55a7d39b2587" podUID="f516c058086fe449b55cd324bd8e0223" Dec 05 12:56:19.113337 master-0 kubenswrapper[29936]: I1205 12:56:19.113251 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76b95955f5-vk4wx"] Dec 05 12:56:19.120877 master-0 kubenswrapper[29936]: I1205 12:56:19.120774 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76b95955f5-vk4wx"] Dec 05 12:56:19.195260 master-0 kubenswrapper[29936]: I1205 12:56:19.195173 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="610dc2015b38bc32879d55a7d39b2587" path="/var/lib/kubelet/pods/610dc2015b38bc32879d55a7d39b2587/volumes" Dec 05 12:56:19.195956 master-0 kubenswrapper[29936]: I1205 12:56:19.195927 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ba34a06-4873-4b03-a94b-055afbf70898" path="/var/lib/kubelet/pods/9ba34a06-4873-4b03-a94b-055afbf70898/volumes" Dec 05 12:56:20.074280 master-0 kubenswrapper[29936]: I1205 12:56:20.074140 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_610dc2015b38bc32879d55a7d39b2587/kube-controller-manager-cert-syncer/0.log" Dec 05 12:56:20.473604 master-0 kubenswrapper[29936]: I1205 12:56:20.473525 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Dec 05 12:56:20.603230 master-0 kubenswrapper[29936]: I1205 12:56:20.603070 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-kubelet-dir\") pod \"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb\" (UID: \"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb\") " Dec 05 12:56:20.603230 master-0 kubenswrapper[29936]: I1205 12:56:20.603158 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-var-lock\") pod \"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb\" (UID: \"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb\") " Dec 05 12:56:20.603701 master-0 kubenswrapper[29936]: I1205 12:56:20.603252 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "312e2ad9-ddf1-42f4-8460-fbb9b4099dfb" (UID: "312e2ad9-ddf1-42f4-8460-fbb9b4099dfb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:56:20.603701 master-0 kubenswrapper[29936]: I1205 12:56:20.603285 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-kube-api-access\") pod \"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb\" (UID: \"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb\") " Dec 05 12:56:20.603701 master-0 kubenswrapper[29936]: I1205 12:56:20.603430 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-var-lock" (OuterVolumeSpecName: "var-lock") pod "312e2ad9-ddf1-42f4-8460-fbb9b4099dfb" (UID: "312e2ad9-ddf1-42f4-8460-fbb9b4099dfb"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 12:56:20.604263 master-0 kubenswrapper[29936]: I1205 12:56:20.604156 29936 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:20.604263 master-0 kubenswrapper[29936]: I1205 12:56:20.604212 29936 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:20.606441 master-0 kubenswrapper[29936]: I1205 12:56:20.606375 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "312e2ad9-ddf1-42f4-8460-fbb9b4099dfb" (UID: "312e2ad9-ddf1-42f4-8460-fbb9b4099dfb"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:56:20.705835 master-0 kubenswrapper[29936]: I1205 12:56:20.705748 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/312e2ad9-ddf1-42f4-8460-fbb9b4099dfb-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:21.088504 master-0 kubenswrapper[29936]: I1205 12:56:21.088210 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"312e2ad9-ddf1-42f4-8460-fbb9b4099dfb","Type":"ContainerDied","Data":"8b9f0205682424862e6b6dc02fa7919cdbaffe2375b186b7aea8ee2f9ac432ba"} Dec 05 12:56:21.088504 master-0 kubenswrapper[29936]: I1205 12:56:21.088274 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b9f0205682424862e6b6dc02fa7919cdbaffe2375b186b7aea8ee2f9ac432ba" Dec 05 12:56:21.088504 master-0 kubenswrapper[29936]: I1205 12:56:21.088374 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Dec 05 12:56:27.050505 master-0 kubenswrapper[29936]: I1205 12:56:27.050439 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:56:27.153836 master-0 kubenswrapper[29936]: I1205 12:56:27.153624 29936 generic.go:334] "Generic (PLEG): container finished" podID="a5338041-f213-46ef-9d81-248567ba958d" containerID="8f4c10e53fa9bdea151c26cb8da907a4175dbcda2ac105b3ac1ba5c0a0254853" exitCode=0 Dec 05 12:56:27.153836 master-0 kubenswrapper[29936]: I1205 12:56:27.153717 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" event={"ID":"a5338041-f213-46ef-9d81-248567ba958d","Type":"ContainerDied","Data":"8f4c10e53fa9bdea151c26cb8da907a4175dbcda2ac105b3ac1ba5c0a0254853"} Dec 05 12:56:27.155326 master-0 kubenswrapper[29936]: I1205 12:56:27.153791 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" Dec 05 12:56:27.155326 master-0 kubenswrapper[29936]: I1205 12:56:27.153955 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-54c5748c8c-kqs7s" event={"ID":"a5338041-f213-46ef-9d81-248567ba958d","Type":"ContainerDied","Data":"1234ab8fb98aae2372aaa8236a21f36a20e417c28feeae32f634a7022c473171"} Dec 05 12:56:27.155326 master-0 kubenswrapper[29936]: I1205 12:56:27.154048 29936 scope.go:117] "RemoveContainer" containerID="8f4c10e53fa9bdea151c26cb8da907a4175dbcda2ac105b3ac1ba5c0a0254853" Dec 05 12:56:27.190395 master-0 kubenswrapper[29936]: I1205 12:56:27.185435 29936 scope.go:117] "RemoveContainer" containerID="8f4c10e53fa9bdea151c26cb8da907a4175dbcda2ac105b3ac1ba5c0a0254853" Dec 05 12:56:27.190395 master-0 kubenswrapper[29936]: E1205 12:56:27.186752 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f4c10e53fa9bdea151c26cb8da907a4175dbcda2ac105b3ac1ba5c0a0254853\": container with ID starting with 8f4c10e53fa9bdea151c26cb8da907a4175dbcda2ac105b3ac1ba5c0a0254853 not found: ID does not exist" containerID="8f4c10e53fa9bdea151c26cb8da907a4175dbcda2ac105b3ac1ba5c0a0254853" Dec 05 12:56:27.190395 master-0 kubenswrapper[29936]: I1205 12:56:27.186816 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4c10e53fa9bdea151c26cb8da907a4175dbcda2ac105b3ac1ba5c0a0254853"} err="failed to get container status \"8f4c10e53fa9bdea151c26cb8da907a4175dbcda2ac105b3ac1ba5c0a0254853\": rpc error: code = NotFound desc = could not find container \"8f4c10e53fa9bdea151c26cb8da907a4175dbcda2ac105b3ac1ba5c0a0254853\": container with ID starting with 8f4c10e53fa9bdea151c26cb8da907a4175dbcda2ac105b3ac1ba5c0a0254853 not found: ID does not exist" Dec 05 12:56:27.232744 master-0 kubenswrapper[29936]: I1205 12:56:27.232666 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle\") pod \"a5338041-f213-46ef-9d81-248567ba958d\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " Dec 05 12:56:27.232744 master-0 kubenswrapper[29936]: I1205 12:56:27.232744 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a5338041-f213-46ef-9d81-248567ba958d-audit-log\") pod \"a5338041-f213-46ef-9d81-248567ba958d\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " Dec 05 12:56:27.233084 master-0 kubenswrapper[29936]: I1205 12:56:27.232798 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-server-tls\") pod \"a5338041-f213-46ef-9d81-248567ba958d\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " Dec 05 12:56:27.233084 master-0 kubenswrapper[29936]: I1205 12:56:27.232897 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-configmap-kubelet-serving-ca-bundle\") pod \"a5338041-f213-46ef-9d81-248567ba958d\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " Dec 05 12:56:27.233084 master-0 kubenswrapper[29936]: I1205 12:56:27.232925 29936 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-metrics-server-audit-profiles\") pod \"a5338041-f213-46ef-9d81-248567ba958d\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " Dec 05 12:56:27.233084 master-0 kubenswrapper[29936]: I1205 12:56:27.232966 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-client-certs\") pod \"a5338041-f213-46ef-9d81-248567ba958d\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " Dec 05 12:56:27.233084 master-0 kubenswrapper[29936]: I1205 12:56:27.233016 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnwdh\" (UniqueName: \"kubernetes.io/projected/a5338041-f213-46ef-9d81-248567ba958d-kube-api-access-bnwdh\") pod \"a5338041-f213-46ef-9d81-248567ba958d\" (UID: \"a5338041-f213-46ef-9d81-248567ba958d\") " Dec 05 12:56:27.234199 master-0 kubenswrapper[29936]: I1205 12:56:27.233566 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5338041-f213-46ef-9d81-248567ba958d-audit-log" (OuterVolumeSpecName: "audit-log") pod "a5338041-f213-46ef-9d81-248567ba958d" (UID: "a5338041-f213-46ef-9d81-248567ba958d"). InnerVolumeSpecName "audit-log". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:56:27.234199 master-0 kubenswrapper[29936]: I1205 12:56:27.234132 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-metrics-server-audit-profiles" (OuterVolumeSpecName: "metrics-server-audit-profiles") pod "a5338041-f213-46ef-9d81-248567ba958d" (UID: "a5338041-f213-46ef-9d81-248567ba958d"). InnerVolumeSpecName "metrics-server-audit-profiles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:56:27.235205 master-0 kubenswrapper[29936]: I1205 12:56:27.234896 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "a5338041-f213-46ef-9d81-248567ba958d" (UID: "a5338041-f213-46ef-9d81-248567ba958d"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 12:56:27.236703 master-0 kubenswrapper[29936]: I1205 12:56:27.236636 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "a5338041-f213-46ef-9d81-248567ba958d" (UID: "a5338041-f213-46ef-9d81-248567ba958d"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:56:27.237508 master-0 kubenswrapper[29936]: I1205 12:56:27.237481 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5338041-f213-46ef-9d81-248567ba958d-kube-api-access-bnwdh" (OuterVolumeSpecName: "kube-api-access-bnwdh") pod "a5338041-f213-46ef-9d81-248567ba958d" (UID: "a5338041-f213-46ef-9d81-248567ba958d"). InnerVolumeSpecName "kube-api-access-bnwdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:56:27.237933 master-0 kubenswrapper[29936]: I1205 12:56:27.237874 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle" (OuterVolumeSpecName: "client-ca-bundle") pod "a5338041-f213-46ef-9d81-248567ba958d" (UID: "a5338041-f213-46ef-9d81-248567ba958d"). InnerVolumeSpecName "client-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:56:27.238099 master-0 kubenswrapper[29936]: I1205 12:56:27.238011 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-server-tls" (OuterVolumeSpecName: "secret-metrics-server-tls") pod "a5338041-f213-46ef-9d81-248567ba958d" (UID: "a5338041-f213-46ef-9d81-248567ba958d"). InnerVolumeSpecName "secret-metrics-server-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 12:56:27.336800 master-0 kubenswrapper[29936]: I1205 12:56:27.336689 29936 reconciler_common.go:293] "Volume detached for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-client-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:27.336800 master-0 kubenswrapper[29936]: I1205 12:56:27.336736 29936 reconciler_common.go:293] "Volume detached for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a5338041-f213-46ef-9d81-248567ba958d-audit-log\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:27.336800 master-0 kubenswrapper[29936]: I1205 12:56:27.336767 29936 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-server-tls\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:27.337441 master-0 kubenswrapper[29936]: I1205 12:56:27.336840 29936 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:27.337441 master-0 kubenswrapper[29936]: I1205 12:56:27.336919 29936 reconciler_common.go:293] "Volume detached for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a5338041-f213-46ef-9d81-248567ba958d-metrics-server-audit-profiles\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:27.337441 master-0 kubenswrapper[29936]: I1205 12:56:27.336963 29936 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a5338041-f213-46ef-9d81-248567ba958d-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:27.337441 master-0 kubenswrapper[29936]: I1205 12:56:27.337002 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnwdh\" (UniqueName: \"kubernetes.io/projected/a5338041-f213-46ef-9d81-248567ba958d-kube-api-access-bnwdh\") on node \"master-0\" DevicePath \"\"" Dec 05 12:56:27.505419 master-0 kubenswrapper[29936]: I1205 12:56:27.505348 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-54c5748c8c-kqs7s"] Dec 05 12:56:27.516388 master-0 kubenswrapper[29936]: I1205 12:56:27.516212 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/metrics-server-54c5748c8c-kqs7s"] Dec 05 12:56:29.197405 master-0 
kubenswrapper[29936]: I1205 12:56:29.197320 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5338041-f213-46ef-9d81-248567ba958d" path="/var/lib/kubelet/pods/a5338041-f213-46ef-9d81-248567ba958d/volumes" Dec 05 12:56:30.185613 master-0 kubenswrapper[29936]: I1205 12:56:30.185507 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:30.212674 master-0 kubenswrapper[29936]: I1205 12:56:30.212599 29936 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7e00cac1-9aab-4982-a157-c8f798898324" Dec 05 12:56:30.212674 master-0 kubenswrapper[29936]: I1205 12:56:30.212669 29936 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7e00cac1-9aab-4982-a157-c8f798898324" Dec 05 12:56:30.263406 master-0 kubenswrapper[29936]: I1205 12:56:30.263342 29936 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:30.266127 master-0 kubenswrapper[29936]: I1205 12:56:30.266081 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 05 12:56:30.276662 master-0 kubenswrapper[29936]: I1205 12:56:30.276543 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 05 12:56:30.290732 master-0 kubenswrapper[29936]: I1205 12:56:30.290626 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:30.300575 master-0 kubenswrapper[29936]: I1205 12:56:30.300491 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 05 12:56:30.326416 master-0 kubenswrapper[29936]: W1205 12:56:30.326347 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf516c058086fe449b55cd324bd8e0223.slice/crio-4b72960e3a645ba59babca121b47e2ad571aff5bd0541e181d359433f66d9160 WatchSource:0}: Error finding container 4b72960e3a645ba59babca121b47e2ad571aff5bd0541e181d359433f66d9160: Status 404 returned error can't find the container with id 4b72960e3a645ba59babca121b47e2ad571aff5bd0541e181d359433f66d9160 Dec 05 12:56:31.201395 master-0 kubenswrapper[29936]: I1205 12:56:31.201332 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"f516c058086fe449b55cd324bd8e0223","Type":"ContainerStarted","Data":"26b8df8c1e17fc0ba2df5cc8ecb3561c9034aa5e128fe7374c7a3af7bda3a7c8"} Dec 05 12:56:31.201395 master-0 kubenswrapper[29936]: I1205 12:56:31.201392 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"f516c058086fe449b55cd324bd8e0223","Type":"ContainerStarted","Data":"a640d6ffd123c6ae62ff72b5bcee2c305714b10125272ac69933719cd846b34a"} Dec 05 12:56:31.201395 master-0 kubenswrapper[29936]: I1205 12:56:31.201404 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"f516c058086fe449b55cd324bd8e0223","Type":"ContainerStarted","Data":"1eb3bb2c03aeeaa24ad3f262fd7d7d83b942951736136a2909649e2a3fb4fa9b"} Dec 05 12:56:31.201716 master-0 kubenswrapper[29936]: I1205 12:56:31.201414 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"f516c058086fe449b55cd324bd8e0223","Type":"ContainerStarted","Data":"4b72960e3a645ba59babca121b47e2ad571aff5bd0541e181d359433f66d9160"} Dec 05 12:56:32.217901 master-0 kubenswrapper[29936]: I1205 12:56:32.217771 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"f516c058086fe449b55cd324bd8e0223","Type":"ContainerStarted","Data":"399ecf6a604f70b8262090f6b52676d9c025bf5a2403f6510e090f8d83da591a"} Dec 05 12:56:32.259747 master-0 kubenswrapper[29936]: I1205 12:56:32.259640 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.259616234 podStartE2EDuration="2.259616234s" podCreationTimestamp="2025-12-05 12:56:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:56:32.253903806 +0000 UTC m=+389.385983567" watchObservedRunningTime="2025-12-05 12:56:32.259616234 +0000 UTC m=+389.391695915" Dec 05 12:56:40.291145 master-0 kubenswrapper[29936]: I1205 12:56:40.291043 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:40.291145 master-0 kubenswrapper[29936]: I1205 12:56:40.291134 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:40.291145 master-0 kubenswrapper[29936]: I1205 12:56:40.291164 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:40.292779 master-0 kubenswrapper[29936]: I1205 12:56:40.291224 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:40.295408 master-0 kubenswrapper[29936]: I1205 12:56:40.295371 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:40.299298 master-0 kubenswrapper[29936]: I1205 12:56:40.299232 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:41.303339 master-0 kubenswrapper[29936]: I1205 12:56:41.303277 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:41.308858 master-0 kubenswrapper[29936]: I1205 12:56:41.308812 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 12:56:48.093828 master-0 kubenswrapper[29936]: I1205 12:56:48.093739 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-58f4c9b998-7rmf8"] Dec 05 12:56:48.094782 master-0 kubenswrapper[29936]: E1205 12:56:48.094133 29936 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a5338041-f213-46ef-9d81-248567ba958d" containerName="metrics-server" Dec 05 12:56:48.094782 master-0 kubenswrapper[29936]: I1205 12:56:48.094148 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5338041-f213-46ef-9d81-248567ba958d" containerName="metrics-server" Dec 05 12:56:48.094782 master-0 kubenswrapper[29936]: E1205 12:56:48.094170 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312e2ad9-ddf1-42f4-8460-fbb9b4099dfb" containerName="installer" Dec 05 12:56:48.094782 master-0 kubenswrapper[29936]: I1205 12:56:48.094196 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="312e2ad9-ddf1-42f4-8460-fbb9b4099dfb" containerName="installer" Dec 05 12:56:48.094782 master-0 kubenswrapper[29936]: I1205 12:56:48.094382 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5338041-f213-46ef-9d81-248567ba958d" containerName="metrics-server" Dec 05 12:56:48.094782 master-0 kubenswrapper[29936]: I1205 12:56:48.094421 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="312e2ad9-ddf1-42f4-8460-fbb9b4099dfb" containerName="installer" Dec 05 12:56:48.094972 master-0 kubenswrapper[29936]: I1205 12:56:48.094953 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 12:56:48.099455 master-0 kubenswrapper[29936]: I1205 12:56:48.099380 29936 reflector.go:368] Caches populated for *v1.Secret from object-"sushy-emulator"/"os-client-config" Dec 05 12:56:48.099690 master-0 kubenswrapper[29936]: I1205 12:56:48.099665 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"openshift-service-ca.crt" Dec 05 12:56:48.099893 master-0 kubenswrapper[29936]: I1205 12:56:48.099871 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"sushy-emulator-config" Dec 05 12:56:48.103053 master-0 kubenswrapper[29936]: I1205 12:56:48.102991 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"kube-root-ca.crt" Dec 05 12:56:48.133124 master-0 kubenswrapper[29936]: I1205 12:56:48.133041 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-os-client-config\") pod \"sushy-emulator-58f4c9b998-7rmf8\" (UID: \"7d53a6f1-102a-40b2-85c0-0c4f34568cfc\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 12:56:48.133124 master-0 kubenswrapper[29936]: I1205 12:56:48.133121 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-sushy-emulator-config\") pod \"sushy-emulator-58f4c9b998-7rmf8\" (UID: \"7d53a6f1-102a-40b2-85c0-0c4f34568cfc\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 12:56:48.133124 master-0 kubenswrapper[29936]: I1205 12:56:48.133151 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4w2x\" (UniqueName: \"kubernetes.io/projected/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-kube-api-access-j4w2x\") pod \"sushy-emulator-58f4c9b998-7rmf8\" (UID: \"7d53a6f1-102a-40b2-85c0-0c4f34568cfc\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 12:56:48.187927 master-0 kubenswrapper[29936]: I1205 12:56:48.187824 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["sushy-emulator/sushy-emulator-58f4c9b998-7rmf8"] Dec 05 12:56:48.235153 master-0 kubenswrapper[29936]: I1205 12:56:48.235068 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-os-client-config\") pod \"sushy-emulator-58f4c9b998-7rmf8\" (UID: \"7d53a6f1-102a-40b2-85c0-0c4f34568cfc\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 12:56:48.235522 master-0 kubenswrapper[29936]: I1205 12:56:48.235395 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-sushy-emulator-config\") pod \"sushy-emulator-58f4c9b998-7rmf8\" (UID: \"7d53a6f1-102a-40b2-85c0-0c4f34568cfc\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 12:56:48.235522 master-0 kubenswrapper[29936]: I1205 12:56:48.235508 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4w2x\" (UniqueName: \"kubernetes.io/projected/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-kube-api-access-j4w2x\") pod \"sushy-emulator-58f4c9b998-7rmf8\" (UID: \"7d53a6f1-102a-40b2-85c0-0c4f34568cfc\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 12:56:48.236997 master-0 kubenswrapper[29936]: I1205 12:56:48.236962 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-sushy-emulator-config\") pod \"sushy-emulator-58f4c9b998-7rmf8\" (UID: \"7d53a6f1-102a-40b2-85c0-0c4f34568cfc\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 12:56:48.245118 master-0 kubenswrapper[29936]: I1205 12:56:48.245070 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-os-client-config\") pod \"sushy-emulator-58f4c9b998-7rmf8\" (UID: \"7d53a6f1-102a-40b2-85c0-0c4f34568cfc\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 12:56:48.254616 master-0 kubenswrapper[29936]: I1205 12:56:48.254568 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4w2x\" (UniqueName: \"kubernetes.io/projected/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-kube-api-access-j4w2x\") pod \"sushy-emulator-58f4c9b998-7rmf8\" (UID: \"7d53a6f1-102a-40b2-85c0-0c4f34568cfc\") " pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 12:56:48.428441 master-0 kubenswrapper[29936]: I1205 12:56:48.428245 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 12:56:48.871833 master-0 kubenswrapper[29936]: I1205 12:56:48.871484 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-58f4c9b998-7rmf8"] Dec 05 12:56:48.883155 master-0 kubenswrapper[29936]: W1205 12:56:48.883033 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d53a6f1_102a_40b2_85c0_0c4f34568cfc.slice/crio-51fd8a98951eec83e59fcd32531abc78a7a7dd434e9f6496929fe1c7d1bd1b1b WatchSource:0}: Error finding container 51fd8a98951eec83e59fcd32531abc78a7a7dd434e9f6496929fe1c7d1bd1b1b: Status 404 returned error can't find the container with id 51fd8a98951eec83e59fcd32531abc78a7a7dd434e9f6496929fe1c7d1bd1b1b Dec 05 12:56:48.888301 master-0 kubenswrapper[29936]: I1205 12:56:48.887945 29936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 12:56:49.378532 master-0 kubenswrapper[29936]: I1205 12:56:49.378429 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" event={"ID":"7d53a6f1-102a-40b2-85c0-0c4f34568cfc","Type":"ContainerStarted","Data":"51fd8a98951eec83e59fcd32531abc78a7a7dd434e9f6496929fe1c7d1bd1b1b"} Dec 05 12:56:57.447560 master-0 kubenswrapper[29936]: I1205 12:56:57.447436 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" event={"ID":"7d53a6f1-102a-40b2-85c0-0c4f34568cfc","Type":"ContainerStarted","Data":"3e07889350a9d9eb1276736f41d2d8ad3044e81473d397c8dc23bb90e0f14c5c"} Dec 05 12:56:57.477689 master-0 kubenswrapper[29936]: I1205 12:56:57.477576 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" podStartSLOduration=1.915527948 podStartE2EDuration="9.477552582s" podCreationTimestamp="2025-12-05 12:56:48 +0000 UTC" firstStartedPulling="2025-12-05 12:56:48.887729868 +0000 UTC m=+406.019809549" lastFinishedPulling="2025-12-05 12:56:56.449754462 +0000 UTC m=+413.581834183" observedRunningTime="2025-12-05 12:56:57.472589664 +0000 UTC m=+414.604669365" watchObservedRunningTime="2025-12-05 12:56:57.477552582 +0000 UTC m=+414.609632263" Dec 05 12:56:58.429356 master-0 kubenswrapper[29936]: I1205 12:56:58.429275 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 12:56:58.429626 master-0 kubenswrapper[29936]: I1205 12:56:58.429484 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 12:56:58.438122 master-0 kubenswrapper[29936]: I1205 12:56:58.438054 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 12:56:58.457461 master-0 kubenswrapper[29936]: I1205 12:56:58.457371 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 12:57:30.206311 master-0 kubenswrapper[29936]: I1205 12:57:30.205879 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8"] Dec 05 12:57:30.207699 master-0 kubenswrapper[29936]: I1205 12:57:30.207660 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" Dec 05 12:57:30.211376 master-0 kubenswrapper[29936]: I1205 12:57:30.210206 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-8ljxd" Dec 05 12:57:30.220070 master-0 kubenswrapper[29936]: I1205 12:57:30.220011 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8"] Dec 05 12:57:30.273340 master-0 kubenswrapper[29936]: I1205 12:57:30.273245 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qnhh\" (UniqueName: \"kubernetes.io/projected/0d790e93-0499-4b77-a695-e63153c32084-kube-api-access-5qnhh\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8\" (UID: \"0d790e93-0499-4b77-a695-e63153c32084\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" Dec 05 12:57:30.273752 master-0 kubenswrapper[29936]: I1205 12:57:30.273714 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0d790e93-0499-4b77-a695-e63153c32084-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8\" (UID: \"0d790e93-0499-4b77-a695-e63153c32084\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" Dec 05 12:57:30.273842 master-0 kubenswrapper[29936]: I1205 12:57:30.273821 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0d790e93-0499-4b77-a695-e63153c32084-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8\" (UID: \"0d790e93-0499-4b77-a695-e63153c32084\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" Dec 05 12:57:30.375597 master-0 kubenswrapper[29936]: I1205 12:57:30.375496 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0d790e93-0499-4b77-a695-e63153c32084-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8\" (UID: \"0d790e93-0499-4b77-a695-e63153c32084\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" Dec 05 12:57:30.375597 master-0 kubenswrapper[29936]: I1205 12:57:30.375572 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0d790e93-0499-4b77-a695-e63153c32084-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8\" (UID: \"0d790e93-0499-4b77-a695-e63153c32084\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" Dec 05 12:57:30.375966 master-0 kubenswrapper[29936]: I1205 12:57:30.375684 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qnhh\" (UniqueName: \"kubernetes.io/projected/0d790e93-0499-4b77-a695-e63153c32084-kube-api-access-5qnhh\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8\" (UID: \"0d790e93-0499-4b77-a695-e63153c32084\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" Dec 05 12:57:30.376090 master-0 kubenswrapper[29936]: I1205 12:57:30.376051 29936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0d790e93-0499-4b77-a695-e63153c32084-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8\" (UID: \"0d790e93-0499-4b77-a695-e63153c32084\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" Dec 05 12:57:30.376241 master-0 kubenswrapper[29936]: I1205 12:57:30.376155 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0d790e93-0499-4b77-a695-e63153c32084-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8\" (UID: \"0d790e93-0499-4b77-a695-e63153c32084\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" Dec 05 12:57:30.397566 master-0 kubenswrapper[29936]: I1205 12:57:30.397482 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qnhh\" (UniqueName: \"kubernetes.io/projected/0d790e93-0499-4b77-a695-e63153c32084-kube-api-access-5qnhh\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8\" (UID: \"0d790e93-0499-4b77-a695-e63153c32084\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" Dec 05 12:57:30.531404 master-0 kubenswrapper[29936]: I1205 12:57:30.531161 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" Dec 05 12:57:31.023120 master-0 kubenswrapper[29936]: I1205 12:57:31.023035 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8"] Dec 05 12:57:31.732599 master-0 kubenswrapper[29936]: I1205 12:57:31.732514 29936 generic.go:334] "Generic (PLEG): container finished" podID="0d790e93-0499-4b77-a695-e63153c32084" containerID="27b32398a5c915ecdf5e8a06d36da9a974d258b1742f8ab1ea531b784b72ce8e" exitCode=0 Dec 05 12:57:31.733352 master-0 kubenswrapper[29936]: I1205 12:57:31.732576 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" event={"ID":"0d790e93-0499-4b77-a695-e63153c32084","Type":"ContainerDied","Data":"27b32398a5c915ecdf5e8a06d36da9a974d258b1742f8ab1ea531b784b72ce8e"} Dec 05 12:57:31.733352 master-0 kubenswrapper[29936]: I1205 12:57:31.732668 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" event={"ID":"0d790e93-0499-4b77-a695-e63153c32084","Type":"ContainerStarted","Data":"438046b356fd1d547cedf712693dd5a7ee8e220bdbe05ee9996da03fd88f96af"} Dec 05 12:57:33.753473 master-0 kubenswrapper[29936]: I1205 12:57:33.753384 29936 generic.go:334] "Generic (PLEG): container finished" podID="0d790e93-0499-4b77-a695-e63153c32084" containerID="fbe715c3766cd39b1dcd508e3f6ca9249ad21305cceeac344964fa940fd564ac" exitCode=0 Dec 05 12:57:33.753473 master-0 kubenswrapper[29936]: I1205 12:57:33.753428 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" event={"ID":"0d790e93-0499-4b77-a695-e63153c32084","Type":"ContainerDied","Data":"fbe715c3766cd39b1dcd508e3f6ca9249ad21305cceeac344964fa940fd564ac"} Dec 05 12:57:34.765359 master-0 kubenswrapper[29936]: I1205 12:57:34.765271 29936 
generic.go:334] "Generic (PLEG): container finished" podID="0d790e93-0499-4b77-a695-e63153c32084" containerID="0ccb7d896fb8dbe1e129b54b59d02c4c85989cf642e43639c0d7d8ae3f2cb3e9" exitCode=0 Dec 05 12:57:34.766245 master-0 kubenswrapper[29936]: I1205 12:57:34.765353 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" event={"ID":"0d790e93-0499-4b77-a695-e63153c32084","Type":"ContainerDied","Data":"0ccb7d896fb8dbe1e129b54b59d02c4c85989cf642e43639c0d7d8ae3f2cb3e9"} Dec 05 12:57:36.141440 master-0 kubenswrapper[29936]: I1205 12:57:36.141358 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" Dec 05 12:57:36.183101 master-0 kubenswrapper[29936]: I1205 12:57:36.182544 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qnhh\" (UniqueName: \"kubernetes.io/projected/0d790e93-0499-4b77-a695-e63153c32084-kube-api-access-5qnhh\") pod \"0d790e93-0499-4b77-a695-e63153c32084\" (UID: \"0d790e93-0499-4b77-a695-e63153c32084\") " Dec 05 12:57:36.183453 master-0 kubenswrapper[29936]: I1205 12:57:36.183324 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0d790e93-0499-4b77-a695-e63153c32084-bundle\") pod \"0d790e93-0499-4b77-a695-e63153c32084\" (UID: \"0d790e93-0499-4b77-a695-e63153c32084\") " Dec 05 12:57:36.183453 master-0 kubenswrapper[29936]: I1205 12:57:36.183363 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0d790e93-0499-4b77-a695-e63153c32084-util\") pod \"0d790e93-0499-4b77-a695-e63153c32084\" (UID: \"0d790e93-0499-4b77-a695-e63153c32084\") " Dec 05 12:57:36.184895 master-0 kubenswrapper[29936]: I1205 12:57:36.184442 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d790e93-0499-4b77-a695-e63153c32084-bundle" (OuterVolumeSpecName: "bundle") pod "0d790e93-0499-4b77-a695-e63153c32084" (UID: "0d790e93-0499-4b77-a695-e63153c32084"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:57:36.185502 master-0 kubenswrapper[29936]: I1205 12:57:36.185439 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d790e93-0499-4b77-a695-e63153c32084-kube-api-access-5qnhh" (OuterVolumeSpecName: "kube-api-access-5qnhh") pod "0d790e93-0499-4b77-a695-e63153c32084" (UID: "0d790e93-0499-4b77-a695-e63153c32084"). InnerVolumeSpecName "kube-api-access-5qnhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:57:36.210133 master-0 kubenswrapper[29936]: I1205 12:57:36.207235 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d790e93-0499-4b77-a695-e63153c32084-util" (OuterVolumeSpecName: "util") pod "0d790e93-0499-4b77-a695-e63153c32084" (UID: "0d790e93-0499-4b77-a695-e63153c32084"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:57:36.287540 master-0 kubenswrapper[29936]: I1205 12:57:36.287435 29936 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0d790e93-0499-4b77-a695-e63153c32084-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 12:57:36.287540 master-0 kubenswrapper[29936]: I1205 12:57:36.287557 29936 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0d790e93-0499-4b77-a695-e63153c32084-util\") on node \"master-0\" DevicePath \"\"" Dec 05 12:57:36.287990 master-0 kubenswrapper[29936]: I1205 12:57:36.287594 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qnhh\" (UniqueName: \"kubernetes.io/projected/0d790e93-0499-4b77-a695-e63153c32084-kube-api-access-5qnhh\") on node \"master-0\" DevicePath \"\"" Dec 05 12:57:36.788233 master-0 kubenswrapper[29936]: I1205 12:57:36.788132 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" event={"ID":"0d790e93-0499-4b77-a695-e63153c32084","Type":"ContainerDied","Data":"438046b356fd1d547cedf712693dd5a7ee8e220bdbe05ee9996da03fd88f96af"} Dec 05 12:57:36.788233 master-0 kubenswrapper[29936]: I1205 12:57:36.788225 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="438046b356fd1d547cedf712693dd5a7ee8e220bdbe05ee9996da03fd88f96af" Dec 05 12:57:36.788713 master-0 kubenswrapper[29936]: I1205 12:57:36.788329 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4rwrs8" Dec 05 12:57:43.321600 master-0 kubenswrapper[29936]: I1205 12:57:43.321530 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-58846d7f54-zmdrn"] Dec 05 12:57:43.322380 master-0 kubenswrapper[29936]: E1205 12:57:43.321951 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d790e93-0499-4b77-a695-e63153c32084" containerName="pull" Dec 05 12:57:43.322380 master-0 kubenswrapper[29936]: I1205 12:57:43.321969 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d790e93-0499-4b77-a695-e63153c32084" containerName="pull" Dec 05 12:57:43.322380 master-0 kubenswrapper[29936]: E1205 12:57:43.321984 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d790e93-0499-4b77-a695-e63153c32084" containerName="util" Dec 05 12:57:43.322380 master-0 kubenswrapper[29936]: I1205 12:57:43.321992 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d790e93-0499-4b77-a695-e63153c32084" containerName="util" Dec 05 12:57:43.322380 master-0 kubenswrapper[29936]: E1205 12:57:43.322019 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d790e93-0499-4b77-a695-e63153c32084" containerName="extract" Dec 05 12:57:43.322380 master-0 kubenswrapper[29936]: I1205 12:57:43.322032 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d790e93-0499-4b77-a695-e63153c32084" containerName="extract" Dec 05 12:57:43.322380 master-0 kubenswrapper[29936]: I1205 12:57:43.322239 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d790e93-0499-4b77-a695-e63153c32084" containerName="extract" Dec 05 12:57:43.323005 master-0 kubenswrapper[29936]: I1205 12:57:43.322979 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:43.325362 master-0 kubenswrapper[29936]: I1205 12:57:43.325311 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert" Dec 05 12:57:43.325594 master-0 kubenswrapper[29936]: I1205 12:57:43.325426 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt" Dec 05 12:57:43.326557 master-0 kubenswrapper[29936]: I1205 12:57:43.326521 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert" Dec 05 12:57:43.327164 master-0 kubenswrapper[29936]: I1205 12:57:43.327133 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert" Dec 05 12:57:43.335888 master-0 kubenswrapper[29936]: I1205 12:57:43.335846 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt" Dec 05 12:57:43.346612 master-0 kubenswrapper[29936]: I1205 12:57:43.346524 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-58846d7f54-zmdrn"] Dec 05 12:57:43.458015 master-0 kubenswrapper[29936]: I1205 12:57:43.457917 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b912234-92a5-44d4-ac53-fd7a22e994db-metrics-cert\") pod \"lvms-operator-58846d7f54-zmdrn\" (UID: \"3b912234-92a5-44d4-ac53-fd7a22e994db\") " pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:43.458015 master-0 kubenswrapper[29936]: I1205 12:57:43.458008 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3b912234-92a5-44d4-ac53-fd7a22e994db-apiservice-cert\") pod \"lvms-operator-58846d7f54-zmdrn\" (UID: \"3b912234-92a5-44d4-ac53-fd7a22e994db\") " pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:43.458429 master-0 kubenswrapper[29936]: I1205 12:57:43.458053 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g7t2\" (UniqueName: \"kubernetes.io/projected/3b912234-92a5-44d4-ac53-fd7a22e994db-kube-api-access-2g7t2\") pod \"lvms-operator-58846d7f54-zmdrn\" (UID: \"3b912234-92a5-44d4-ac53-fd7a22e994db\") " pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:43.458429 master-0 kubenswrapper[29936]: I1205 12:57:43.458093 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/3b912234-92a5-44d4-ac53-fd7a22e994db-socket-dir\") pod \"lvms-operator-58846d7f54-zmdrn\" (UID: \"3b912234-92a5-44d4-ac53-fd7a22e994db\") " pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:43.458429 master-0 kubenswrapper[29936]: I1205 12:57:43.458188 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3b912234-92a5-44d4-ac53-fd7a22e994db-webhook-cert\") pod \"lvms-operator-58846d7f54-zmdrn\" (UID: \"3b912234-92a5-44d4-ac53-fd7a22e994db\") " pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:43.559802 master-0 kubenswrapper[29936]: I1205 12:57:43.559716 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3b912234-92a5-44d4-ac53-fd7a22e994db-webhook-cert\") pod \"lvms-operator-58846d7f54-zmdrn\" (UID: \"3b912234-92a5-44d4-ac53-fd7a22e994db\") " pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:43.560250 master-0 kubenswrapper[29936]: I1205 12:57:43.559917 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b912234-92a5-44d4-ac53-fd7a22e994db-metrics-cert\") pod \"lvms-operator-58846d7f54-zmdrn\" (UID: \"3b912234-92a5-44d4-ac53-fd7a22e994db\") " pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:43.560250 master-0 kubenswrapper[29936]: I1205 12:57:43.559948 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3b912234-92a5-44d4-ac53-fd7a22e994db-apiservice-cert\") pod \"lvms-operator-58846d7f54-zmdrn\" (UID: \"3b912234-92a5-44d4-ac53-fd7a22e994db\") " pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:43.560250 master-0 kubenswrapper[29936]: I1205 12:57:43.559979 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g7t2\" (UniqueName: \"kubernetes.io/projected/3b912234-92a5-44d4-ac53-fd7a22e994db-kube-api-access-2g7t2\") pod \"lvms-operator-58846d7f54-zmdrn\" (UID: \"3b912234-92a5-44d4-ac53-fd7a22e994db\") " pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:43.560250 master-0 kubenswrapper[29936]: I1205 12:57:43.560000 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/3b912234-92a5-44d4-ac53-fd7a22e994db-socket-dir\") pod \"lvms-operator-58846d7f54-zmdrn\" (UID: \"3b912234-92a5-44d4-ac53-fd7a22e994db\") " pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:43.560868 master-0 kubenswrapper[29936]: I1205 12:57:43.560809 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/3b912234-92a5-44d4-ac53-fd7a22e994db-socket-dir\") pod \"lvms-operator-58846d7f54-zmdrn\" (UID: \"3b912234-92a5-44d4-ac53-fd7a22e994db\") " pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:43.563605 master-0 kubenswrapper[29936]: I1205 12:57:43.563556 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3b912234-92a5-44d4-ac53-fd7a22e994db-apiservice-cert\") pod \"lvms-operator-58846d7f54-zmdrn\" (UID: \"3b912234-92a5-44d4-ac53-fd7a22e994db\") " pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:43.563794 master-0 kubenswrapper[29936]: I1205 12:57:43.563742 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b912234-92a5-44d4-ac53-fd7a22e994db-metrics-cert\") pod \"lvms-operator-58846d7f54-zmdrn\" (UID: \"3b912234-92a5-44d4-ac53-fd7a22e994db\") " pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:43.564624 master-0 kubenswrapper[29936]: I1205 12:57:43.564557 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3b912234-92a5-44d4-ac53-fd7a22e994db-webhook-cert\") pod \"lvms-operator-58846d7f54-zmdrn\" (UID: \"3b912234-92a5-44d4-ac53-fd7a22e994db\") " pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:43.578494 master-0 
kubenswrapper[29936]: I1205 12:57:43.578312 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g7t2\" (UniqueName: \"kubernetes.io/projected/3b912234-92a5-44d4-ac53-fd7a22e994db-kube-api-access-2g7t2\") pod \"lvms-operator-58846d7f54-zmdrn\" (UID: \"3b912234-92a5-44d4-ac53-fd7a22e994db\") " pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:43.640492 master-0 kubenswrapper[29936]: I1205 12:57:43.640384 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:44.032639 master-0 kubenswrapper[29936]: W1205 12:57:44.032567 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b912234_92a5_44d4_ac53_fd7a22e994db.slice/crio-1982539ebc799cb4441320b658ff741b7e4bb02ef591a81d08bac87e252f53f9 WatchSource:0}: Error finding container 1982539ebc799cb4441320b658ff741b7e4bb02ef591a81d08bac87e252f53f9: Status 404 returned error can't find the container with id 1982539ebc799cb4441320b658ff741b7e4bb02ef591a81d08bac87e252f53f9 Dec 05 12:57:44.033722 master-0 kubenswrapper[29936]: I1205 12:57:44.033653 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-58846d7f54-zmdrn"] Dec 05 12:57:44.851870 master-0 kubenswrapper[29936]: I1205 12:57:44.851757 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" event={"ID":"3b912234-92a5-44d4-ac53-fd7a22e994db","Type":"ContainerStarted","Data":"1982539ebc799cb4441320b658ff741b7e4bb02ef591a81d08bac87e252f53f9"} Dec 05 12:57:50.904900 master-0 kubenswrapper[29936]: I1205 12:57:50.904837 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" event={"ID":"3b912234-92a5-44d4-ac53-fd7a22e994db","Type":"ContainerStarted","Data":"7e0bbbe8442689adf5f66174295bcd003cb47a632c1a71491fc6b64c379806bd"} Dec 05 12:57:50.906138 master-0 kubenswrapper[29936]: I1205 12:57:50.906106 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:50.938924 master-0 kubenswrapper[29936]: I1205 12:57:50.938812 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" podStartSLOduration=2.085420431 podStartE2EDuration="7.93878361s" podCreationTimestamp="2025-12-05 12:57:43 +0000 UTC" firstStartedPulling="2025-12-05 12:57:44.035284368 +0000 UTC m=+461.167364049" lastFinishedPulling="2025-12-05 12:57:49.888647557 +0000 UTC m=+467.020727228" observedRunningTime="2025-12-05 12:57:50.931241655 +0000 UTC m=+468.063321356" watchObservedRunningTime="2025-12-05 12:57:50.93878361 +0000 UTC m=+468.070863311" Dec 05 12:57:51.919847 master-0 kubenswrapper[29936]: I1205 12:57:51.919747 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-58846d7f54-zmdrn" Dec 05 12:57:56.206702 master-0 kubenswrapper[29936]: I1205 12:57:56.206604 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr"] Dec 05 12:57:56.208412 master-0 kubenswrapper[29936]: I1205 12:57:56.208376 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" Dec 05 12:57:56.210985 master-0 kubenswrapper[29936]: I1205 12:57:56.210927 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-8ljxd" Dec 05 12:57:56.231483 master-0 kubenswrapper[29936]: I1205 12:57:56.231409 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr"] Dec 05 12:57:56.341666 master-0 kubenswrapper[29936]: I1205 12:57:56.341587 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fe16cfc-afbd-467b-a036-72adcf763aed-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr\" (UID: \"5fe16cfc-afbd-467b-a036-72adcf763aed\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" Dec 05 12:57:56.341924 master-0 kubenswrapper[29936]: I1205 12:57:56.341871 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fe16cfc-afbd-467b-a036-72adcf763aed-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr\" (UID: \"5fe16cfc-afbd-467b-a036-72adcf763aed\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" Dec 05 12:57:56.342018 master-0 kubenswrapper[29936]: I1205 12:57:56.341986 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjxtb\" (UniqueName: \"kubernetes.io/projected/5fe16cfc-afbd-467b-a036-72adcf763aed-kube-api-access-hjxtb\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr\" (UID: \"5fe16cfc-afbd-467b-a036-72adcf763aed\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" Dec 05 12:57:56.448517 master-0 kubenswrapper[29936]: I1205 12:57:56.448426 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjxtb\" (UniqueName: \"kubernetes.io/projected/5fe16cfc-afbd-467b-a036-72adcf763aed-kube-api-access-hjxtb\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr\" (UID: \"5fe16cfc-afbd-467b-a036-72adcf763aed\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" Dec 05 12:57:56.448803 master-0 kubenswrapper[29936]: I1205 12:57:56.448760 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fe16cfc-afbd-467b-a036-72adcf763aed-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr\" (UID: \"5fe16cfc-afbd-467b-a036-72adcf763aed\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" Dec 05 12:57:56.448848 master-0 kubenswrapper[29936]: I1205 12:57:56.448823 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fe16cfc-afbd-467b-a036-72adcf763aed-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr\" (UID: \"5fe16cfc-afbd-467b-a036-72adcf763aed\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" Dec 05 12:57:56.449496 master-0 kubenswrapper[29936]: I1205 12:57:56.449460 29936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fe16cfc-afbd-467b-a036-72adcf763aed-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr\" (UID: \"5fe16cfc-afbd-467b-a036-72adcf763aed\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" Dec 05 12:57:56.449547 master-0 kubenswrapper[29936]: I1205 12:57:56.449507 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fe16cfc-afbd-467b-a036-72adcf763aed-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr\" (UID: \"5fe16cfc-afbd-467b-a036-72adcf763aed\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" Dec 05 12:57:56.470608 master-0 kubenswrapper[29936]: I1205 12:57:56.470494 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjxtb\" (UniqueName: \"kubernetes.io/projected/5fe16cfc-afbd-467b-a036-72adcf763aed-kube-api-access-hjxtb\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr\" (UID: \"5fe16cfc-afbd-467b-a036-72adcf763aed\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" Dec 05 12:57:56.525791 master-0 kubenswrapper[29936]: I1205 12:57:56.525498 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" Dec 05 12:57:57.009767 master-0 kubenswrapper[29936]: I1205 12:57:57.009681 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr"] Dec 05 12:57:57.016277 master-0 kubenswrapper[29936]: W1205 12:57:57.016203 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe16cfc_afbd_467b_a036_72adcf763aed.slice/crio-96f7856a0684977e735eedbdbb32496b2bc10dde9c73f9736aa1dee38b7eef6a WatchSource:0}: Error finding container 96f7856a0684977e735eedbdbb32496b2bc10dde9c73f9736aa1dee38b7eef6a: Status 404 returned error can't find the container with id 96f7856a0684977e735eedbdbb32496b2bc10dde9c73f9736aa1dee38b7eef6a Dec 05 12:57:57.614093 master-0 kubenswrapper[29936]: I1205 12:57:57.614002 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58"] Dec 05 12:57:57.619479 master-0 kubenswrapper[29936]: I1205 12:57:57.619420 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" Dec 05 12:57:57.624634 master-0 kubenswrapper[29936]: I1205 12:57:57.624560 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58"] Dec 05 12:57:57.773117 master-0 kubenswrapper[29936]: I1205 12:57:57.772272 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/156b8940-4ad6-4f94-b787-2943c984d2d7-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58\" (UID: \"156b8940-4ad6-4f94-b787-2943c984d2d7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" Dec 05 12:57:57.773117 master-0 kubenswrapper[29936]: I1205 12:57:57.772362 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br5wx\" (UniqueName: \"kubernetes.io/projected/156b8940-4ad6-4f94-b787-2943c984d2d7-kube-api-access-br5wx\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58\" (UID: \"156b8940-4ad6-4f94-b787-2943c984d2d7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" Dec 05 12:57:57.773117 master-0 kubenswrapper[29936]: I1205 12:57:57.772399 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/156b8940-4ad6-4f94-b787-2943c984d2d7-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58\" (UID: \"156b8940-4ad6-4f94-b787-2943c984d2d7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" Dec 05 12:57:57.874105 master-0 kubenswrapper[29936]: I1205 12:57:57.874041 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/156b8940-4ad6-4f94-b787-2943c984d2d7-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58\" (UID: \"156b8940-4ad6-4f94-b787-2943c984d2d7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" Dec 05 12:57:57.874400 master-0 kubenswrapper[29936]: I1205 12:57:57.874127 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br5wx\" (UniqueName: \"kubernetes.io/projected/156b8940-4ad6-4f94-b787-2943c984d2d7-kube-api-access-br5wx\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58\" (UID: \"156b8940-4ad6-4f94-b787-2943c984d2d7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" Dec 05 12:57:57.874400 master-0 kubenswrapper[29936]: I1205 12:57:57.874331 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/156b8940-4ad6-4f94-b787-2943c984d2d7-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58\" (UID: \"156b8940-4ad6-4f94-b787-2943c984d2d7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" Dec 05 12:57:57.874918 master-0 kubenswrapper[29936]: I1205 12:57:57.874850 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/156b8940-4ad6-4f94-b787-2943c984d2d7-util\") pod 
\"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58\" (UID: \"156b8940-4ad6-4f94-b787-2943c984d2d7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" Dec 05 12:57:57.875103 master-0 kubenswrapper[29936]: I1205 12:57:57.875036 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/156b8940-4ad6-4f94-b787-2943c984d2d7-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58\" (UID: \"156b8940-4ad6-4f94-b787-2943c984d2d7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" Dec 05 12:57:57.890530 master-0 kubenswrapper[29936]: I1205 12:57:57.890446 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br5wx\" (UniqueName: \"kubernetes.io/projected/156b8940-4ad6-4f94-b787-2943c984d2d7-kube-api-access-br5wx\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58\" (UID: \"156b8940-4ad6-4f94-b787-2943c984d2d7\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" Dec 05 12:57:57.938502 master-0 kubenswrapper[29936]: I1205 12:57:57.938409 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" Dec 05 12:57:57.986626 master-0 kubenswrapper[29936]: I1205 12:57:57.986529 29936 generic.go:334] "Generic (PLEG): container finished" podID="5fe16cfc-afbd-467b-a036-72adcf763aed" containerID="4ca391d45c80919fbb39956d4f5c2f6e0baab632dbad571738caaca3b24ee2cf" exitCode=0 Dec 05 12:57:57.986626 master-0 kubenswrapper[29936]: I1205 12:57:57.986610 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" event={"ID":"5fe16cfc-afbd-467b-a036-72adcf763aed","Type":"ContainerDied","Data":"4ca391d45c80919fbb39956d4f5c2f6e0baab632dbad571738caaca3b24ee2cf"} Dec 05 12:57:57.986626 master-0 kubenswrapper[29936]: I1205 12:57:57.986645 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" event={"ID":"5fe16cfc-afbd-467b-a036-72adcf763aed","Type":"ContainerStarted","Data":"96f7856a0684977e735eedbdbb32496b2bc10dde9c73f9736aa1dee38b7eef6a"} Dec 05 12:57:58.350746 master-0 kubenswrapper[29936]: I1205 12:57:58.350664 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58"] Dec 05 12:57:58.353302 master-0 kubenswrapper[29936]: W1205 12:57:58.353242 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod156b8940_4ad6_4f94_b787_2943c984d2d7.slice/crio-137efcb74390d4d48a037dd8c6250060599a7be63c0548b0645063a6fe75f675 WatchSource:0}: Error finding container 137efcb74390d4d48a037dd8c6250060599a7be63c0548b0645063a6fe75f675: Status 404 returned error can't find the container with id 137efcb74390d4d48a037dd8c6250060599a7be63c0548b0645063a6fe75f675 Dec 05 12:57:58.997751 master-0 kubenswrapper[29936]: I1205 12:57:58.997599 29936 generic.go:334] "Generic (PLEG): container finished" podID="156b8940-4ad6-4f94-b787-2943c984d2d7" containerID="5888867dd269c99eb70ebbfb211094a900ff36d425a0386488071f0de01e909f" exitCode=0 Dec 05 12:57:58.997751 master-0 kubenswrapper[29936]: I1205 12:57:58.997675 
29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" event={"ID":"156b8940-4ad6-4f94-b787-2943c984d2d7","Type":"ContainerDied","Data":"5888867dd269c99eb70ebbfb211094a900ff36d425a0386488071f0de01e909f"} Dec 05 12:57:58.998534 master-0 kubenswrapper[29936]: I1205 12:57:58.997794 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" event={"ID":"156b8940-4ad6-4f94-b787-2943c984d2d7","Type":"ContainerStarted","Data":"137efcb74390d4d48a037dd8c6250060599a7be63c0548b0645063a6fe75f675"} Dec 05 12:58:01.018943 master-0 kubenswrapper[29936]: I1205 12:58:01.018828 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" event={"ID":"5fe16cfc-afbd-467b-a036-72adcf763aed","Type":"ContainerStarted","Data":"75f70c05dd860a5606cb68fbf6b7dde07b116fdff19d148076f6ade7d4f61af3"} Dec 05 12:58:02.029543 master-0 kubenswrapper[29936]: I1205 12:58:02.029471 29936 generic.go:334] "Generic (PLEG): container finished" podID="5fe16cfc-afbd-467b-a036-72adcf763aed" containerID="75f70c05dd860a5606cb68fbf6b7dde07b116fdff19d148076f6ade7d4f61af3" exitCode=0 Dec 05 12:58:02.030242 master-0 kubenswrapper[29936]: I1205 12:58:02.029576 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" event={"ID":"5fe16cfc-afbd-467b-a036-72adcf763aed","Type":"ContainerDied","Data":"75f70c05dd860a5606cb68fbf6b7dde07b116fdff19d148076f6ade7d4f61af3"} Dec 05 12:58:02.032101 master-0 kubenswrapper[29936]: I1205 12:58:02.032059 29936 generic.go:334] "Generic (PLEG): container finished" podID="156b8940-4ad6-4f94-b787-2943c984d2d7" containerID="d4724f279e92099f204de9b98b564eb7355855c6a7372ad14655c223eab2f033" exitCode=0 Dec 05 12:58:02.032198 master-0 kubenswrapper[29936]: I1205 12:58:02.032105 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" event={"ID":"156b8940-4ad6-4f94-b787-2943c984d2d7","Type":"ContainerDied","Data":"d4724f279e92099f204de9b98b564eb7355855c6a7372ad14655c223eab2f033"} Dec 05 12:58:03.045081 master-0 kubenswrapper[29936]: I1205 12:58:03.044994 29936 generic.go:334] "Generic (PLEG): container finished" podID="156b8940-4ad6-4f94-b787-2943c984d2d7" containerID="7ad6f9e83ef3f9c6846cf4ce30bc069caac49872f02af428824668a4e5324fab" exitCode=0 Dec 05 12:58:03.046173 master-0 kubenswrapper[29936]: I1205 12:58:03.045082 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" event={"ID":"156b8940-4ad6-4f94-b787-2943c984d2d7","Type":"ContainerDied","Data":"7ad6f9e83ef3f9c6846cf4ce30bc069caac49872f02af428824668a4e5324fab"} Dec 05 12:58:03.049533 master-0 kubenswrapper[29936]: I1205 12:58:03.049447 29936 generic.go:334] "Generic (PLEG): container finished" podID="5fe16cfc-afbd-467b-a036-72adcf763aed" containerID="dd3db6e3e00292374113664d854ddd277f9e9aef38e5e8f698f0bc0d6668d4bd" exitCode=0 Dec 05 12:58:03.049533 master-0 kubenswrapper[29936]: I1205 12:58:03.049497 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" 
event={"ID":"5fe16cfc-afbd-467b-a036-72adcf763aed","Type":"ContainerDied","Data":"dd3db6e3e00292374113664d854ddd277f9e9aef38e5e8f698f0bc0d6668d4bd"} Dec 05 12:58:03.734925 master-0 kubenswrapper[29936]: I1205 12:58:03.734832 29936 scope.go:117] "RemoveContainer" containerID="9e00fa2595fad4ad014a23b8074a3240a2449d47373074fecd9654f334a13fb7" Dec 05 12:58:03.753237 master-0 kubenswrapper[29936]: I1205 12:58:03.753171 29936 scope.go:117] "RemoveContainer" containerID="46a38d7a2db34bb7213e7c44dd4da9930d6a2962fb71de116eced4ef1aa3810e" Dec 05 12:58:03.778867 master-0 kubenswrapper[29936]: I1205 12:58:03.778827 29936 scope.go:117] "RemoveContainer" containerID="613fa64c416308b42c0d2958d3f3712126e52a447f52c90eaabf9bb657dccfd4" Dec 05 12:58:04.483544 master-0 kubenswrapper[29936]: I1205 12:58:04.483482 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" Dec 05 12:58:04.487415 master-0 kubenswrapper[29936]: I1205 12:58:04.487374 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" Dec 05 12:58:04.614026 master-0 kubenswrapper[29936]: I1205 12:58:04.613953 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br5wx\" (UniqueName: \"kubernetes.io/projected/156b8940-4ad6-4f94-b787-2943c984d2d7-kube-api-access-br5wx\") pod \"156b8940-4ad6-4f94-b787-2943c984d2d7\" (UID: \"156b8940-4ad6-4f94-b787-2943c984d2d7\") " Dec 05 12:58:04.614287 master-0 kubenswrapper[29936]: I1205 12:58:04.614080 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjxtb\" (UniqueName: \"kubernetes.io/projected/5fe16cfc-afbd-467b-a036-72adcf763aed-kube-api-access-hjxtb\") pod \"5fe16cfc-afbd-467b-a036-72adcf763aed\" (UID: \"5fe16cfc-afbd-467b-a036-72adcf763aed\") " Dec 05 12:58:04.614287 master-0 kubenswrapper[29936]: I1205 12:58:04.614131 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fe16cfc-afbd-467b-a036-72adcf763aed-bundle\") pod \"5fe16cfc-afbd-467b-a036-72adcf763aed\" (UID: \"5fe16cfc-afbd-467b-a036-72adcf763aed\") " Dec 05 12:58:04.614287 master-0 kubenswrapper[29936]: I1205 12:58:04.614202 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/156b8940-4ad6-4f94-b787-2943c984d2d7-bundle\") pod \"156b8940-4ad6-4f94-b787-2943c984d2d7\" (UID: \"156b8940-4ad6-4f94-b787-2943c984d2d7\") " Dec 05 12:58:04.614287 master-0 kubenswrapper[29936]: I1205 12:58:04.614223 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fe16cfc-afbd-467b-a036-72adcf763aed-util\") pod \"5fe16cfc-afbd-467b-a036-72adcf763aed\" (UID: \"5fe16cfc-afbd-467b-a036-72adcf763aed\") " Dec 05 12:58:04.614287 master-0 kubenswrapper[29936]: I1205 12:58:04.614275 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/156b8940-4ad6-4f94-b787-2943c984d2d7-util\") pod \"156b8940-4ad6-4f94-b787-2943c984d2d7\" (UID: \"156b8940-4ad6-4f94-b787-2943c984d2d7\") " Dec 05 12:58:04.614958 master-0 kubenswrapper[29936]: I1205 12:58:04.614920 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/156b8940-4ad6-4f94-b787-2943c984d2d7-bundle" (OuterVolumeSpecName: "bundle") pod "156b8940-4ad6-4f94-b787-2943c984d2d7" (UID: "156b8940-4ad6-4f94-b787-2943c984d2d7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:58:04.615558 master-0 kubenswrapper[29936]: I1205 12:58:04.615523 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fe16cfc-afbd-467b-a036-72adcf763aed-bundle" (OuterVolumeSpecName: "bundle") pod "5fe16cfc-afbd-467b-a036-72adcf763aed" (UID: "5fe16cfc-afbd-467b-a036-72adcf763aed"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:58:04.618627 master-0 kubenswrapper[29936]: I1205 12:58:04.618586 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe16cfc-afbd-467b-a036-72adcf763aed-kube-api-access-hjxtb" (OuterVolumeSpecName: "kube-api-access-hjxtb") pod "5fe16cfc-afbd-467b-a036-72adcf763aed" (UID: "5fe16cfc-afbd-467b-a036-72adcf763aed"). InnerVolumeSpecName "kube-api-access-hjxtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:58:04.619677 master-0 kubenswrapper[29936]: I1205 12:58:04.619599 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/156b8940-4ad6-4f94-b787-2943c984d2d7-kube-api-access-br5wx" (OuterVolumeSpecName: "kube-api-access-br5wx") pod "156b8940-4ad6-4f94-b787-2943c984d2d7" (UID: "156b8940-4ad6-4f94-b787-2943c984d2d7"). InnerVolumeSpecName "kube-api-access-br5wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:58:04.625319 master-0 kubenswrapper[29936]: I1205 12:58:04.625246 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fe16cfc-afbd-467b-a036-72adcf763aed-util" (OuterVolumeSpecName: "util") pod "5fe16cfc-afbd-467b-a036-72adcf763aed" (UID: "5fe16cfc-afbd-467b-a036-72adcf763aed"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:58:04.626651 master-0 kubenswrapper[29936]: I1205 12:58:04.626561 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/156b8940-4ad6-4f94-b787-2943c984d2d7-util" (OuterVolumeSpecName: "util") pod "156b8940-4ad6-4f94-b787-2943c984d2d7" (UID: "156b8940-4ad6-4f94-b787-2943c984d2d7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:58:04.717342 master-0 kubenswrapper[29936]: I1205 12:58:04.717221 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjxtb\" (UniqueName: \"kubernetes.io/projected/5fe16cfc-afbd-467b-a036-72adcf763aed-kube-api-access-hjxtb\") on node \"master-0\" DevicePath \"\"" Dec 05 12:58:04.717342 master-0 kubenswrapper[29936]: I1205 12:58:04.717295 29936 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fe16cfc-afbd-467b-a036-72adcf763aed-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 12:58:04.717342 master-0 kubenswrapper[29936]: I1205 12:58:04.717311 29936 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/156b8940-4ad6-4f94-b787-2943c984d2d7-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 12:58:04.717342 master-0 kubenswrapper[29936]: I1205 12:58:04.717321 29936 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fe16cfc-afbd-467b-a036-72adcf763aed-util\") on node \"master-0\" DevicePath \"\"" Dec 05 12:58:04.717342 master-0 kubenswrapper[29936]: I1205 12:58:04.717330 29936 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/156b8940-4ad6-4f94-b787-2943c984d2d7-util\") on node \"master-0\" DevicePath \"\"" Dec 05 12:58:04.717342 master-0 kubenswrapper[29936]: I1205 12:58:04.717339 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br5wx\" (UniqueName: \"kubernetes.io/projected/156b8940-4ad6-4f94-b787-2943c984d2d7-kube-api-access-br5wx\") on node \"master-0\" DevicePath \"\"" Dec 05 12:58:05.074353 master-0 kubenswrapper[29936]: I1205 12:58:05.074154 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" event={"ID":"156b8940-4ad6-4f94-b787-2943c984d2d7","Type":"ContainerDied","Data":"137efcb74390d4d48a037dd8c6250060599a7be63c0548b0645063a6fe75f675"} Dec 05 12:58:05.074353 master-0 kubenswrapper[29936]: I1205 12:58:05.074238 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="137efcb74390d4d48a037dd8c6250060599a7be63c0548b0645063a6fe75f675" Dec 05 12:58:05.074702 master-0 kubenswrapper[29936]: I1205 12:58:05.074655 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f4gg58" Dec 05 12:58:05.076741 master-0 kubenswrapper[29936]: I1205 12:58:05.076658 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" event={"ID":"5fe16cfc-afbd-467b-a036-72adcf763aed","Type":"ContainerDied","Data":"96f7856a0684977e735eedbdbb32496b2bc10dde9c73f9736aa1dee38b7eef6a"} Dec 05 12:58:05.076741 master-0 kubenswrapper[29936]: I1205 12:58:05.076706 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96f7856a0684977e735eedbdbb32496b2bc10dde9c73f9736aa1dee38b7eef6a" Dec 05 12:58:05.076741 master-0 kubenswrapper[29936]: I1205 12:58:05.076718 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a2gvwr" Dec 05 12:58:06.400317 master-0 kubenswrapper[29936]: I1205 12:58:06.400209 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2"] Dec 05 12:58:06.401402 master-0 kubenswrapper[29936]: E1205 12:58:06.401261 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fe16cfc-afbd-467b-a036-72adcf763aed" containerName="pull" Dec 05 12:58:06.401402 master-0 kubenswrapper[29936]: I1205 12:58:06.401286 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe16cfc-afbd-467b-a036-72adcf763aed" containerName="pull" Dec 05 12:58:06.401402 master-0 kubenswrapper[29936]: E1205 12:58:06.401326 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fe16cfc-afbd-467b-a036-72adcf763aed" containerName="extract" Dec 05 12:58:06.401402 master-0 kubenswrapper[29936]: I1205 12:58:06.401336 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe16cfc-afbd-467b-a036-72adcf763aed" containerName="extract" Dec 05 12:58:06.401402 master-0 kubenswrapper[29936]: E1205 12:58:06.401355 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fe16cfc-afbd-467b-a036-72adcf763aed" containerName="util" Dec 05 12:58:06.401402 master-0 kubenswrapper[29936]: I1205 12:58:06.401365 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe16cfc-afbd-467b-a036-72adcf763aed" containerName="util" Dec 05 12:58:06.401830 master-0 kubenswrapper[29936]: E1205 12:58:06.401387 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156b8940-4ad6-4f94-b787-2943c984d2d7" containerName="extract" Dec 05 12:58:06.401830 master-0 kubenswrapper[29936]: I1205 12:58:06.401463 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="156b8940-4ad6-4f94-b787-2943c984d2d7" containerName="extract" Dec 05 12:58:06.401830 master-0 kubenswrapper[29936]: E1205 12:58:06.401485 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156b8940-4ad6-4f94-b787-2943c984d2d7" containerName="pull" Dec 05 12:58:06.401830 master-0 kubenswrapper[29936]: I1205 12:58:06.401494 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="156b8940-4ad6-4f94-b787-2943c984d2d7" containerName="pull" Dec 05 12:58:06.401830 master-0 kubenswrapper[29936]: E1205 12:58:06.401546 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156b8940-4ad6-4f94-b787-2943c984d2d7" containerName="util" Dec 05 12:58:06.401830 master-0 kubenswrapper[29936]: I1205 12:58:06.401557 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="156b8940-4ad6-4f94-b787-2943c984d2d7" containerName="util" Dec 05 12:58:06.402086 master-0 kubenswrapper[29936]: I1205 12:58:06.401940 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fe16cfc-afbd-467b-a036-72adcf763aed" containerName="extract" Dec 05 12:58:06.402086 master-0 kubenswrapper[29936]: I1205 12:58:06.402007 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="156b8940-4ad6-4f94-b787-2943c984d2d7" containerName="extract" Dec 05 12:58:06.413065 master-0 kubenswrapper[29936]: I1205 12:58:06.408990 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" Dec 05 12:58:06.413065 master-0 kubenswrapper[29936]: I1205 12:58:06.412693 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-8ljxd" Dec 05 12:58:06.430573 master-0 kubenswrapper[29936]: I1205 12:58:06.430420 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2"] Dec 05 12:58:06.555689 master-0 kubenswrapper[29936]: I1205 12:58:06.555595 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2\" (UID: \"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" Dec 05 12:58:06.555689 master-0 kubenswrapper[29936]: I1205 12:58:06.555680 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prjr8\" (UniqueName: \"kubernetes.io/projected/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-kube-api-access-prjr8\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2\" (UID: \"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" Dec 05 12:58:06.556016 master-0 kubenswrapper[29936]: I1205 12:58:06.555819 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2\" (UID: \"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" Dec 05 12:58:06.658569 master-0 kubenswrapper[29936]: I1205 12:58:06.658339 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2\" (UID: \"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" Dec 05 12:58:06.658569 master-0 kubenswrapper[29936]: I1205 12:58:06.658443 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prjr8\" (UniqueName: \"kubernetes.io/projected/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-kube-api-access-prjr8\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2\" (UID: \"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" Dec 05 12:58:06.658569 master-0 kubenswrapper[29936]: I1205 12:58:06.658489 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2\" (UID: \"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" Dec 05 12:58:06.659031 master-0 kubenswrapper[29936]: I1205 12:58:06.658976 29936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2\" (UID: \"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" Dec 05 12:58:06.659079 master-0 kubenswrapper[29936]: I1205 12:58:06.659032 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2\" (UID: \"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" Dec 05 12:58:06.705520 master-0 kubenswrapper[29936]: I1205 12:58:06.705331 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prjr8\" (UniqueName: \"kubernetes.io/projected/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-kube-api-access-prjr8\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2\" (UID: \"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" Dec 05 12:58:06.736939 master-0 kubenswrapper[29936]: I1205 12:58:06.736827 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" Dec 05 12:58:07.215871 master-0 kubenswrapper[29936]: W1205 12:58:07.215786 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f0bbdb0_a70d_4ca4_b1cc_e6c18a816b3e.slice/crio-acaa87dae6cd4806d225923afb44de88c54da103014a9741e4a7b32701a43ceb WatchSource:0}: Error finding container acaa87dae6cd4806d225923afb44de88c54da103014a9741e4a7b32701a43ceb: Status 404 returned error can't find the container with id acaa87dae6cd4806d225923afb44de88c54da103014a9741e4a7b32701a43ceb Dec 05 12:58:07.230747 master-0 kubenswrapper[29936]: I1205 12:58:07.230649 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2"] Dec 05 12:58:08.104139 master-0 kubenswrapper[29936]: I1205 12:58:08.104061 29936 generic.go:334] "Generic (PLEG): container finished" podID="2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e" containerID="2b3d8d4cae9c0d6accd7bcdce4cf018f5c83e32deeb152d1f5eefdbb1af6c642" exitCode=0 Dec 05 12:58:08.104734 master-0 kubenswrapper[29936]: I1205 12:58:08.104141 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" event={"ID":"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e","Type":"ContainerDied","Data":"2b3d8d4cae9c0d6accd7bcdce4cf018f5c83e32deeb152d1f5eefdbb1af6c642"} Dec 05 12:58:08.104734 master-0 kubenswrapper[29936]: I1205 12:58:08.104210 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" event={"ID":"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e","Type":"ContainerStarted","Data":"acaa87dae6cd4806d225923afb44de88c54da103014a9741e4a7b32701a43ceb"} Dec 05 12:58:10.126082 master-0 kubenswrapper[29936]: I1205 12:58:10.125941 29936 generic.go:334] "Generic (PLEG): container finished" podID="2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e" 
containerID="01ec259eaef940600ac7edb7f4d219b02a5ddf01d3e8f4eecc7e52e52da4982a" exitCode=0 Dec 05 12:58:10.126833 master-0 kubenswrapper[29936]: I1205 12:58:10.126107 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" event={"ID":"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e","Type":"ContainerDied","Data":"01ec259eaef940600ac7edb7f4d219b02a5ddf01d3e8f4eecc7e52e52da4982a"} Dec 05 12:58:10.441524 master-0 kubenswrapper[29936]: I1205 12:58:10.441462 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-nk6hw"] Dec 05 12:58:10.442689 master-0 kubenswrapper[29936]: I1205 12:58:10.442664 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-nk6hw" Dec 05 12:58:10.448463 master-0 kubenswrapper[29936]: I1205 12:58:10.448417 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 05 12:58:10.461946 master-0 kubenswrapper[29936]: I1205 12:58:10.461892 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 05 12:58:10.465042 master-0 kubenswrapper[29936]: I1205 12:58:10.464985 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-nk6hw"] Dec 05 12:58:10.530095 master-0 kubenswrapper[29936]: I1205 12:58:10.530015 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8751ad34-0564-4af9-813c-a061278eb1f8-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-nk6hw\" (UID: \"8751ad34-0564-4af9-813c-a061278eb1f8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-nk6hw" Dec 05 12:58:10.530411 master-0 kubenswrapper[29936]: I1205 12:58:10.530138 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtlhj\" (UniqueName: \"kubernetes.io/projected/8751ad34-0564-4af9-813c-a061278eb1f8-kube-api-access-vtlhj\") pod \"cert-manager-operator-controller-manager-64cf6dff88-nk6hw\" (UID: \"8751ad34-0564-4af9-813c-a061278eb1f8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-nk6hw" Dec 05 12:58:10.631692 master-0 kubenswrapper[29936]: I1205 12:58:10.631623 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtlhj\" (UniqueName: \"kubernetes.io/projected/8751ad34-0564-4af9-813c-a061278eb1f8-kube-api-access-vtlhj\") pod \"cert-manager-operator-controller-manager-64cf6dff88-nk6hw\" (UID: \"8751ad34-0564-4af9-813c-a061278eb1f8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-nk6hw" Dec 05 12:58:10.631808 master-0 kubenswrapper[29936]: I1205 12:58:10.631793 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8751ad34-0564-4af9-813c-a061278eb1f8-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-nk6hw\" (UID: \"8751ad34-0564-4af9-813c-a061278eb1f8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-nk6hw" Dec 05 12:58:10.633070 master-0 kubenswrapper[29936]: I1205 12:58:10.632497 29936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8751ad34-0564-4af9-813c-a061278eb1f8-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-nk6hw\" (UID: \"8751ad34-0564-4af9-813c-a061278eb1f8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-nk6hw" Dec 05 12:58:10.668206 master-0 kubenswrapper[29936]: I1205 12:58:10.661088 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtlhj\" (UniqueName: \"kubernetes.io/projected/8751ad34-0564-4af9-813c-a061278eb1f8-kube-api-access-vtlhj\") pod \"cert-manager-operator-controller-manager-64cf6dff88-nk6hw\" (UID: \"8751ad34-0564-4af9-813c-a061278eb1f8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-nk6hw" Dec 05 12:58:10.760321 master-0 kubenswrapper[29936]: I1205 12:58:10.760150 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-nk6hw" Dec 05 12:58:11.150074 master-0 kubenswrapper[29936]: I1205 12:58:11.149836 29936 generic.go:334] "Generic (PLEG): container finished" podID="2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e" containerID="0e939e69dc348881c11616268a65366162040fb76444132106a747ece75297f8" exitCode=0 Dec 05 12:58:11.150074 master-0 kubenswrapper[29936]: I1205 12:58:11.149909 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" event={"ID":"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e","Type":"ContainerDied","Data":"0e939e69dc348881c11616268a65366162040fb76444132106a747ece75297f8"} Dec 05 12:58:11.258516 master-0 kubenswrapper[29936]: I1205 12:58:11.258424 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-nk6hw"] Dec 05 12:58:12.169212 master-0 kubenswrapper[29936]: I1205 12:58:12.164842 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-nk6hw" event={"ID":"8751ad34-0564-4af9-813c-a061278eb1f8","Type":"ContainerStarted","Data":"5ddb0e53a1bb9ba2b3c25c5a1c051346ff7ff7c67c8aea536ee653f5ab3467b0"} Dec 05 12:58:12.533703 master-0 kubenswrapper[29936]: I1205 12:58:12.533629 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" Dec 05 12:58:12.673252 master-0 kubenswrapper[29936]: I1205 12:58:12.672724 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-bundle\") pod \"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e\" (UID: \"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e\") " Dec 05 12:58:12.673252 master-0 kubenswrapper[29936]: I1205 12:58:12.672939 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-util\") pod \"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e\" (UID: \"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e\") " Dec 05 12:58:12.673252 master-0 kubenswrapper[29936]: I1205 12:58:12.672974 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prjr8\" (UniqueName: \"kubernetes.io/projected/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-kube-api-access-prjr8\") pod \"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e\" (UID: \"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e\") " Dec 05 12:58:12.680087 master-0 kubenswrapper[29936]: I1205 12:58:12.676611 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-bundle" (OuterVolumeSpecName: "bundle") pod "2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e" (UID: "2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:58:12.700911 master-0 kubenswrapper[29936]: I1205 12:58:12.700196 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-util" (OuterVolumeSpecName: "util") pod "2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e" (UID: "2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:58:12.704552 master-0 kubenswrapper[29936]: I1205 12:58:12.702338 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-kube-api-access-prjr8" (OuterVolumeSpecName: "kube-api-access-prjr8") pod "2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e" (UID: "2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e"). InnerVolumeSpecName "kube-api-access-prjr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:58:12.778515 master-0 kubenswrapper[29936]: I1205 12:58:12.776319 29936 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-util\") on node \"master-0\" DevicePath \"\"" Dec 05 12:58:12.778515 master-0 kubenswrapper[29936]: I1205 12:58:12.776375 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prjr8\" (UniqueName: \"kubernetes.io/projected/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-kube-api-access-prjr8\") on node \"master-0\" DevicePath \"\"" Dec 05 12:58:12.778515 master-0 kubenswrapper[29936]: I1205 12:58:12.776386 29936 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 12:58:12.912897 master-0 kubenswrapper[29936]: I1205 12:58:12.912821 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt"] Dec 05 12:58:12.913232 master-0 kubenswrapper[29936]: E1205 12:58:12.913170 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e" containerName="extract" Dec 05 12:58:12.913232 master-0 kubenswrapper[29936]: I1205 12:58:12.913209 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e" containerName="extract" Dec 05 12:58:12.913232 master-0 kubenswrapper[29936]: E1205 12:58:12.913226 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e" containerName="util" Dec 05 12:58:12.913232 master-0 kubenswrapper[29936]: I1205 12:58:12.913235 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e" containerName="util" Dec 05 12:58:12.913384 master-0 kubenswrapper[29936]: E1205 12:58:12.913280 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e" containerName="pull" Dec 05 12:58:12.913384 master-0 kubenswrapper[29936]: I1205 12:58:12.913287 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e" containerName="pull" Dec 05 12:58:12.916609 master-0 kubenswrapper[29936]: I1205 12:58:12.916585 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e" containerName="extract" Dec 05 12:58:12.918550 master-0 kubenswrapper[29936]: I1205 12:58:12.918501 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" Dec 05 12:58:12.935311 master-0 kubenswrapper[29936]: I1205 12:58:12.933325 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt"] Dec 05 12:58:13.055768 master-0 kubenswrapper[29936]: I1205 12:58:13.055102 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-rxnvm"] Dec 05 12:58:13.057212 master-0 kubenswrapper[29936]: I1205 12:58:13.056389 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-rxnvm" Dec 05 12:58:13.060621 master-0 kubenswrapper[29936]: I1205 12:58:13.058969 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 05 12:58:13.065265 master-0 kubenswrapper[29936]: I1205 12:58:13.065060 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 05 12:58:13.069144 master-0 kubenswrapper[29936]: I1205 12:58:13.068710 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-rxnvm"] Dec 05 12:58:13.092258 master-0 kubenswrapper[29936]: I1205 12:58:13.086570 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0da32e96-eaaa-42a5-98d3-4be09def9381-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt\" (UID: \"0da32e96-eaaa-42a5-98d3-4be09def9381\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" Dec 05 12:58:13.092258 master-0 kubenswrapper[29936]: I1205 12:58:13.086803 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5xtk\" (UniqueName: \"kubernetes.io/projected/0da32e96-eaaa-42a5-98d3-4be09def9381-kube-api-access-h5xtk\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt\" (UID: \"0da32e96-eaaa-42a5-98d3-4be09def9381\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" Dec 05 12:58:13.092258 master-0 kubenswrapper[29936]: I1205 12:58:13.087478 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0da32e96-eaaa-42a5-98d3-4be09def9381-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt\" (UID: \"0da32e96-eaaa-42a5-98d3-4be09def9381\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" Dec 05 12:58:13.175961 master-0 kubenswrapper[29936]: I1205 12:58:13.175714 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" event={"ID":"2f0bbdb0-a70d-4ca4-b1cc-e6c18a816b3e","Type":"ContainerDied","Data":"acaa87dae6cd4806d225923afb44de88c54da103014a9741e4a7b32701a43ceb"} Dec 05 12:58:13.175961 master-0 kubenswrapper[29936]: I1205 12:58:13.175770 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acaa87dae6cd4806d225923afb44de88c54da103014a9741e4a7b32701a43ceb" Dec 05 12:58:13.175961 master-0 kubenswrapper[29936]: I1205 12:58:13.175797 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108v4b2" Dec 05 12:58:13.190301 master-0 kubenswrapper[29936]: I1205 12:58:13.190207 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0da32e96-eaaa-42a5-98d3-4be09def9381-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt\" (UID: \"0da32e96-eaaa-42a5-98d3-4be09def9381\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" Dec 05 12:58:13.190536 master-0 kubenswrapper[29936]: I1205 12:58:13.190341 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0da32e96-eaaa-42a5-98d3-4be09def9381-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt\" (UID: \"0da32e96-eaaa-42a5-98d3-4be09def9381\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" Dec 05 12:58:13.190536 master-0 kubenswrapper[29936]: I1205 12:58:13.190383 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5945\" (UniqueName: \"kubernetes.io/projected/907064b3-d6a6-447a-902f-90b5b8e9bfc7-kube-api-access-g5945\") pod \"nmstate-operator-5b5b58f5c8-rxnvm\" (UID: \"907064b3-d6a6-447a-902f-90b5b8e9bfc7\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-rxnvm" Dec 05 12:58:13.190536 master-0 kubenswrapper[29936]: I1205 12:58:13.190414 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5xtk\" (UniqueName: \"kubernetes.io/projected/0da32e96-eaaa-42a5-98d3-4be09def9381-kube-api-access-h5xtk\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt\" (UID: \"0da32e96-eaaa-42a5-98d3-4be09def9381\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" Dec 05 12:58:13.197283 master-0 kubenswrapper[29936]: I1205 12:58:13.191501 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0da32e96-eaaa-42a5-98d3-4be09def9381-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt\" (UID: \"0da32e96-eaaa-42a5-98d3-4be09def9381\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" Dec 05 12:58:13.197283 master-0 kubenswrapper[29936]: I1205 12:58:13.192068 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0da32e96-eaaa-42a5-98d3-4be09def9381-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt\" (UID: \"0da32e96-eaaa-42a5-98d3-4be09def9381\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" Dec 05 12:58:13.230364 master-0 kubenswrapper[29936]: I1205 12:58:13.228999 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5xtk\" (UniqueName: \"kubernetes.io/projected/0da32e96-eaaa-42a5-98d3-4be09def9381-kube-api-access-h5xtk\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt\" (UID: \"0da32e96-eaaa-42a5-98d3-4be09def9381\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" Dec 05 12:58:13.247287 master-0 kubenswrapper[29936]: I1205 12:58:13.247209 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" Dec 05 12:58:13.307936 master-0 kubenswrapper[29936]: I1205 12:58:13.297001 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5945\" (UniqueName: \"kubernetes.io/projected/907064b3-d6a6-447a-902f-90b5b8e9bfc7-kube-api-access-g5945\") pod \"nmstate-operator-5b5b58f5c8-rxnvm\" (UID: \"907064b3-d6a6-447a-902f-90b5b8e9bfc7\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-rxnvm" Dec 05 12:58:13.329807 master-0 kubenswrapper[29936]: I1205 12:58:13.329733 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5945\" (UniqueName: \"kubernetes.io/projected/907064b3-d6a6-447a-902f-90b5b8e9bfc7-kube-api-access-g5945\") pod \"nmstate-operator-5b5b58f5c8-rxnvm\" (UID: \"907064b3-d6a6-447a-902f-90b5b8e9bfc7\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-rxnvm" Dec 05 12:58:13.382903 master-0 kubenswrapper[29936]: I1205 12:58:13.382825 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-rxnvm" Dec 05 12:58:13.735688 master-0 kubenswrapper[29936]: I1205 12:58:13.735633 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt"] Dec 05 12:58:13.741480 master-0 kubenswrapper[29936]: W1205 12:58:13.741416 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0da32e96_eaaa_42a5_98d3_4be09def9381.slice/crio-f7c5ce84e92b85eda246c0544803669e75fcbfc0b81166bdbf260f78bef76989 WatchSource:0}: Error finding container f7c5ce84e92b85eda246c0544803669e75fcbfc0b81166bdbf260f78bef76989: Status 404 returned error can't find the container with id f7c5ce84e92b85eda246c0544803669e75fcbfc0b81166bdbf260f78bef76989 Dec 05 12:58:13.860245 master-0 kubenswrapper[29936]: I1205 12:58:13.858708 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-rxnvm"] Dec 05 12:58:13.881144 master-0 kubenswrapper[29936]: W1205 12:58:13.881071 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod907064b3_d6a6_447a_902f_90b5b8e9bfc7.slice/crio-f32c99854f0f38a55c680d0d979937fc4b57d48dd34e9296ed8ab04e4a367853 WatchSource:0}: Error finding container f32c99854f0f38a55c680d0d979937fc4b57d48dd34e9296ed8ab04e4a367853: Status 404 returned error can't find the container with id f32c99854f0f38a55c680d0d979937fc4b57d48dd34e9296ed8ab04e4a367853 Dec 05 12:58:14.187359 master-0 kubenswrapper[29936]: I1205 12:58:14.187293 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-rxnvm" event={"ID":"907064b3-d6a6-447a-902f-90b5b8e9bfc7","Type":"ContainerStarted","Data":"f32c99854f0f38a55c680d0d979937fc4b57d48dd34e9296ed8ab04e4a367853"} Dec 05 12:58:14.195684 master-0 kubenswrapper[29936]: I1205 12:58:14.189577 29936 generic.go:334] "Generic (PLEG): container finished" podID="0da32e96-eaaa-42a5-98d3-4be09def9381" containerID="633f76e635e7f863d9d34d67c3714a8745e6ce0a56b1a13548d955ae08a00169" exitCode=0 Dec 05 12:58:14.195684 master-0 kubenswrapper[29936]: I1205 12:58:14.189608 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" 
event={"ID":"0da32e96-eaaa-42a5-98d3-4be09def9381","Type":"ContainerDied","Data":"633f76e635e7f863d9d34d67c3714a8745e6ce0a56b1a13548d955ae08a00169"} Dec 05 12:58:14.195684 master-0 kubenswrapper[29936]: I1205 12:58:14.189647 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" event={"ID":"0da32e96-eaaa-42a5-98d3-4be09def9381","Type":"ContainerStarted","Data":"f7c5ce84e92b85eda246c0544803669e75fcbfc0b81166bdbf260f78bef76989"} Dec 05 12:58:20.246177 master-0 kubenswrapper[29936]: I1205 12:58:20.246089 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-nk6hw" event={"ID":"8751ad34-0564-4af9-813c-a061278eb1f8","Type":"ContainerStarted","Data":"5b82a3a84b80b0e7bca22440f2dec963b36c7b1d4bfc313c5715c7500362c5d8"} Dec 05 12:58:20.250157 master-0 kubenswrapper[29936]: I1205 12:58:20.250095 29936 generic.go:334] "Generic (PLEG): container finished" podID="0da32e96-eaaa-42a5-98d3-4be09def9381" containerID="e32585a4474d4c36b1048600251be92ac53571fc1f16f6aea947ba337c6a939d" exitCode=0 Dec 05 12:58:20.250251 master-0 kubenswrapper[29936]: I1205 12:58:20.250172 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" event={"ID":"0da32e96-eaaa-42a5-98d3-4be09def9381","Type":"ContainerDied","Data":"e32585a4474d4c36b1048600251be92ac53571fc1f16f6aea947ba337c6a939d"} Dec 05 12:58:20.293673 master-0 kubenswrapper[29936]: I1205 12:58:20.293575 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-nk6hw" podStartSLOduration=2.21862078 podStartE2EDuration="10.293557263s" podCreationTimestamp="2025-12-05 12:58:10 +0000 UTC" firstStartedPulling="2025-12-05 12:58:11.265679746 +0000 UTC m=+488.397759657" lastFinishedPulling="2025-12-05 12:58:19.340616459 +0000 UTC m=+496.472696140" observedRunningTime="2025-12-05 12:58:20.289963515 +0000 UTC m=+497.422043196" watchObservedRunningTime="2025-12-05 12:58:20.293557263 +0000 UTC m=+497.425636944" Dec 05 12:58:21.266708 master-0 kubenswrapper[29936]: I1205 12:58:21.266625 29936 generic.go:334] "Generic (PLEG): container finished" podID="0da32e96-eaaa-42a5-98d3-4be09def9381" containerID="4e27b500190b8f2ba3f3cf79d22d251368f8bda5294f849bead263eb2320b623" exitCode=0 Dec 05 12:58:21.268927 master-0 kubenswrapper[29936]: I1205 12:58:21.266705 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" event={"ID":"0da32e96-eaaa-42a5-98d3-4be09def9381","Type":"ContainerDied","Data":"4e27b500190b8f2ba3f3cf79d22d251368f8bda5294f849bead263eb2320b623"} Dec 05 12:58:23.271310 master-0 kubenswrapper[29936]: I1205 12:58:23.269116 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" Dec 05 12:58:23.302213 master-0 kubenswrapper[29936]: I1205 12:58:23.300829 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-rxnvm" event={"ID":"907064b3-d6a6-447a-902f-90b5b8e9bfc7","Type":"ContainerStarted","Data":"cf68b96b98cfaca2c9772aff93e9f73a26ce89e9f258024fb4ed66e64fec1451"} Dec 05 12:58:23.311265 master-0 kubenswrapper[29936]: I1205 12:58:23.310409 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" event={"ID":"0da32e96-eaaa-42a5-98d3-4be09def9381","Type":"ContainerDied","Data":"f7c5ce84e92b85eda246c0544803669e75fcbfc0b81166bdbf260f78bef76989"} Dec 05 12:58:23.311265 master-0 kubenswrapper[29936]: I1205 12:58:23.310496 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7c5ce84e92b85eda246c0544803669e75fcbfc0b81166bdbf260f78bef76989" Dec 05 12:58:23.311265 master-0 kubenswrapper[29936]: I1205 12:58:23.310636 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83zzmwt" Dec 05 12:58:23.378695 master-0 kubenswrapper[29936]: I1205 12:58:23.378288 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0da32e96-eaaa-42a5-98d3-4be09def9381-util\") pod \"0da32e96-eaaa-42a5-98d3-4be09def9381\" (UID: \"0da32e96-eaaa-42a5-98d3-4be09def9381\") " Dec 05 12:58:23.378695 master-0 kubenswrapper[29936]: I1205 12:58:23.378420 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5xtk\" (UniqueName: \"kubernetes.io/projected/0da32e96-eaaa-42a5-98d3-4be09def9381-kube-api-access-h5xtk\") pod \"0da32e96-eaaa-42a5-98d3-4be09def9381\" (UID: \"0da32e96-eaaa-42a5-98d3-4be09def9381\") " Dec 05 12:58:23.378695 master-0 kubenswrapper[29936]: I1205 12:58:23.378580 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0da32e96-eaaa-42a5-98d3-4be09def9381-bundle\") pod \"0da32e96-eaaa-42a5-98d3-4be09def9381\" (UID: \"0da32e96-eaaa-42a5-98d3-4be09def9381\") " Dec 05 12:58:23.406220 master-0 kubenswrapper[29936]: I1205 12:58:23.396139 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0da32e96-eaaa-42a5-98d3-4be09def9381-bundle" (OuterVolumeSpecName: "bundle") pod "0da32e96-eaaa-42a5-98d3-4be09def9381" (UID: "0da32e96-eaaa-42a5-98d3-4be09def9381"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:58:23.406220 master-0 kubenswrapper[29936]: I1205 12:58:23.401642 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0da32e96-eaaa-42a5-98d3-4be09def9381-kube-api-access-h5xtk" (OuterVolumeSpecName: "kube-api-access-h5xtk") pod "0da32e96-eaaa-42a5-98d3-4be09def9381" (UID: "0da32e96-eaaa-42a5-98d3-4be09def9381"). InnerVolumeSpecName "kube-api-access-h5xtk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:58:23.423301 master-0 kubenswrapper[29936]: I1205 12:58:23.421688 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0da32e96-eaaa-42a5-98d3-4be09def9381-util" (OuterVolumeSpecName: "util") pod "0da32e96-eaaa-42a5-98d3-4be09def9381" (UID: "0da32e96-eaaa-42a5-98d3-4be09def9381"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:58:23.485285 master-0 kubenswrapper[29936]: I1205 12:58:23.481180 29936 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0da32e96-eaaa-42a5-98d3-4be09def9381-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 12:58:23.485285 master-0 kubenswrapper[29936]: I1205 12:58:23.481241 29936 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0da32e96-eaaa-42a5-98d3-4be09def9381-util\") on node \"master-0\" DevicePath \"\"" Dec 05 12:58:23.485285 master-0 kubenswrapper[29936]: I1205 12:58:23.481255 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5xtk\" (UniqueName: \"kubernetes.io/projected/0da32e96-eaaa-42a5-98d3-4be09def9381-kube-api-access-h5xtk\") on node \"master-0\" DevicePath \"\"" Dec 05 12:58:23.920239 master-0 kubenswrapper[29936]: I1205 12:58:23.920025 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-rxnvm" podStartSLOduration=2.409573363 podStartE2EDuration="10.920001717s" podCreationTimestamp="2025-12-05 12:58:13 +0000 UTC" firstStartedPulling="2025-12-05 12:58:13.884529136 +0000 UTC m=+491.016608807" lastFinishedPulling="2025-12-05 12:58:22.39495748 +0000 UTC m=+499.527037161" observedRunningTime="2025-12-05 12:58:23.373650895 +0000 UTC m=+500.505730566" watchObservedRunningTime="2025-12-05 12:58:23.920001717 +0000 UTC m=+501.052081398" Dec 05 12:58:23.924693 master-0 kubenswrapper[29936]: I1205 12:58:23.924640 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-fxz9v"] Dec 05 12:58:23.925047 master-0 kubenswrapper[29936]: E1205 12:58:23.925019 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da32e96-eaaa-42a5-98d3-4be09def9381" containerName="util" Dec 05 12:58:23.925047 master-0 kubenswrapper[29936]: I1205 12:58:23.925038 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da32e96-eaaa-42a5-98d3-4be09def9381" containerName="util" Dec 05 12:58:23.925193 master-0 kubenswrapper[29936]: E1205 12:58:23.925053 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da32e96-eaaa-42a5-98d3-4be09def9381" containerName="pull" Dec 05 12:58:23.925193 master-0 kubenswrapper[29936]: I1205 12:58:23.925062 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da32e96-eaaa-42a5-98d3-4be09def9381" containerName="pull" Dec 05 12:58:23.925193 master-0 kubenswrapper[29936]: E1205 12:58:23.925081 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da32e96-eaaa-42a5-98d3-4be09def9381" containerName="extract" Dec 05 12:58:23.925193 master-0 kubenswrapper[29936]: I1205 12:58:23.925088 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da32e96-eaaa-42a5-98d3-4be09def9381" containerName="extract" Dec 05 12:58:23.925354 master-0 kubenswrapper[29936]: I1205 12:58:23.925284 29936 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0da32e96-eaaa-42a5-98d3-4be09def9381" containerName="extract" Dec 05 12:58:23.925939 master-0 kubenswrapper[29936]: I1205 12:58:23.925906 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-fxz9v" Dec 05 12:58:23.932655 master-0 kubenswrapper[29936]: I1205 12:58:23.932594 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 05 12:58:23.937068 master-0 kubenswrapper[29936]: I1205 12:58:23.937014 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 05 12:58:23.959057 master-0 kubenswrapper[29936]: I1205 12:58:23.958962 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-fxz9v"] Dec 05 12:58:24.103969 master-0 kubenswrapper[29936]: I1205 12:58:24.103908 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c861238-86b8-4421-9ad0-54485c8faa9a-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-fxz9v\" (UID: \"4c861238-86b8-4421-9ad0-54485c8faa9a\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-fxz9v" Dec 05 12:58:24.104254 master-0 kubenswrapper[29936]: I1205 12:58:24.104013 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt6qg\" (UniqueName: \"kubernetes.io/projected/4c861238-86b8-4421-9ad0-54485c8faa9a-kube-api-access-nt6qg\") pod \"cert-manager-webhook-f4fb5df64-fxz9v\" (UID: \"4c861238-86b8-4421-9ad0-54485c8faa9a\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-fxz9v" Dec 05 12:58:24.164670 master-0 kubenswrapper[29936]: I1205 12:58:24.164604 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-dnmf8"] Dec 05 12:58:24.165804 master-0 kubenswrapper[29936]: I1205 12:58:24.165737 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-dnmf8" Dec 05 12:58:24.192420 master-0 kubenswrapper[29936]: I1205 12:58:24.191447 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-dnmf8"] Dec 05 12:58:24.206240 master-0 kubenswrapper[29936]: I1205 12:58:24.206140 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt6qg\" (UniqueName: \"kubernetes.io/projected/4c861238-86b8-4421-9ad0-54485c8faa9a-kube-api-access-nt6qg\") pod \"cert-manager-webhook-f4fb5df64-fxz9v\" (UID: \"4c861238-86b8-4421-9ad0-54485c8faa9a\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-fxz9v" Dec 05 12:58:24.206528 master-0 kubenswrapper[29936]: I1205 12:58:24.206289 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c861238-86b8-4421-9ad0-54485c8faa9a-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-fxz9v\" (UID: \"4c861238-86b8-4421-9ad0-54485c8faa9a\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-fxz9v" Dec 05 12:58:24.236255 master-0 kubenswrapper[29936]: I1205 12:58:24.236203 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt6qg\" (UniqueName: \"kubernetes.io/projected/4c861238-86b8-4421-9ad0-54485c8faa9a-kube-api-access-nt6qg\") pod \"cert-manager-webhook-f4fb5df64-fxz9v\" (UID: \"4c861238-86b8-4421-9ad0-54485c8faa9a\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-fxz9v" Dec 05 12:58:24.237743 master-0 kubenswrapper[29936]: I1205 12:58:24.237712 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c861238-86b8-4421-9ad0-54485c8faa9a-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-fxz9v\" (UID: \"4c861238-86b8-4421-9ad0-54485c8faa9a\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-fxz9v" Dec 05 12:58:24.240848 master-0 kubenswrapper[29936]: I1205 12:58:24.240818 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-fxz9v" Dec 05 12:58:24.314265 master-0 kubenswrapper[29936]: I1205 12:58:24.314004 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7skr7\" (UniqueName: \"kubernetes.io/projected/0878dccd-eef2-46f2-bf0c-91fdf0c258b6-kube-api-access-7skr7\") pod \"cert-manager-cainjector-855d9ccff4-dnmf8\" (UID: \"0878dccd-eef2-46f2-bf0c-91fdf0c258b6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-dnmf8" Dec 05 12:58:24.314265 master-0 kubenswrapper[29936]: I1205 12:58:24.314134 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0878dccd-eef2-46f2-bf0c-91fdf0c258b6-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-dnmf8\" (UID: \"0878dccd-eef2-46f2-bf0c-91fdf0c258b6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-dnmf8" Dec 05 12:58:24.416358 master-0 kubenswrapper[29936]: I1205 12:58:24.416286 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7skr7\" (UniqueName: \"kubernetes.io/projected/0878dccd-eef2-46f2-bf0c-91fdf0c258b6-kube-api-access-7skr7\") pod \"cert-manager-cainjector-855d9ccff4-dnmf8\" (UID: \"0878dccd-eef2-46f2-bf0c-91fdf0c258b6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-dnmf8" Dec 05 12:58:24.416593 master-0 kubenswrapper[29936]: I1205 12:58:24.416392 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0878dccd-eef2-46f2-bf0c-91fdf0c258b6-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-dnmf8\" (UID: \"0878dccd-eef2-46f2-bf0c-91fdf0c258b6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-dnmf8" Dec 05 12:58:24.451249 master-0 kubenswrapper[29936]: I1205 12:58:24.451119 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7skr7\" (UniqueName: \"kubernetes.io/projected/0878dccd-eef2-46f2-bf0c-91fdf0c258b6-kube-api-access-7skr7\") pod \"cert-manager-cainjector-855d9ccff4-dnmf8\" (UID: \"0878dccd-eef2-46f2-bf0c-91fdf0c258b6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-dnmf8" Dec 05 12:58:24.460214 master-0 kubenswrapper[29936]: I1205 12:58:24.452309 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0878dccd-eef2-46f2-bf0c-91fdf0c258b6-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-dnmf8\" (UID: \"0878dccd-eef2-46f2-bf0c-91fdf0c258b6\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-dnmf8" Dec 05 12:58:24.536354 master-0 kubenswrapper[29936]: I1205 12:58:24.536283 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-dnmf8" Dec 05 12:58:25.020114 master-0 kubenswrapper[29936]: I1205 12:58:25.014390 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-fxz9v"] Dec 05 12:58:25.204067 master-0 kubenswrapper[29936]: I1205 12:58:25.203973 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-dnmf8"] Dec 05 12:58:25.217225 master-0 kubenswrapper[29936]: W1205 12:58:25.216405 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0878dccd_eef2_46f2_bf0c_91fdf0c258b6.slice/crio-72859ef27205d9e1c9cbed238bc7f59b4522df5564af62914302253f4ba098ee WatchSource:0}: Error finding container 72859ef27205d9e1c9cbed238bc7f59b4522df5564af62914302253f4ba098ee: Status 404 returned error can't find the container with id 72859ef27205d9e1c9cbed238bc7f59b4522df5564af62914302253f4ba098ee Dec 05 12:58:25.338907 master-0 kubenswrapper[29936]: I1205 12:58:25.338755 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-fxz9v" event={"ID":"4c861238-86b8-4421-9ad0-54485c8faa9a","Type":"ContainerStarted","Data":"3a023c82d3ebdaa0b7aebe8dc94f5cf0d604985263c9813acd5525367f315e29"} Dec 05 12:58:25.341333 master-0 kubenswrapper[29936]: I1205 12:58:25.341296 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-dnmf8" event={"ID":"0878dccd-eef2-46f2-bf0c-91fdf0c258b6","Type":"ContainerStarted","Data":"72859ef27205d9e1c9cbed238bc7f59b4522df5564af62914302253f4ba098ee"} Dec 05 12:58:29.078916 master-0 kubenswrapper[29936]: I1205 12:58:29.078816 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-jz5qr"] Dec 05 12:58:29.082362 master-0 kubenswrapper[29936]: I1205 12:58:29.080158 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jz5qr" Dec 05 12:58:29.091249 master-0 kubenswrapper[29936]: I1205 12:58:29.084712 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 05 12:58:29.091249 master-0 kubenswrapper[29936]: I1205 12:58:29.086164 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 05 12:58:29.091249 master-0 kubenswrapper[29936]: I1205 12:58:29.088296 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z57nn\" (UniqueName: \"kubernetes.io/projected/40a43e8c-a2c1-4995-9859-0bb724868066-kube-api-access-z57nn\") pod \"obo-prometheus-operator-668cf9dfbb-jz5qr\" (UID: \"40a43e8c-a2c1-4995-9859-0bb724868066\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jz5qr" Dec 05 12:58:29.102252 master-0 kubenswrapper[29936]: I1205 12:58:29.099110 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-jz5qr"] Dec 05 12:58:29.195750 master-0 kubenswrapper[29936]: I1205 12:58:29.195655 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z57nn\" (UniqueName: \"kubernetes.io/projected/40a43e8c-a2c1-4995-9859-0bb724868066-kube-api-access-z57nn\") pod \"obo-prometheus-operator-668cf9dfbb-jz5qr\" (UID: \"40a43e8c-a2c1-4995-9859-0bb724868066\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jz5qr" Dec 05 12:58:29.215347 master-0 kubenswrapper[29936]: I1205 12:58:29.215277 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt"] Dec 05 12:58:29.219839 master-0 kubenswrapper[29936]: I1205 12:58:29.219795 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt" Dec 05 12:58:29.226136 master-0 kubenswrapper[29936]: I1205 12:58:29.226022 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 05 12:58:29.227542 master-0 kubenswrapper[29936]: I1205 12:58:29.227371 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt"] Dec 05 12:58:29.234355 master-0 kubenswrapper[29936]: I1205 12:58:29.234135 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t"] Dec 05 12:58:29.244026 master-0 kubenswrapper[29936]: I1205 12:58:29.235974 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t" Dec 05 12:58:29.246904 master-0 kubenswrapper[29936]: I1205 12:58:29.246825 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t"] Dec 05 12:58:29.256063 master-0 kubenswrapper[29936]: I1205 12:58:29.255747 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z57nn\" (UniqueName: \"kubernetes.io/projected/40a43e8c-a2c1-4995-9859-0bb724868066-kube-api-access-z57nn\") pod \"obo-prometheus-operator-668cf9dfbb-jz5qr\" (UID: \"40a43e8c-a2c1-4995-9859-0bb724868066\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jz5qr" Dec 05 12:58:29.399996 master-0 kubenswrapper[29936]: I1205 12:58:29.399293 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c64f9aa1-6aaf-4d61-9a35-78ed8a82ffa0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t\" (UID: \"c64f9aa1-6aaf-4d61-9a35-78ed8a82ffa0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t" Dec 05 12:58:29.399996 master-0 kubenswrapper[29936]: I1205 12:58:29.399909 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab164ebb-1ca8-476c-be7d-b9b4cd8394d7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt\" (UID: \"ab164ebb-1ca8-476c-be7d-b9b4cd8394d7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt" Dec 05 12:58:29.401553 master-0 kubenswrapper[29936]: I1205 12:58:29.401044 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-ttkdw"] Dec 05 12:58:29.402637 master-0 kubenswrapper[29936]: I1205 12:58:29.402498 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-ttkdw" Dec 05 12:58:29.405551 master-0 kubenswrapper[29936]: I1205 12:58:29.405355 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c64f9aa1-6aaf-4d61-9a35-78ed8a82ffa0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t\" (UID: \"c64f9aa1-6aaf-4d61-9a35-78ed8a82ffa0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t" Dec 05 12:58:29.405551 master-0 kubenswrapper[29936]: I1205 12:58:29.405428 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab164ebb-1ca8-476c-be7d-b9b4cd8394d7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt\" (UID: \"ab164ebb-1ca8-476c-be7d-b9b4cd8394d7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt" Dec 05 12:58:29.412102 master-0 kubenswrapper[29936]: I1205 12:58:29.412023 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 05 12:58:29.424326 master-0 kubenswrapper[29936]: I1205 12:58:29.424124 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-ttkdw"] Dec 05 12:58:29.441972 master-0 kubenswrapper[29936]: I1205 12:58:29.441129 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jz5qr" Dec 05 12:58:29.509262 master-0 kubenswrapper[29936]: I1205 12:58:29.509197 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab164ebb-1ca8-476c-be7d-b9b4cd8394d7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt\" (UID: \"ab164ebb-1ca8-476c-be7d-b9b4cd8394d7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt" Dec 05 12:58:29.509511 master-0 kubenswrapper[29936]: I1205 12:58:29.509304 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c64f9aa1-6aaf-4d61-9a35-78ed8a82ffa0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t\" (UID: \"c64f9aa1-6aaf-4d61-9a35-78ed8a82ffa0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t" Dec 05 12:58:29.509511 master-0 kubenswrapper[29936]: I1205 12:58:29.509322 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab164ebb-1ca8-476c-be7d-b9b4cd8394d7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt\" (UID: \"ab164ebb-1ca8-476c-be7d-b9b4cd8394d7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt" Dec 05 12:58:29.509652 master-0 kubenswrapper[29936]: I1205 12:58:29.509593 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts2h7\" (UniqueName: \"kubernetes.io/projected/0cbbc2e1-decb-4ec7-a9a7-2b7274726569-kube-api-access-ts2h7\") pod \"observability-operator-d8bb48f5d-ttkdw\" (UID: \"0cbbc2e1-decb-4ec7-a9a7-2b7274726569\") " pod="openshift-operators/observability-operator-d8bb48f5d-ttkdw" 
Dec 05 12:58:29.509820 master-0 kubenswrapper[29936]: I1205 12:58:29.509772 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c64f9aa1-6aaf-4d61-9a35-78ed8a82ffa0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t\" (UID: \"c64f9aa1-6aaf-4d61-9a35-78ed8a82ffa0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t" Dec 05 12:58:29.509912 master-0 kubenswrapper[29936]: I1205 12:58:29.509882 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0cbbc2e1-decb-4ec7-a9a7-2b7274726569-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-ttkdw\" (UID: \"0cbbc2e1-decb-4ec7-a9a7-2b7274726569\") " pod="openshift-operators/observability-operator-d8bb48f5d-ttkdw" Dec 05 12:58:29.514153 master-0 kubenswrapper[29936]: I1205 12:58:29.514111 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab164ebb-1ca8-476c-be7d-b9b4cd8394d7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt\" (UID: \"ab164ebb-1ca8-476c-be7d-b9b4cd8394d7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt" Dec 05 12:58:29.526259 master-0 kubenswrapper[29936]: I1205 12:58:29.526206 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c64f9aa1-6aaf-4d61-9a35-78ed8a82ffa0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t\" (UID: \"c64f9aa1-6aaf-4d61-9a35-78ed8a82ffa0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t" Dec 05 12:58:29.529863 master-0 kubenswrapper[29936]: I1205 12:58:29.529777 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c64f9aa1-6aaf-4d61-9a35-78ed8a82ffa0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t\" (UID: \"c64f9aa1-6aaf-4d61-9a35-78ed8a82ffa0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t" Dec 05 12:58:29.534796 master-0 kubenswrapper[29936]: I1205 12:58:29.534759 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab164ebb-1ca8-476c-be7d-b9b4cd8394d7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt\" (UID: \"ab164ebb-1ca8-476c-be7d-b9b4cd8394d7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt" Dec 05 12:58:29.560565 master-0 kubenswrapper[29936]: I1205 12:58:29.559516 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt" Dec 05 12:58:29.601211 master-0 kubenswrapper[29936]: I1205 12:58:29.601121 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-v5p9n"] Dec 05 12:58:29.603387 master-0 kubenswrapper[29936]: I1205 12:58:29.603365 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-v5p9n" Dec 05 12:58:29.617222 master-0 kubenswrapper[29936]: I1205 12:58:29.613410 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts2h7\" (UniqueName: \"kubernetes.io/projected/0cbbc2e1-decb-4ec7-a9a7-2b7274726569-kube-api-access-ts2h7\") pod \"observability-operator-d8bb48f5d-ttkdw\" (UID: \"0cbbc2e1-decb-4ec7-a9a7-2b7274726569\") " pod="openshift-operators/observability-operator-d8bb48f5d-ttkdw" Dec 05 12:58:29.617222 master-0 kubenswrapper[29936]: I1205 12:58:29.613518 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ad4ae453-3c35-4f05-8a63-b4b8eb70ed8c-openshift-service-ca\") pod \"perses-operator-5446b9c989-v5p9n\" (UID: \"ad4ae453-3c35-4f05-8a63-b4b8eb70ed8c\") " pod="openshift-operators/perses-operator-5446b9c989-v5p9n" Dec 05 12:58:29.617222 master-0 kubenswrapper[29936]: I1205 12:58:29.613569 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sklb5\" (UniqueName: \"kubernetes.io/projected/ad4ae453-3c35-4f05-8a63-b4b8eb70ed8c-kube-api-access-sklb5\") pod \"perses-operator-5446b9c989-v5p9n\" (UID: \"ad4ae453-3c35-4f05-8a63-b4b8eb70ed8c\") " pod="openshift-operators/perses-operator-5446b9c989-v5p9n" Dec 05 12:58:29.617222 master-0 kubenswrapper[29936]: I1205 12:58:29.613696 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0cbbc2e1-decb-4ec7-a9a7-2b7274726569-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-ttkdw\" (UID: \"0cbbc2e1-decb-4ec7-a9a7-2b7274726569\") " pod="openshift-operators/observability-operator-d8bb48f5d-ttkdw" Dec 05 12:58:29.631968 master-0 kubenswrapper[29936]: I1205 12:58:29.618131 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0cbbc2e1-decb-4ec7-a9a7-2b7274726569-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-ttkdw\" (UID: \"0cbbc2e1-decb-4ec7-a9a7-2b7274726569\") " pod="openshift-operators/observability-operator-d8bb48f5d-ttkdw" Dec 05 12:58:29.631968 master-0 kubenswrapper[29936]: I1205 12:58:29.626714 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t" Dec 05 12:58:29.639870 master-0 kubenswrapper[29936]: I1205 12:58:29.639818 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-v5p9n"] Dec 05 12:58:29.656096 master-0 kubenswrapper[29936]: I1205 12:58:29.656054 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts2h7\" (UniqueName: \"kubernetes.io/projected/0cbbc2e1-decb-4ec7-a9a7-2b7274726569-kube-api-access-ts2h7\") pod \"observability-operator-d8bb48f5d-ttkdw\" (UID: \"0cbbc2e1-decb-4ec7-a9a7-2b7274726569\") " pod="openshift-operators/observability-operator-d8bb48f5d-ttkdw" Dec 05 12:58:29.716211 master-0 kubenswrapper[29936]: I1205 12:58:29.716060 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ad4ae453-3c35-4f05-8a63-b4b8eb70ed8c-openshift-service-ca\") pod \"perses-operator-5446b9c989-v5p9n\" (UID: \"ad4ae453-3c35-4f05-8a63-b4b8eb70ed8c\") " pod="openshift-operators/perses-operator-5446b9c989-v5p9n" Dec 05 12:58:29.716211 master-0 kubenswrapper[29936]: I1205 12:58:29.716142 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sklb5\" (UniqueName: \"kubernetes.io/projected/ad4ae453-3c35-4f05-8a63-b4b8eb70ed8c-kube-api-access-sklb5\") pod \"perses-operator-5446b9c989-v5p9n\" (UID: \"ad4ae453-3c35-4f05-8a63-b4b8eb70ed8c\") " pod="openshift-operators/perses-operator-5446b9c989-v5p9n" Dec 05 12:58:29.717056 master-0 kubenswrapper[29936]: I1205 12:58:29.716983 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ad4ae453-3c35-4f05-8a63-b4b8eb70ed8c-openshift-service-ca\") pod \"perses-operator-5446b9c989-v5p9n\" (UID: \"ad4ae453-3c35-4f05-8a63-b4b8eb70ed8c\") " pod="openshift-operators/perses-operator-5446b9c989-v5p9n" Dec 05 12:58:29.736227 master-0 kubenswrapper[29936]: I1205 12:58:29.735903 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-ttkdw" Dec 05 12:58:29.760737 master-0 kubenswrapper[29936]: I1205 12:58:29.760676 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sklb5\" (UniqueName: \"kubernetes.io/projected/ad4ae453-3c35-4f05-8a63-b4b8eb70ed8c-kube-api-access-sklb5\") pod \"perses-operator-5446b9c989-v5p9n\" (UID: \"ad4ae453-3c35-4f05-8a63-b4b8eb70ed8c\") " pod="openshift-operators/perses-operator-5446b9c989-v5p9n" Dec 05 12:58:29.963574 master-0 kubenswrapper[29936]: I1205 12:58:29.963478 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-v5p9n" Dec 05 12:58:30.085442 master-0 kubenswrapper[29936]: I1205 12:58:30.071081 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-jz5qr"] Dec 05 12:58:30.085442 master-0 kubenswrapper[29936]: W1205 12:58:30.080800 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40a43e8c_a2c1_4995_9859_0bb724868066.slice/crio-fee7f808146e532d4c722894abf44c494d5ff54da8adc5025602753c6203d5bf WatchSource:0}: Error finding container fee7f808146e532d4c722894abf44c494d5ff54da8adc5025602753c6203d5bf: Status 404 returned error can't find the container with id fee7f808146e532d4c722894abf44c494d5ff54da8adc5025602753c6203d5bf Dec 05 12:58:30.194877 master-0 kubenswrapper[29936]: I1205 12:58:30.194721 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt"] Dec 05 12:58:30.434274 master-0 kubenswrapper[29936]: I1205 12:58:30.428023 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-ttkdw"] Dec 05 12:58:30.452353 master-0 kubenswrapper[29936]: I1205 12:58:30.450471 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t"] Dec 05 12:58:30.488800 master-0 kubenswrapper[29936]: I1205 12:58:30.488199 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt" event={"ID":"ab164ebb-1ca8-476c-be7d-b9b4cd8394d7","Type":"ContainerStarted","Data":"615c3d187840f70c46332c25bbfe5d06f53e446e538e6c73012ce2926f608db8"} Dec 05 12:58:30.492470 master-0 kubenswrapper[29936]: I1205 12:58:30.491117 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jz5qr" event={"ID":"40a43e8c-a2c1-4995-9859-0bb724868066","Type":"ContainerStarted","Data":"fee7f808146e532d4c722894abf44c494d5ff54da8adc5025602753c6203d5bf"} Dec 05 12:58:30.554704 master-0 kubenswrapper[29936]: I1205 12:58:30.554515 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-v5p9n"] Dec 05 12:58:30.560703 master-0 kubenswrapper[29936]: W1205 12:58:30.560508 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad4ae453_3c35_4f05_8a63_b4b8eb70ed8c.slice/crio-27de09636721c9db4cc4b0c5edf77c2d7846567a8c89dadd6a415269a0c79183 WatchSource:0}: Error finding container 27de09636721c9db4cc4b0c5edf77c2d7846567a8c89dadd6a415269a0c79183: Status 404 returned error can't find the container with id 27de09636721c9db4cc4b0c5edf77c2d7846567a8c89dadd6a415269a0c79183 Dec 05 12:58:31.501389 master-0 kubenswrapper[29936]: I1205 12:58:31.501288 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t" event={"ID":"c64f9aa1-6aaf-4d61-9a35-78ed8a82ffa0","Type":"ContainerStarted","Data":"37c84a9ac4295e9ea7d8c4a7de402596cf9e83184aa1ecca81434e58056b8b71"} Dec 05 12:58:31.503019 master-0 kubenswrapper[29936]: I1205 12:58:31.502973 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-v5p9n" 
event={"ID":"ad4ae453-3c35-4f05-8a63-b4b8eb70ed8c","Type":"ContainerStarted","Data":"27de09636721c9db4cc4b0c5edf77c2d7846567a8c89dadd6a415269a0c79183"} Dec 05 12:58:31.505066 master-0 kubenswrapper[29936]: I1205 12:58:31.504973 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-ttkdw" event={"ID":"0cbbc2e1-decb-4ec7-a9a7-2b7274726569","Type":"ContainerStarted","Data":"d83d89419ef154902552ec87100aa66691fa260a2db416b33361f3e495b049fd"} Dec 05 12:58:43.231012 master-0 kubenswrapper[29936]: I1205 12:58:43.230939 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-4qlq4"] Dec 05 12:58:43.233029 master-0 kubenswrapper[29936]: I1205 12:58:43.232979 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-4qlq4" Dec 05 12:58:43.254260 master-0 kubenswrapper[29936]: I1205 12:58:43.251706 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-4qlq4"] Dec 05 12:58:43.382679 master-0 kubenswrapper[29936]: I1205 12:58:43.382592 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ce07c634-ebd3-4d52-ae34-c2b4ebbaaede-bound-sa-token\") pod \"cert-manager-86cb77c54b-4qlq4\" (UID: \"ce07c634-ebd3-4d52-ae34-c2b4ebbaaede\") " pod="cert-manager/cert-manager-86cb77c54b-4qlq4" Dec 05 12:58:43.383027 master-0 kubenswrapper[29936]: I1205 12:58:43.382809 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56zz9\" (UniqueName: \"kubernetes.io/projected/ce07c634-ebd3-4d52-ae34-c2b4ebbaaede-kube-api-access-56zz9\") pod \"cert-manager-86cb77c54b-4qlq4\" (UID: \"ce07c634-ebd3-4d52-ae34-c2b4ebbaaede\") " pod="cert-manager/cert-manager-86cb77c54b-4qlq4" Dec 05 12:58:43.485122 master-0 kubenswrapper[29936]: I1205 12:58:43.484972 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ce07c634-ebd3-4d52-ae34-c2b4ebbaaede-bound-sa-token\") pod \"cert-manager-86cb77c54b-4qlq4\" (UID: \"ce07c634-ebd3-4d52-ae34-c2b4ebbaaede\") " pod="cert-manager/cert-manager-86cb77c54b-4qlq4" Dec 05 12:58:43.485122 master-0 kubenswrapper[29936]: I1205 12:58:43.485047 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56zz9\" (UniqueName: \"kubernetes.io/projected/ce07c634-ebd3-4d52-ae34-c2b4ebbaaede-kube-api-access-56zz9\") pod \"cert-manager-86cb77c54b-4qlq4\" (UID: \"ce07c634-ebd3-4d52-ae34-c2b4ebbaaede\") " pod="cert-manager/cert-manager-86cb77c54b-4qlq4" Dec 05 12:58:43.510348 master-0 kubenswrapper[29936]: I1205 12:58:43.510267 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56zz9\" (UniqueName: \"kubernetes.io/projected/ce07c634-ebd3-4d52-ae34-c2b4ebbaaede-kube-api-access-56zz9\") pod \"cert-manager-86cb77c54b-4qlq4\" (UID: \"ce07c634-ebd3-4d52-ae34-c2b4ebbaaede\") " pod="cert-manager/cert-manager-86cb77c54b-4qlq4" Dec 05 12:58:43.511816 master-0 kubenswrapper[29936]: I1205 12:58:43.511767 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ce07c634-ebd3-4d52-ae34-c2b4ebbaaede-bound-sa-token\") pod \"cert-manager-86cb77c54b-4qlq4\" (UID: \"ce07c634-ebd3-4d52-ae34-c2b4ebbaaede\") " 
pod="cert-manager/cert-manager-86cb77c54b-4qlq4" Dec 05 12:58:43.581294 master-0 kubenswrapper[29936]: I1205 12:58:43.581139 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-4qlq4" Dec 05 12:58:59.778617 master-0 kubenswrapper[29936]: I1205 12:58:59.778549 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jzqtv"] Dec 05 12:58:59.780568 master-0 kubenswrapper[29936]: I1205 12:58:59.780540 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:58:59.847532 master-0 kubenswrapper[29936]: I1205 12:58:59.847449 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jzqtv"] Dec 05 12:58:59.998508 master-0 kubenswrapper[29936]: I1205 12:58:59.998310 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a851e2-64e5-4de6-94c8-510993ac2a2b-utilities\") pod \"certified-operators-jzqtv\" (UID: \"44a851e2-64e5-4de6-94c8-510993ac2a2b\") " pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:58:59.998508 master-0 kubenswrapper[29936]: I1205 12:58:59.998436 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a851e2-64e5-4de6-94c8-510993ac2a2b-catalog-content\") pod \"certified-operators-jzqtv\" (UID: \"44a851e2-64e5-4de6-94c8-510993ac2a2b\") " pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:58:59.998508 master-0 kubenswrapper[29936]: I1205 12:58:59.998520 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmx7l\" (UniqueName: \"kubernetes.io/projected/44a851e2-64e5-4de6-94c8-510993ac2a2b-kube-api-access-gmx7l\") pod \"certified-operators-jzqtv\" (UID: \"44a851e2-64e5-4de6-94c8-510993ac2a2b\") " pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:59:00.100711 master-0 kubenswrapper[29936]: I1205 12:59:00.099959 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a851e2-64e5-4de6-94c8-510993ac2a2b-catalog-content\") pod \"certified-operators-jzqtv\" (UID: \"44a851e2-64e5-4de6-94c8-510993ac2a2b\") " pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:59:00.100711 master-0 kubenswrapper[29936]: I1205 12:59:00.100107 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmx7l\" (UniqueName: \"kubernetes.io/projected/44a851e2-64e5-4de6-94c8-510993ac2a2b-kube-api-access-gmx7l\") pod \"certified-operators-jzqtv\" (UID: \"44a851e2-64e5-4de6-94c8-510993ac2a2b\") " pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:59:00.100711 master-0 kubenswrapper[29936]: I1205 12:59:00.100164 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a851e2-64e5-4de6-94c8-510993ac2a2b-utilities\") pod \"certified-operators-jzqtv\" (UID: \"44a851e2-64e5-4de6-94c8-510993ac2a2b\") " pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:59:00.101028 master-0 kubenswrapper[29936]: I1205 12:59:00.100983 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/44a851e2-64e5-4de6-94c8-510993ac2a2b-utilities\") pod \"certified-operators-jzqtv\" (UID: \"44a851e2-64e5-4de6-94c8-510993ac2a2b\") " pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:59:00.101327 master-0 kubenswrapper[29936]: I1205 12:59:00.101302 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a851e2-64e5-4de6-94c8-510993ac2a2b-catalog-content\") pod \"certified-operators-jzqtv\" (UID: \"44a851e2-64e5-4de6-94c8-510993ac2a2b\") " pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:59:00.142683 master-0 kubenswrapper[29936]: I1205 12:59:00.135055 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmx7l\" (UniqueName: \"kubernetes.io/projected/44a851e2-64e5-4de6-94c8-510993ac2a2b-kube-api-access-gmx7l\") pod \"certified-operators-jzqtv\" (UID: \"44a851e2-64e5-4de6-94c8-510993ac2a2b\") " pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:59:00.350244 master-0 kubenswrapper[29936]: I1205 12:59:00.349086 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv"] Dec 05 12:59:00.367629 master-0 kubenswrapper[29936]: I1205 12:59:00.364140 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" Dec 05 12:59:00.376896 master-0 kubenswrapper[29936]: I1205 12:59:00.376832 29936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 05 12:59:00.377059 master-0 kubenswrapper[29936]: I1205 12:59:00.376998 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 05 12:59:00.377260 master-0 kubenswrapper[29936]: I1205 12:59:00.377230 29936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 05 12:59:00.377692 master-0 kubenswrapper[29936]: I1205 12:59:00.377655 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 05 12:59:00.399247 master-0 kubenswrapper[29936]: I1205 12:59:00.396442 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv"] Dec 05 12:59:00.437384 master-0 kubenswrapper[29936]: I1205 12:59:00.435827 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddbdx\" (UniqueName: \"kubernetes.io/projected/a9859597-6e73-4398-9adb-030bd647faa2-kube-api-access-ddbdx\") pod \"metallb-operator-controller-manager-57dccff46-h9ncv\" (UID: \"a9859597-6e73-4398-9adb-030bd647faa2\") " pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" Dec 05 12:59:00.437384 master-0 kubenswrapper[29936]: I1205 12:59:00.435912 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a9859597-6e73-4398-9adb-030bd647faa2-apiservice-cert\") pod \"metallb-operator-controller-manager-57dccff46-h9ncv\" (UID: \"a9859597-6e73-4398-9adb-030bd647faa2\") " pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" Dec 05 12:59:00.437384 master-0 kubenswrapper[29936]: I1205 12:59:00.435969 29936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a9859597-6e73-4398-9adb-030bd647faa2-webhook-cert\") pod \"metallb-operator-controller-manager-57dccff46-h9ncv\" (UID: \"a9859597-6e73-4398-9adb-030bd647faa2\") " pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" Dec 05 12:59:00.437384 master-0 kubenswrapper[29936]: I1205 12:59:00.436514 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:59:00.557257 master-0 kubenswrapper[29936]: I1205 12:59:00.547171 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddbdx\" (UniqueName: \"kubernetes.io/projected/a9859597-6e73-4398-9adb-030bd647faa2-kube-api-access-ddbdx\") pod \"metallb-operator-controller-manager-57dccff46-h9ncv\" (UID: \"a9859597-6e73-4398-9adb-030bd647faa2\") " pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" Dec 05 12:59:00.557257 master-0 kubenswrapper[29936]: I1205 12:59:00.547280 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a9859597-6e73-4398-9adb-030bd647faa2-apiservice-cert\") pod \"metallb-operator-controller-manager-57dccff46-h9ncv\" (UID: \"a9859597-6e73-4398-9adb-030bd647faa2\") " pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" Dec 05 12:59:00.557257 master-0 kubenswrapper[29936]: I1205 12:59:00.547336 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a9859597-6e73-4398-9adb-030bd647faa2-webhook-cert\") pod \"metallb-operator-controller-manager-57dccff46-h9ncv\" (UID: \"a9859597-6e73-4398-9adb-030bd647faa2\") " pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" Dec 05 12:59:00.557257 master-0 kubenswrapper[29936]: I1205 12:59:00.556805 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a9859597-6e73-4398-9adb-030bd647faa2-webhook-cert\") pod \"metallb-operator-controller-manager-57dccff46-h9ncv\" (UID: \"a9859597-6e73-4398-9adb-030bd647faa2\") " pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" Dec 05 12:59:00.567213 master-0 kubenswrapper[29936]: I1205 12:59:00.558996 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a9859597-6e73-4398-9adb-030bd647faa2-apiservice-cert\") pod \"metallb-operator-controller-manager-57dccff46-h9ncv\" (UID: \"a9859597-6e73-4398-9adb-030bd647faa2\") " pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" Dec 05 12:59:00.618212 master-0 kubenswrapper[29936]: I1205 12:59:00.602554 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddbdx\" (UniqueName: \"kubernetes.io/projected/a9859597-6e73-4398-9adb-030bd647faa2-kube-api-access-ddbdx\") pod \"metallb-operator-controller-manager-57dccff46-h9ncv\" (UID: \"a9859597-6e73-4398-9adb-030bd647faa2\") " pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" Dec 05 12:59:00.653207 master-0 kubenswrapper[29936]: I1205 12:59:00.650034 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-4qlq4"] Dec 05 12:59:00.858604 master-0 kubenswrapper[29936]: I1205 12:59:00.850629 29936 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" Dec 05 12:59:00.989207 master-0 kubenswrapper[29936]: I1205 12:59:00.979564 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-dnmf8" event={"ID":"0878dccd-eef2-46f2-bf0c-91fdf0c258b6","Type":"ContainerStarted","Data":"967e0669f8336a39c8b7c83bf1e2a92a8ba8c59293c26bbaacc4862317a59c40"} Dec 05 12:59:01.011584 master-0 kubenswrapper[29936]: I1205 12:59:01.011518 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t" event={"ID":"c64f9aa1-6aaf-4d61-9a35-78ed8a82ffa0","Type":"ContainerStarted","Data":"ad74ffdfa6347a89545346c26e61f60d323f332d39e8eb656f2630ab9129bfed"} Dec 05 12:59:01.048900 master-0 kubenswrapper[29936]: I1205 12:59:01.048820 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt" event={"ID":"ab164ebb-1ca8-476c-be7d-b9b4cd8394d7","Type":"ContainerStarted","Data":"35781495769ce34549233e1e18286116588bf44aa740f71bef3bdb3bead76111"} Dec 05 12:59:01.119277 master-0 kubenswrapper[29936]: I1205 12:59:01.112023 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jz5qr" event={"ID":"40a43e8c-a2c1-4995-9859-0bb724868066","Type":"ContainerStarted","Data":"831fe92a6d7a288d2004c5d41b61988949bfd8e3d2b746d4ebccd83c4c1920a7"} Dec 05 12:59:01.119277 master-0 kubenswrapper[29936]: I1205 12:59:01.115640 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-v5p9n" event={"ID":"ad4ae453-3c35-4f05-8a63-b4b8eb70ed8c","Type":"ContainerStarted","Data":"871f7026d650772b64bbfc6057e97bd27248eada6a527876147500cc495ba936"} Dec 05 12:59:01.119277 master-0 kubenswrapper[29936]: I1205 12:59:01.116457 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-v5p9n" Dec 05 12:59:01.149141 master-0 kubenswrapper[29936]: I1205 12:59:01.149093 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv"] Dec 05 12:59:01.151877 master-0 kubenswrapper[29936]: I1205 12:59:01.150519 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv" Dec 05 12:59:01.162467 master-0 kubenswrapper[29936]: I1205 12:59:01.153103 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-dnmf8" podStartSLOduration=2.286999932 podStartE2EDuration="37.153078231s" podCreationTimestamp="2025-12-05 12:58:24 +0000 UTC" firstStartedPulling="2025-12-05 12:58:25.220346579 +0000 UTC m=+502.352426270" lastFinishedPulling="2025-12-05 12:59:00.086424888 +0000 UTC m=+537.218504569" observedRunningTime="2025-12-05 12:59:01.020138757 +0000 UTC m=+538.152218438" watchObservedRunningTime="2025-12-05 12:59:01.153078231 +0000 UTC m=+538.285157912" Dec 05 12:59:01.170332 master-0 kubenswrapper[29936]: I1205 12:59:01.169400 29936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 12:59:01.170332 master-0 kubenswrapper[29936]: I1205 12:59:01.169914 29936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 05 12:59:01.171009 master-0 kubenswrapper[29936]: I1205 12:59:01.170967 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-4qlq4" event={"ID":"ce07c634-ebd3-4d52-ae34-c2b4ebbaaede","Type":"ContainerStarted","Data":"865a05984ef0355da79441059666eba8aded0ca733e81048878de50b79723678"} Dec 05 12:59:01.276150 master-0 kubenswrapper[29936]: I1205 12:59:01.249890 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-fxz9v" event={"ID":"4c861238-86b8-4421-9ad0-54485c8faa9a","Type":"ContainerStarted","Data":"664ecf7ff48b4eba393a49115056e6a2a1c254b71763b838521d6ca9fa2b0653"} Dec 05 12:59:01.276150 master-0 kubenswrapper[29936]: I1205 12:59:01.249954 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv"] Dec 05 12:59:01.276150 master-0 kubenswrapper[29936]: I1205 12:59:01.249986 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-fxz9v" Dec 05 12:59:01.276150 master-0 kubenswrapper[29936]: I1205 12:59:01.259406 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-ttkdw" event={"ID":"0cbbc2e1-decb-4ec7-a9a7-2b7274726569","Type":"ContainerStarted","Data":"3ddb00f9d9142518dc2d2c1a7f1f3fded536ca9ac5e19e6e2341623009bda269"} Dec 05 12:59:01.276150 master-0 kubenswrapper[29936]: I1205 12:59:01.260517 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-ttkdw" Dec 05 12:59:01.276150 master-0 kubenswrapper[29936]: I1205 12:59:01.268922 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/66f76a59-8a5b-4727-97f0-782a7151faa0-webhook-cert\") pod \"metallb-operator-webhook-server-54f4f6554c-rg9pv\" (UID: \"66f76a59-8a5b-4727-97f0-782a7151faa0\") " pod="metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv" Dec 05 12:59:01.276150 master-0 kubenswrapper[29936]: I1205 12:59:01.268981 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sptfk\" (UniqueName: \"kubernetes.io/projected/66f76a59-8a5b-4727-97f0-782a7151faa0-kube-api-access-sptfk\") pod 
\"metallb-operator-webhook-server-54f4f6554c-rg9pv\" (UID: \"66f76a59-8a5b-4727-97f0-782a7151faa0\") " pod="metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv" Dec 05 12:59:01.276150 master-0 kubenswrapper[29936]: I1205 12:59:01.269063 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/66f76a59-8a5b-4727-97f0-782a7151faa0-apiservice-cert\") pod \"metallb-operator-webhook-server-54f4f6554c-rg9pv\" (UID: \"66f76a59-8a5b-4727-97f0-782a7151faa0\") " pod="metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv" Dec 05 12:59:01.303644 master-0 kubenswrapper[29936]: I1205 12:59:01.291593 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-ttkdw" Dec 05 12:59:01.341802 master-0 kubenswrapper[29936]: I1205 12:59:01.336677 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-r7w2t" podStartSLOduration=2.735148857 podStartE2EDuration="32.336640004s" podCreationTimestamp="2025-12-05 12:58:29 +0000 UTC" firstStartedPulling="2025-12-05 12:58:30.485856946 +0000 UTC m=+507.617936627" lastFinishedPulling="2025-12-05 12:59:00.087348093 +0000 UTC m=+537.219427774" observedRunningTime="2025-12-05 12:59:01.085009375 +0000 UTC m=+538.217089056" watchObservedRunningTime="2025-12-05 12:59:01.336640004 +0000 UTC m=+538.468719685" Dec 05 12:59:01.357523 master-0 kubenswrapper[29936]: I1205 12:59:01.357405 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-86f84c677c-7lbvt" podStartSLOduration=2.645233647 podStartE2EDuration="32.357365039s" podCreationTimestamp="2025-12-05 12:58:29 +0000 UTC" firstStartedPulling="2025-12-05 12:58:30.219815025 +0000 UTC m=+507.351894706" lastFinishedPulling="2025-12-05 12:58:59.931946417 +0000 UTC m=+537.064026098" observedRunningTime="2025-12-05 12:59:01.122740483 +0000 UTC m=+538.254820174" watchObservedRunningTime="2025-12-05 12:59:01.357365039 +0000 UTC m=+538.489444720" Dec 05 12:59:01.375258 master-0 kubenswrapper[29936]: I1205 12:59:01.373374 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/66f76a59-8a5b-4727-97f0-782a7151faa0-apiservice-cert\") pod \"metallb-operator-webhook-server-54f4f6554c-rg9pv\" (UID: \"66f76a59-8a5b-4727-97f0-782a7151faa0\") " pod="metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv" Dec 05 12:59:01.375258 master-0 kubenswrapper[29936]: I1205 12:59:01.373553 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/66f76a59-8a5b-4727-97f0-782a7151faa0-webhook-cert\") pod \"metallb-operator-webhook-server-54f4f6554c-rg9pv\" (UID: \"66f76a59-8a5b-4727-97f0-782a7151faa0\") " pod="metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv" Dec 05 12:59:01.375258 master-0 kubenswrapper[29936]: I1205 12:59:01.373588 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sptfk\" (UniqueName: \"kubernetes.io/projected/66f76a59-8a5b-4727-97f0-782a7151faa0-kube-api-access-sptfk\") pod \"metallb-operator-webhook-server-54f4f6554c-rg9pv\" (UID: \"66f76a59-8a5b-4727-97f0-782a7151faa0\") " 
pod="metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv" Dec 05 12:59:01.384070 master-0 kubenswrapper[29936]: I1205 12:59:01.383696 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/66f76a59-8a5b-4727-97f0-782a7151faa0-webhook-cert\") pod \"metallb-operator-webhook-server-54f4f6554c-rg9pv\" (UID: \"66f76a59-8a5b-4727-97f0-782a7151faa0\") " pod="metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv" Dec 05 12:59:01.387906 master-0 kubenswrapper[29936]: I1205 12:59:01.387795 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-jz5qr" podStartSLOduration=2.534010546 podStartE2EDuration="32.387760618s" podCreationTimestamp="2025-12-05 12:58:29 +0000 UTC" firstStartedPulling="2025-12-05 12:58:30.083809548 +0000 UTC m=+507.215889229" lastFinishedPulling="2025-12-05 12:58:59.93755962 +0000 UTC m=+537.069639301" observedRunningTime="2025-12-05 12:59:01.167813523 +0000 UTC m=+538.299893204" watchObservedRunningTime="2025-12-05 12:59:01.387760618 +0000 UTC m=+538.519840299" Dec 05 12:59:01.392149 master-0 kubenswrapper[29936]: I1205 12:59:01.392089 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-v5p9n" podStartSLOduration=2.907067044 podStartE2EDuration="32.392074055s" podCreationTimestamp="2025-12-05 12:58:29 +0000 UTC" firstStartedPulling="2025-12-05 12:58:30.565313112 +0000 UTC m=+507.697392793" lastFinishedPulling="2025-12-05 12:59:00.050320123 +0000 UTC m=+537.182399804" observedRunningTime="2025-12-05 12:59:01.224773225 +0000 UTC m=+538.356852906" watchObservedRunningTime="2025-12-05 12:59:01.392074055 +0000 UTC m=+538.524153736" Dec 05 12:59:01.403019 master-0 kubenswrapper[29936]: I1205 12:59:01.402782 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/66f76a59-8a5b-4727-97f0-782a7151faa0-apiservice-cert\") pod \"metallb-operator-webhook-server-54f4f6554c-rg9pv\" (UID: \"66f76a59-8a5b-4727-97f0-782a7151faa0\") " pod="metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv" Dec 05 12:59:01.411414 master-0 kubenswrapper[29936]: I1205 12:59:01.411098 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sptfk\" (UniqueName: \"kubernetes.io/projected/66f76a59-8a5b-4727-97f0-782a7151faa0-kube-api-access-sptfk\") pod \"metallb-operator-webhook-server-54f4f6554c-rg9pv\" (UID: \"66f76a59-8a5b-4727-97f0-782a7151faa0\") " pod="metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv" Dec 05 12:59:01.418854 master-0 kubenswrapper[29936]: I1205 12:59:01.418605 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jzqtv"] Dec 05 12:59:01.426623 master-0 kubenswrapper[29936]: I1205 12:59:01.426509 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-fxz9v" podStartSLOduration=3.541724724 podStartE2EDuration="38.426474442s" podCreationTimestamp="2025-12-05 12:58:23 +0000 UTC" firstStartedPulling="2025-12-05 12:58:25.071874102 +0000 UTC m=+502.203953783" lastFinishedPulling="2025-12-05 12:58:59.95662382 +0000 UTC m=+537.088703501" observedRunningTime="2025-12-05 12:59:01.322889649 +0000 UTC m=+538.454969330" watchObservedRunningTime="2025-12-05 12:59:01.426474442 +0000 UTC m=+538.558554143" Dec 05 
Dec 05 12:59:01.481209 master-0 kubenswrapper[29936]: I1205 12:59:01.469078 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-ttkdw" podStartSLOduration=2.868602345 podStartE2EDuration="32.469048253s" podCreationTimestamp="2025-12-05 12:58:29 +0000 UTC" firstStartedPulling="2025-12-05 12:58:30.485577509 +0000 UTC m=+507.617657190" lastFinishedPulling="2025-12-05 12:59:00.086023417 +0000 UTC m=+537.218103098" observedRunningTime="2025-12-05 12:59:01.401694028 +0000 UTC m=+538.533773719" watchObservedRunningTime="2025-12-05 12:59:01.469048253 +0000 UTC m=+538.601127934" Dec 05 12:59:01.566893 master-0 kubenswrapper[29936]: I1205 12:59:01.566740 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv" Dec 05 12:59:01.670463 master-0 kubenswrapper[29936]: I1205 12:59:01.668202 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv"] Dec 05 12:59:02.167335 master-0 kubenswrapper[29936]: I1205 12:59:02.167156 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv"] Dec 05 12:59:02.177655 master-0 kubenswrapper[29936]: W1205 12:59:02.177550 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66f76a59_8a5b_4727_97f0_782a7151faa0.slice/crio-9aa1185dbf57dfc4c81fb9e5728fb90c9c8bdbdf2894e857287e1c3b76a4ac91 WatchSource:0}: Error finding container 9aa1185dbf57dfc4c81fb9e5728fb90c9c8bdbdf2894e857287e1c3b76a4ac91: Status 404 returned error can't find the container with id 9aa1185dbf57dfc4c81fb9e5728fb90c9c8bdbdf2894e857287e1c3b76a4ac91 Dec 05 12:59:02.292265 master-0 kubenswrapper[29936]: I1205 12:59:02.287916 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-4qlq4" event={"ID":"ce07c634-ebd3-4d52-ae34-c2b4ebbaaede","Type":"ContainerStarted","Data":"5550478773f090fd3ee3620dc5e0460314c5ef231bb288df66ee286b6d117005"} Dec 05 12:59:02.292265 master-0 kubenswrapper[29936]: I1205 12:59:02.289737 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" event={"ID":"a9859597-6e73-4398-9adb-030bd647faa2","Type":"ContainerStarted","Data":"68eb621cf214c4c1adfa6496efb6b5831865319b70ee374f4dd857a670451a9d"} Dec 05 12:59:02.292265 master-0 kubenswrapper[29936]: I1205 12:59:02.291904 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv" event={"ID":"66f76a59-8a5b-4727-97f0-782a7151faa0","Type":"ContainerStarted","Data":"9aa1185dbf57dfc4c81fb9e5728fb90c9c8bdbdf2894e857287e1c3b76a4ac91"} Dec 05 12:59:02.294970 master-0 kubenswrapper[29936]: I1205 12:59:02.294919 29936 generic.go:334] "Generic (PLEG): container finished" podID="44a851e2-64e5-4de6-94c8-510993ac2a2b" containerID="41e49377b23c9e995dbaccc76750853ee3c183a7133b17638c7c97c7c0bdef5a" exitCode=0 Dec 05 12:59:02.297137 master-0 kubenswrapper[29936]: I1205 12:59:02.296425 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzqtv" event={"ID":"44a851e2-64e5-4de6-94c8-510993ac2a2b","Type":"ContainerDied","Data":"41e49377b23c9e995dbaccc76750853ee3c183a7133b17638c7c97c7c0bdef5a"} Dec 05 12:59:02.297137 master-0 kubenswrapper[29936]: I1205
12:59:02.296503 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzqtv" event={"ID":"44a851e2-64e5-4de6-94c8-510993ac2a2b","Type":"ContainerStarted","Data":"04d4458701634cc29a0f6eb3296bdaa97f98addfefdc02d037b46948c1aa3f0d"} Dec 05 12:59:02.331168 master-0 kubenswrapper[29936]: I1205 12:59:02.324912 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-4qlq4" podStartSLOduration=19.32488689 podStartE2EDuration="19.32488689s" podCreationTimestamp="2025-12-05 12:58:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 12:59:02.320654325 +0000 UTC m=+539.452734006" watchObservedRunningTime="2025-12-05 12:59:02.32488689 +0000 UTC m=+539.456966571" Dec 05 12:59:05.402544 master-0 kubenswrapper[29936]: I1205 12:59:05.402360 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzqtv" event={"ID":"44a851e2-64e5-4de6-94c8-510993ac2a2b","Type":"ContainerStarted","Data":"c8358a5e22729cd40d138226f2d83bd43c5b7b8320a18aaa1ffa32fdbbed0b41"} Dec 05 12:59:06.413624 master-0 kubenswrapper[29936]: I1205 12:59:06.413502 29936 generic.go:334] "Generic (PLEG): container finished" podID="44a851e2-64e5-4de6-94c8-510993ac2a2b" containerID="c8358a5e22729cd40d138226f2d83bd43c5b7b8320a18aaa1ffa32fdbbed0b41" exitCode=0 Dec 05 12:59:06.413624 master-0 kubenswrapper[29936]: I1205 12:59:06.413591 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzqtv" event={"ID":"44a851e2-64e5-4de6-94c8-510993ac2a2b","Type":"ContainerDied","Data":"c8358a5e22729cd40d138226f2d83bd43c5b7b8320a18aaa1ffa32fdbbed0b41"} Dec 05 12:59:09.256592 master-0 kubenswrapper[29936]: I1205 12:59:09.256452 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-fxz9v" Dec 05 12:59:09.967696 master-0 kubenswrapper[29936]: I1205 12:59:09.967601 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-v5p9n" Dec 05 12:59:10.671201 master-0 kubenswrapper[29936]: I1205 12:59:10.670597 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4wpvs"] Dec 05 12:59:10.693205 master-0 kubenswrapper[29936]: I1205 12:59:10.692258 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:10.786783 master-0 kubenswrapper[29936]: I1205 12:59:10.786686 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/620b7762-43b9-4bf4-aae2-58d46090b954-utilities\") pod \"community-operators-4wpvs\" (UID: \"620b7762-43b9-4bf4-aae2-58d46090b954\") " pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:10.787047 master-0 kubenswrapper[29936]: I1205 12:59:10.786915 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgpjm\" (UniqueName: \"kubernetes.io/projected/620b7762-43b9-4bf4-aae2-58d46090b954-kube-api-access-qgpjm\") pod \"community-operators-4wpvs\" (UID: \"620b7762-43b9-4bf4-aae2-58d46090b954\") " pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:10.787370 master-0 kubenswrapper[29936]: I1205 12:59:10.787340 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/620b7762-43b9-4bf4-aae2-58d46090b954-catalog-content\") pod \"community-operators-4wpvs\" (UID: \"620b7762-43b9-4bf4-aae2-58d46090b954\") " pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:10.889663 master-0 kubenswrapper[29936]: I1205 12:59:10.889589 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgpjm\" (UniqueName: \"kubernetes.io/projected/620b7762-43b9-4bf4-aae2-58d46090b954-kube-api-access-qgpjm\") pod \"community-operators-4wpvs\" (UID: \"620b7762-43b9-4bf4-aae2-58d46090b954\") " pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:10.889932 master-0 kubenswrapper[29936]: I1205 12:59:10.889722 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/620b7762-43b9-4bf4-aae2-58d46090b954-catalog-content\") pod \"community-operators-4wpvs\" (UID: \"620b7762-43b9-4bf4-aae2-58d46090b954\") " pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:10.889932 master-0 kubenswrapper[29936]: I1205 12:59:10.889788 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/620b7762-43b9-4bf4-aae2-58d46090b954-utilities\") pod \"community-operators-4wpvs\" (UID: \"620b7762-43b9-4bf4-aae2-58d46090b954\") " pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:10.890837 master-0 kubenswrapper[29936]: I1205 12:59:10.890784 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/620b7762-43b9-4bf4-aae2-58d46090b954-catalog-content\") pod \"community-operators-4wpvs\" (UID: \"620b7762-43b9-4bf4-aae2-58d46090b954\") " pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:10.892040 master-0 kubenswrapper[29936]: I1205 12:59:10.892011 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/620b7762-43b9-4bf4-aae2-58d46090b954-utilities\") pod \"community-operators-4wpvs\" (UID: \"620b7762-43b9-4bf4-aae2-58d46090b954\") " pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:11.485372 master-0 kubenswrapper[29936]: I1205 12:59:11.484456 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-4wpvs"] Dec 05 12:59:11.676210 master-0 kubenswrapper[29936]: I1205 12:59:11.676014 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgpjm\" (UniqueName: \"kubernetes.io/projected/620b7762-43b9-4bf4-aae2-58d46090b954-kube-api-access-qgpjm\") pod \"community-operators-4wpvs\" (UID: \"620b7762-43b9-4bf4-aae2-58d46090b954\") " pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:11.955618 master-0 kubenswrapper[29936]: I1205 12:59:11.955510 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:14.952986 master-0 kubenswrapper[29936]: I1205 12:59:14.952139 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4wpvs"] Dec 05 12:59:15.641376 master-0 kubenswrapper[29936]: I1205 12:59:15.641297 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzqtv" event={"ID":"44a851e2-64e5-4de6-94c8-510993ac2a2b","Type":"ContainerStarted","Data":"078cd493b8535454207fcc51a7ef14373436c49bbc38e4dde2963fce7906d28f"} Dec 05 12:59:15.645330 master-0 kubenswrapper[29936]: I1205 12:59:15.645249 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" event={"ID":"a9859597-6e73-4398-9adb-030bd647faa2","Type":"ContainerStarted","Data":"ab9474d81952a0874267597bc69764d51fcc9e1e5e96fb8a9bcad6dc4216d10d"} Dec 05 12:59:15.645602 master-0 kubenswrapper[29936]: I1205 12:59:15.645498 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" Dec 05 12:59:15.655032 master-0 kubenswrapper[29936]: I1205 12:59:15.654959 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv" event={"ID":"66f76a59-8a5b-4727-97f0-782a7151faa0","Type":"ContainerStarted","Data":"06337f3339c767c2a25e423576d030a79e026da2b99877f097c18376e3e93f20"} Dec 05 12:59:15.656150 master-0 kubenswrapper[29936]: I1205 12:59:15.656107 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv" Dec 05 12:59:15.658915 master-0 kubenswrapper[29936]: I1205 12:59:15.658827 29936 generic.go:334] "Generic (PLEG): container finished" podID="620b7762-43b9-4bf4-aae2-58d46090b954" containerID="3795b9426a80c392bde8c572a4347c313eac1db734c25dbf9739d628e83b09b4" exitCode=0 Dec 05 12:59:15.659017 master-0 kubenswrapper[29936]: I1205 12:59:15.658910 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wpvs" event={"ID":"620b7762-43b9-4bf4-aae2-58d46090b954","Type":"ContainerDied","Data":"3795b9426a80c392bde8c572a4347c313eac1db734c25dbf9739d628e83b09b4"} Dec 05 12:59:15.659017 master-0 kubenswrapper[29936]: I1205 12:59:15.658954 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wpvs" event={"ID":"620b7762-43b9-4bf4-aae2-58d46090b954","Type":"ContainerStarted","Data":"399377c923dab9d052017746743f1b3a3ce8e7571a0b338f32ea6de847e55dec"} Dec 05 12:59:16.013627 master-0 kubenswrapper[29936]: I1205 12:59:16.013486 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jzqtv" podStartSLOduration=5.096261993 
podStartE2EDuration="17.013451187s" podCreationTimestamp="2025-12-05 12:58:59 +0000 UTC" firstStartedPulling="2025-12-05 12:59:02.298235564 +0000 UTC m=+539.430315245" lastFinishedPulling="2025-12-05 12:59:14.215424758 +0000 UTC m=+551.347504439" observedRunningTime="2025-12-05 12:59:16.006344692 +0000 UTC m=+553.138424383" watchObservedRunningTime="2025-12-05 12:59:16.013451187 +0000 UTC m=+553.145530868" Dec 05 12:59:16.143210 master-0 kubenswrapper[29936]: I1205 12:59:16.142499 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv" podStartSLOduration=4.097901586 podStartE2EDuration="16.142473763s" podCreationTimestamp="2025-12-05 12:59:00 +0000 UTC" firstStartedPulling="2025-12-05 12:59:02.180746841 +0000 UTC m=+539.312826522" lastFinishedPulling="2025-12-05 12:59:14.225319018 +0000 UTC m=+551.357398699" observedRunningTime="2025-12-05 12:59:16.137653732 +0000 UTC m=+553.269733433" watchObservedRunningTime="2025-12-05 12:59:16.142473763 +0000 UTC m=+553.274553444" Dec 05 12:59:16.181792 master-0 kubenswrapper[29936]: I1205 12:59:16.181501 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" podStartSLOduration=3.682302049 podStartE2EDuration="16.181473617s" podCreationTimestamp="2025-12-05 12:59:00 +0000 UTC" firstStartedPulling="2025-12-05 12:59:01.715745997 +0000 UTC m=+538.847825678" lastFinishedPulling="2025-12-05 12:59:14.214917565 +0000 UTC m=+551.346997246" observedRunningTime="2025-12-05 12:59:16.173329594 +0000 UTC m=+553.305409286" watchObservedRunningTime="2025-12-05 12:59:16.181473617 +0000 UTC m=+553.313553308" Dec 05 12:59:20.436949 master-0 kubenswrapper[29936]: I1205 12:59:20.436853 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:59:20.436949 master-0 kubenswrapper[29936]: I1205 12:59:20.436953 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:59:20.554809 master-0 kubenswrapper[29936]: I1205 12:59:20.554755 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:59:20.729450 master-0 kubenswrapper[29936]: I1205 12:59:20.728548 29936 generic.go:334] "Generic (PLEG): container finished" podID="620b7762-43b9-4bf4-aae2-58d46090b954" containerID="1665a7b1627d9aafa600db7ece5f80ab2f061862317e053d093cf5784b67ef0a" exitCode=0 Dec 05 12:59:20.729721 master-0 kubenswrapper[29936]: I1205 12:59:20.729470 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wpvs" event={"ID":"620b7762-43b9-4bf4-aae2-58d46090b954","Type":"ContainerDied","Data":"1665a7b1627d9aafa600db7ece5f80ab2f061862317e053d093cf5784b67ef0a"} Dec 05 12:59:20.798237 master-0 kubenswrapper[29936]: I1205 12:59:20.796101 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:59:21.747309 master-0 kubenswrapper[29936]: I1205 12:59:21.747161 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wpvs" event={"ID":"620b7762-43b9-4bf4-aae2-58d46090b954","Type":"ContainerStarted","Data":"8369ea1600be3a0ffbcd48abe07fd6bb0ccde18721ab930d60f59659bd1b11e8"} Dec 05 12:59:22.812217 master-0 kubenswrapper[29936]: 
I1205 12:59:22.811556 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4wpvs" podStartSLOduration=7.005232869 podStartE2EDuration="12.811522387s" podCreationTimestamp="2025-12-05 12:59:10 +0000 UTC" firstStartedPulling="2025-12-05 12:59:15.660945029 +0000 UTC m=+552.793024710" lastFinishedPulling="2025-12-05 12:59:21.467234547 +0000 UTC m=+558.599314228" observedRunningTime="2025-12-05 12:59:22.807109307 +0000 UTC m=+559.939188988" watchObservedRunningTime="2025-12-05 12:59:22.811522387 +0000 UTC m=+559.943602068" Dec 05 12:59:23.927209 master-0 kubenswrapper[29936]: I1205 12:59:23.927109 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jzqtv"] Dec 05 12:59:23.928339 master-0 kubenswrapper[29936]: I1205 12:59:23.928306 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jzqtv" podUID="44a851e2-64e5-4de6-94c8-510993ac2a2b" containerName="registry-server" containerID="cri-o://078cd493b8535454207fcc51a7ef14373436c49bbc38e4dde2963fce7906d28f" gracePeriod=2 Dec 05 12:59:24.497205 master-0 kubenswrapper[29936]: I1205 12:59:24.496702 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:59:24.700398 master-0 kubenswrapper[29936]: I1205 12:59:24.700299 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a851e2-64e5-4de6-94c8-510993ac2a2b-catalog-content\") pod \"44a851e2-64e5-4de6-94c8-510993ac2a2b\" (UID: \"44a851e2-64e5-4de6-94c8-510993ac2a2b\") " Dec 05 12:59:24.700703 master-0 kubenswrapper[29936]: I1205 12:59:24.700515 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmx7l\" (UniqueName: \"kubernetes.io/projected/44a851e2-64e5-4de6-94c8-510993ac2a2b-kube-api-access-gmx7l\") pod \"44a851e2-64e5-4de6-94c8-510993ac2a2b\" (UID: \"44a851e2-64e5-4de6-94c8-510993ac2a2b\") " Dec 05 12:59:24.700703 master-0 kubenswrapper[29936]: I1205 12:59:24.700556 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a851e2-64e5-4de6-94c8-510993ac2a2b-utilities\") pod \"44a851e2-64e5-4de6-94c8-510993ac2a2b\" (UID: \"44a851e2-64e5-4de6-94c8-510993ac2a2b\") " Dec 05 12:59:24.702095 master-0 kubenswrapper[29936]: I1205 12:59:24.702058 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44a851e2-64e5-4de6-94c8-510993ac2a2b-utilities" (OuterVolumeSpecName: "utilities") pod "44a851e2-64e5-4de6-94c8-510993ac2a2b" (UID: "44a851e2-64e5-4de6-94c8-510993ac2a2b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:59:24.704909 master-0 kubenswrapper[29936]: I1205 12:59:24.704845 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a851e2-64e5-4de6-94c8-510993ac2a2b-kube-api-access-gmx7l" (OuterVolumeSpecName: "kube-api-access-gmx7l") pod "44a851e2-64e5-4de6-94c8-510993ac2a2b" (UID: "44a851e2-64e5-4de6-94c8-510993ac2a2b"). InnerVolumeSpecName "kube-api-access-gmx7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:59:24.751700 master-0 kubenswrapper[29936]: I1205 12:59:24.751605 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44a851e2-64e5-4de6-94c8-510993ac2a2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44a851e2-64e5-4de6-94c8-510993ac2a2b" (UID: "44a851e2-64e5-4de6-94c8-510993ac2a2b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:59:24.792450 master-0 kubenswrapper[29936]: I1205 12:59:24.792377 29936 generic.go:334] "Generic (PLEG): container finished" podID="44a851e2-64e5-4de6-94c8-510993ac2a2b" containerID="078cd493b8535454207fcc51a7ef14373436c49bbc38e4dde2963fce7906d28f" exitCode=0 Dec 05 12:59:24.792450 master-0 kubenswrapper[29936]: I1205 12:59:24.792436 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzqtv" event={"ID":"44a851e2-64e5-4de6-94c8-510993ac2a2b","Type":"ContainerDied","Data":"078cd493b8535454207fcc51a7ef14373436c49bbc38e4dde2963fce7906d28f"} Dec 05 12:59:24.792769 master-0 kubenswrapper[29936]: I1205 12:59:24.792472 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzqtv" event={"ID":"44a851e2-64e5-4de6-94c8-510993ac2a2b","Type":"ContainerDied","Data":"04d4458701634cc29a0f6eb3296bdaa97f98addfefdc02d037b46948c1aa3f0d"} Dec 05 12:59:24.792769 master-0 kubenswrapper[29936]: I1205 12:59:24.792493 29936 scope.go:117] "RemoveContainer" containerID="078cd493b8535454207fcc51a7ef14373436c49bbc38e4dde2963fce7906d28f" Dec 05 12:59:24.792769 master-0 kubenswrapper[29936]: I1205 12:59:24.792643 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jzqtv" Dec 05 12:59:24.802447 master-0 kubenswrapper[29936]: I1205 12:59:24.802394 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a851e2-64e5-4de6-94c8-510993ac2a2b-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 12:59:24.802447 master-0 kubenswrapper[29936]: I1205 12:59:24.802434 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmx7l\" (UniqueName: \"kubernetes.io/projected/44a851e2-64e5-4de6-94c8-510993ac2a2b-kube-api-access-gmx7l\") on node \"master-0\" DevicePath \"\"" Dec 05 12:59:24.802447 master-0 kubenswrapper[29936]: I1205 12:59:24.802448 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a851e2-64e5-4de6-94c8-510993ac2a2b-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 12:59:24.825295 master-0 kubenswrapper[29936]: I1205 12:59:24.825192 29936 scope.go:117] "RemoveContainer" containerID="c8358a5e22729cd40d138226f2d83bd43c5b7b8320a18aaa1ffa32fdbbed0b41" Dec 05 12:59:24.847745 master-0 kubenswrapper[29936]: I1205 12:59:24.847649 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jzqtv"] Dec 05 12:59:24.850989 master-0 kubenswrapper[29936]: I1205 12:59:24.850751 29936 scope.go:117] "RemoveContainer" containerID="41e49377b23c9e995dbaccc76750853ee3c183a7133b17638c7c97c7c0bdef5a" Dec 05 12:59:24.861587 master-0 kubenswrapper[29936]: I1205 12:59:24.861534 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jzqtv"] Dec 05 12:59:24.886931 master-0 kubenswrapper[29936]: I1205 12:59:24.886885 29936 scope.go:117] "RemoveContainer" containerID="078cd493b8535454207fcc51a7ef14373436c49bbc38e4dde2963fce7906d28f" Dec 05 12:59:24.890161 master-0 kubenswrapper[29936]: E1205 12:59:24.890126 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"078cd493b8535454207fcc51a7ef14373436c49bbc38e4dde2963fce7906d28f\": container with ID starting with 078cd493b8535454207fcc51a7ef14373436c49bbc38e4dde2963fce7906d28f not found: ID does not exist" containerID="078cd493b8535454207fcc51a7ef14373436c49bbc38e4dde2963fce7906d28f" Dec 05 12:59:24.890275 master-0 kubenswrapper[29936]: I1205 12:59:24.890172 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"078cd493b8535454207fcc51a7ef14373436c49bbc38e4dde2963fce7906d28f"} err="failed to get container status \"078cd493b8535454207fcc51a7ef14373436c49bbc38e4dde2963fce7906d28f\": rpc error: code = NotFound desc = could not find container \"078cd493b8535454207fcc51a7ef14373436c49bbc38e4dde2963fce7906d28f\": container with ID starting with 078cd493b8535454207fcc51a7ef14373436c49bbc38e4dde2963fce7906d28f not found: ID does not exist" Dec 05 12:59:24.890275 master-0 kubenswrapper[29936]: I1205 12:59:24.890219 29936 scope.go:117] "RemoveContainer" containerID="c8358a5e22729cd40d138226f2d83bd43c5b7b8320a18aaa1ffa32fdbbed0b41" Dec 05 12:59:24.890615 master-0 kubenswrapper[29936]: E1205 12:59:24.890578 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8358a5e22729cd40d138226f2d83bd43c5b7b8320a18aaa1ffa32fdbbed0b41\": container with ID starting with c8358a5e22729cd40d138226f2d83bd43c5b7b8320a18aaa1ffa32fdbbed0b41 not 
found: ID does not exist" containerID="c8358a5e22729cd40d138226f2d83bd43c5b7b8320a18aaa1ffa32fdbbed0b41" Dec 05 12:59:24.890730 master-0 kubenswrapper[29936]: I1205 12:59:24.890704 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8358a5e22729cd40d138226f2d83bd43c5b7b8320a18aaa1ffa32fdbbed0b41"} err="failed to get container status \"c8358a5e22729cd40d138226f2d83bd43c5b7b8320a18aaa1ffa32fdbbed0b41\": rpc error: code = NotFound desc = could not find container \"c8358a5e22729cd40d138226f2d83bd43c5b7b8320a18aaa1ffa32fdbbed0b41\": container with ID starting with c8358a5e22729cd40d138226f2d83bd43c5b7b8320a18aaa1ffa32fdbbed0b41 not found: ID does not exist" Dec 05 12:59:24.890829 master-0 kubenswrapper[29936]: I1205 12:59:24.890813 29936 scope.go:117] "RemoveContainer" containerID="41e49377b23c9e995dbaccc76750853ee3c183a7133b17638c7c97c7c0bdef5a" Dec 05 12:59:24.891301 master-0 kubenswrapper[29936]: E1205 12:59:24.891234 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41e49377b23c9e995dbaccc76750853ee3c183a7133b17638c7c97c7c0bdef5a\": container with ID starting with 41e49377b23c9e995dbaccc76750853ee3c183a7133b17638c7c97c7c0bdef5a not found: ID does not exist" containerID="41e49377b23c9e995dbaccc76750853ee3c183a7133b17638c7c97c7c0bdef5a" Dec 05 12:59:24.891395 master-0 kubenswrapper[29936]: I1205 12:59:24.891316 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41e49377b23c9e995dbaccc76750853ee3c183a7133b17638c7c97c7c0bdef5a"} err="failed to get container status \"41e49377b23c9e995dbaccc76750853ee3c183a7133b17638c7c97c7c0bdef5a\": rpc error: code = NotFound desc = could not find container \"41e49377b23c9e995dbaccc76750853ee3c183a7133b17638c7c97c7c0bdef5a\": container with ID starting with 41e49377b23c9e995dbaccc76750853ee3c183a7133b17638c7c97c7c0bdef5a not found: ID does not exist" Dec 05 12:59:25.212231 master-0 kubenswrapper[29936]: I1205 12:59:25.212141 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44a851e2-64e5-4de6-94c8-510993ac2a2b" path="/var/lib/kubelet/pods/44a851e2-64e5-4de6-94c8-510993ac2a2b/volumes" Dec 05 12:59:31.576432 master-0 kubenswrapper[29936]: I1205 12:59:31.576375 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-54f4f6554c-rg9pv" Dec 05 12:59:31.956640 master-0 kubenswrapper[29936]: I1205 12:59:31.956489 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:31.957030 master-0 kubenswrapper[29936]: I1205 12:59:31.956678 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:32.022524 master-0 kubenswrapper[29936]: I1205 12:59:32.022418 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:32.906610 master-0 kubenswrapper[29936]: I1205 12:59:32.906509 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:33.008411 master-0 kubenswrapper[29936]: I1205 12:59:33.008321 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4wpvs"] Dec 05 12:59:34.881103 master-0 kubenswrapper[29936]: I1205 
12:59:34.880990 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4wpvs" podUID="620b7762-43b9-4bf4-aae2-58d46090b954" containerName="registry-server" containerID="cri-o://8369ea1600be3a0ffbcd48abe07fd6bb0ccde18721ab930d60f59659bd1b11e8" gracePeriod=2 Dec 05 12:59:35.360386 master-0 kubenswrapper[29936]: I1205 12:59:35.360350 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:35.433797 master-0 kubenswrapper[29936]: I1205 12:59:35.433561 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/620b7762-43b9-4bf4-aae2-58d46090b954-catalog-content\") pod \"620b7762-43b9-4bf4-aae2-58d46090b954\" (UID: \"620b7762-43b9-4bf4-aae2-58d46090b954\") " Dec 05 12:59:35.433797 master-0 kubenswrapper[29936]: I1205 12:59:35.433686 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgpjm\" (UniqueName: \"kubernetes.io/projected/620b7762-43b9-4bf4-aae2-58d46090b954-kube-api-access-qgpjm\") pod \"620b7762-43b9-4bf4-aae2-58d46090b954\" (UID: \"620b7762-43b9-4bf4-aae2-58d46090b954\") " Dec 05 12:59:35.435339 master-0 kubenswrapper[29936]: I1205 12:59:35.433858 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/620b7762-43b9-4bf4-aae2-58d46090b954-utilities\") pod \"620b7762-43b9-4bf4-aae2-58d46090b954\" (UID: \"620b7762-43b9-4bf4-aae2-58d46090b954\") " Dec 05 12:59:35.435339 master-0 kubenswrapper[29936]: I1205 12:59:35.435116 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/620b7762-43b9-4bf4-aae2-58d46090b954-utilities" (OuterVolumeSpecName: "utilities") pod "620b7762-43b9-4bf4-aae2-58d46090b954" (UID: "620b7762-43b9-4bf4-aae2-58d46090b954"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:59:35.437298 master-0 kubenswrapper[29936]: I1205 12:59:35.437231 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/620b7762-43b9-4bf4-aae2-58d46090b954-kube-api-access-qgpjm" (OuterVolumeSpecName: "kube-api-access-qgpjm") pod "620b7762-43b9-4bf4-aae2-58d46090b954" (UID: "620b7762-43b9-4bf4-aae2-58d46090b954"). InnerVolumeSpecName "kube-api-access-qgpjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 12:59:35.490087 master-0 kubenswrapper[29936]: I1205 12:59:35.489941 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/620b7762-43b9-4bf4-aae2-58d46090b954-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "620b7762-43b9-4bf4-aae2-58d46090b954" (UID: "620b7762-43b9-4bf4-aae2-58d46090b954"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 12:59:35.536970 master-0 kubenswrapper[29936]: I1205 12:59:35.536902 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/620b7762-43b9-4bf4-aae2-58d46090b954-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 12:59:35.536970 master-0 kubenswrapper[29936]: I1205 12:59:35.536958 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgpjm\" (UniqueName: \"kubernetes.io/projected/620b7762-43b9-4bf4-aae2-58d46090b954-kube-api-access-qgpjm\") on node \"master-0\" DevicePath \"\"" Dec 05 12:59:35.536970 master-0 kubenswrapper[29936]: I1205 12:59:35.536972 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/620b7762-43b9-4bf4-aae2-58d46090b954-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 12:59:35.897335 master-0 kubenswrapper[29936]: I1205 12:59:35.897174 29936 generic.go:334] "Generic (PLEG): container finished" podID="620b7762-43b9-4bf4-aae2-58d46090b954" containerID="8369ea1600be3a0ffbcd48abe07fd6bb0ccde18721ab930d60f59659bd1b11e8" exitCode=0 Dec 05 12:59:35.897335 master-0 kubenswrapper[29936]: I1205 12:59:35.897308 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wpvs" event={"ID":"620b7762-43b9-4bf4-aae2-58d46090b954","Type":"ContainerDied","Data":"8369ea1600be3a0ffbcd48abe07fd6bb0ccde18721ab930d60f59659bd1b11e8"} Dec 05 12:59:35.897335 master-0 kubenswrapper[29936]: I1205 12:59:35.897351 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4wpvs" Dec 05 12:59:35.898649 master-0 kubenswrapper[29936]: I1205 12:59:35.897378 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4wpvs" event={"ID":"620b7762-43b9-4bf4-aae2-58d46090b954","Type":"ContainerDied","Data":"399377c923dab9d052017746743f1b3a3ce8e7571a0b338f32ea6de847e55dec"} Dec 05 12:59:35.898649 master-0 kubenswrapper[29936]: I1205 12:59:35.897425 29936 scope.go:117] "RemoveContainer" containerID="8369ea1600be3a0ffbcd48abe07fd6bb0ccde18721ab930d60f59659bd1b11e8" Dec 05 12:59:35.925724 master-0 kubenswrapper[29936]: I1205 12:59:35.925522 29936 scope.go:117] "RemoveContainer" containerID="1665a7b1627d9aafa600db7ece5f80ab2f061862317e053d093cf5784b67ef0a" Dec 05 12:59:35.951495 master-0 kubenswrapper[29936]: I1205 12:59:35.951440 29936 scope.go:117] "RemoveContainer" containerID="3795b9426a80c392bde8c572a4347c313eac1db734c25dbf9739d628e83b09b4" Dec 05 12:59:35.983498 master-0 kubenswrapper[29936]: I1205 12:59:35.983422 29936 scope.go:117] "RemoveContainer" containerID="8369ea1600be3a0ffbcd48abe07fd6bb0ccde18721ab930d60f59659bd1b11e8" Dec 05 12:59:35.985006 master-0 kubenswrapper[29936]: E1205 12:59:35.984919 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8369ea1600be3a0ffbcd48abe07fd6bb0ccde18721ab930d60f59659bd1b11e8\": container with ID starting with 8369ea1600be3a0ffbcd48abe07fd6bb0ccde18721ab930d60f59659bd1b11e8 not found: ID does not exist" containerID="8369ea1600be3a0ffbcd48abe07fd6bb0ccde18721ab930d60f59659bd1b11e8" Dec 05 12:59:35.985112 master-0 kubenswrapper[29936]: I1205 12:59:35.985012 29936 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8369ea1600be3a0ffbcd48abe07fd6bb0ccde18721ab930d60f59659bd1b11e8"} err="failed to get container status \"8369ea1600be3a0ffbcd48abe07fd6bb0ccde18721ab930d60f59659bd1b11e8\": rpc error: code = NotFound desc = could not find container \"8369ea1600be3a0ffbcd48abe07fd6bb0ccde18721ab930d60f59659bd1b11e8\": container with ID starting with 8369ea1600be3a0ffbcd48abe07fd6bb0ccde18721ab930d60f59659bd1b11e8 not found: ID does not exist" Dec 05 12:59:35.985112 master-0 kubenswrapper[29936]: I1205 12:59:35.985059 29936 scope.go:117] "RemoveContainer" containerID="1665a7b1627d9aafa600db7ece5f80ab2f061862317e053d093cf5784b67ef0a" Dec 05 12:59:35.985788 master-0 kubenswrapper[29936]: E1205 12:59:35.985724 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1665a7b1627d9aafa600db7ece5f80ab2f061862317e053d093cf5784b67ef0a\": container with ID starting with 1665a7b1627d9aafa600db7ece5f80ab2f061862317e053d093cf5784b67ef0a not found: ID does not exist" containerID="1665a7b1627d9aafa600db7ece5f80ab2f061862317e053d093cf5784b67ef0a" Dec 05 12:59:35.985788 master-0 kubenswrapper[29936]: I1205 12:59:35.985764 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1665a7b1627d9aafa600db7ece5f80ab2f061862317e053d093cf5784b67ef0a"} err="failed to get container status \"1665a7b1627d9aafa600db7ece5f80ab2f061862317e053d093cf5784b67ef0a\": rpc error: code = NotFound desc = could not find container \"1665a7b1627d9aafa600db7ece5f80ab2f061862317e053d093cf5784b67ef0a\": container with ID starting with 1665a7b1627d9aafa600db7ece5f80ab2f061862317e053d093cf5784b67ef0a not found: ID does not exist" Dec 05 12:59:35.985788 master-0 kubenswrapper[29936]: I1205 12:59:35.985784 29936 scope.go:117] "RemoveContainer" containerID="3795b9426a80c392bde8c572a4347c313eac1db734c25dbf9739d628e83b09b4" Dec 05 12:59:35.986453 master-0 kubenswrapper[29936]: E1205 12:59:35.986397 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3795b9426a80c392bde8c572a4347c313eac1db734c25dbf9739d628e83b09b4\": container with ID starting with 3795b9426a80c392bde8c572a4347c313eac1db734c25dbf9739d628e83b09b4 not found: ID does not exist" containerID="3795b9426a80c392bde8c572a4347c313eac1db734c25dbf9739d628e83b09b4" Dec 05 12:59:35.986453 master-0 kubenswrapper[29936]: I1205 12:59:35.986428 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3795b9426a80c392bde8c572a4347c313eac1db734c25dbf9739d628e83b09b4"} err="failed to get container status \"3795b9426a80c392bde8c572a4347c313eac1db734c25dbf9739d628e83b09b4\": rpc error: code = NotFound desc = could not find container \"3795b9426a80c392bde8c572a4347c313eac1db734c25dbf9739d628e83b09b4\": container with ID starting with 3795b9426a80c392bde8c572a4347c313eac1db734c25dbf9739d628e83b09b4 not found: ID does not exist" Dec 05 12:59:35.999658 master-0 kubenswrapper[29936]: I1205 12:59:35.999578 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4wpvs"] Dec 05 12:59:36.008211 master-0 kubenswrapper[29936]: I1205 12:59:36.008109 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4wpvs"] Dec 05 12:59:37.195977 master-0 kubenswrapper[29936]: I1205 12:59:37.195899 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="620b7762-43b9-4bf4-aae2-58d46090b954" path="/var/lib/kubelet/pods/620b7762-43b9-4bf4-aae2-58d46090b954/volumes" Dec 05 12:59:50.854351 master-0 kubenswrapper[29936]: I1205 12:59:50.854254 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" Dec 05 12:59:56.435872 master-0 kubenswrapper[29936]: I1205 12:59:56.433664 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-59k6l"] Dec 05 12:59:56.435872 master-0 kubenswrapper[29936]: E1205 12:59:56.434209 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a851e2-64e5-4de6-94c8-510993ac2a2b" containerName="registry-server" Dec 05 12:59:56.435872 master-0 kubenswrapper[29936]: I1205 12:59:56.434229 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a851e2-64e5-4de6-94c8-510993ac2a2b" containerName="registry-server" Dec 05 12:59:56.435872 master-0 kubenswrapper[29936]: E1205 12:59:56.434267 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="620b7762-43b9-4bf4-aae2-58d46090b954" containerName="extract-content" Dec 05 12:59:56.435872 master-0 kubenswrapper[29936]: I1205 12:59:56.434276 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="620b7762-43b9-4bf4-aae2-58d46090b954" containerName="extract-content" Dec 05 12:59:56.435872 master-0 kubenswrapper[29936]: E1205 12:59:56.434297 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a851e2-64e5-4de6-94c8-510993ac2a2b" containerName="extract-utilities" Dec 05 12:59:56.435872 master-0 kubenswrapper[29936]: I1205 12:59:56.434303 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a851e2-64e5-4de6-94c8-510993ac2a2b" containerName="extract-utilities" Dec 05 12:59:56.435872 master-0 kubenswrapper[29936]: E1205 12:59:56.434329 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a851e2-64e5-4de6-94c8-510993ac2a2b" containerName="extract-content" Dec 05 12:59:56.435872 master-0 kubenswrapper[29936]: I1205 12:59:56.434334 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a851e2-64e5-4de6-94c8-510993ac2a2b" containerName="extract-content" Dec 05 12:59:56.435872 master-0 kubenswrapper[29936]: E1205 12:59:56.434350 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="620b7762-43b9-4bf4-aae2-58d46090b954" containerName="registry-server" Dec 05 12:59:56.435872 master-0 kubenswrapper[29936]: I1205 12:59:56.434355 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="620b7762-43b9-4bf4-aae2-58d46090b954" containerName="registry-server" Dec 05 12:59:56.435872 master-0 kubenswrapper[29936]: E1205 12:59:56.434363 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="620b7762-43b9-4bf4-aae2-58d46090b954" containerName="extract-utilities" Dec 05 12:59:56.435872 master-0 kubenswrapper[29936]: I1205 12:59:56.434369 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="620b7762-43b9-4bf4-aae2-58d46090b954" containerName="extract-utilities" Dec 05 12:59:56.435872 master-0 kubenswrapper[29936]: I1205 12:59:56.434557 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="620b7762-43b9-4bf4-aae2-58d46090b954" containerName="registry-server" Dec 05 12:59:56.435872 master-0 kubenswrapper[29936]: I1205 12:59:56.434579 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a851e2-64e5-4de6-94c8-510993ac2a2b" containerName="registry-server" Dec 05 12:59:56.435872 
Dec 05 12:59:56.435872 master-0 kubenswrapper[29936]: I1205 12:59:56.435255 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-59k6l" Dec 05 12:59:56.441201 master-0 kubenswrapper[29936]: I1205 12:59:56.438295 29936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 05 12:59:56.445522 master-0 kubenswrapper[29936]: I1205 12:59:56.442143 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-2fccf"] Dec 05 12:59:56.452341 master-0 kubenswrapper[29936]: I1205 12:59:56.449774 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-2fccf" Dec 05 12:59:56.456200 master-0 kubenswrapper[29936]: I1205 12:59:56.453278 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 05 12:59:56.456200 master-0 kubenswrapper[29936]: I1205 12:59:56.453492 29936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 05 12:59:56.478356 master-0 kubenswrapper[29936]: I1205 12:59:56.471488 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-59k6l"] Dec 05 12:59:56.558933 master-0 kubenswrapper[29936]: I1205 12:59:56.558309 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-jp5r2"] Dec 05 12:59:56.560274 master-0 kubenswrapper[29936]: I1205 12:59:56.559729 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jp5r2" Dec 05 12:59:56.564479 master-0 kubenswrapper[29936]: I1205 12:59:56.564429 29936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 05 12:59:56.564920 master-0 kubenswrapper[29936]: I1205 12:59:56.564486 29936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 05 12:59:56.565239 master-0 kubenswrapper[29936]: I1205 12:59:56.564575 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 05 12:59:56.568500 master-0 kubenswrapper[29936]: I1205 12:59:56.568393 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d54184ec-d869-498d-ae89-be4ca52e0087-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-59k6l\" (UID: \"d54184ec-d869-498d-ae89-be4ca52e0087\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-59k6l" Dec 05 12:59:56.568500 master-0 kubenswrapper[29936]: I1205 12:59:56.568450 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22d3af20-d89a-46a1-a8cc-82ca1b92e325-metrics-certs\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf" Dec 05 12:59:56.568500 master-0 kubenswrapper[29936]: I1205 12:59:56.568487 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkh95\" (UniqueName: \"kubernetes.io/projected/22d3af20-d89a-46a1-a8cc-82ca1b92e325-kube-api-access-zkh95\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf"
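Each volume in the entries above and below goes through the same reconciler progression, keyed by its UniqueName: VerifyControllerAttachedVolume started, then MountVolume started, then MountVolume.SetUp succeeded. The Go sketch below groups those three markers per UniqueName from raw journal lines; the regular expression only targets the escaped-quote style visible in this log, and the sample lines are trimmed copies of nearby entries, so it is an illustration rather than a general parser.

package main

import (
	"fmt"
	"regexp"
	"strings"
)

var (
	// UniqueName appears in the log as: (UniqueName: \"kubernetes.io/secret/...\")
	uniqueName = regexp.MustCompile(`UniqueName: \\"([^"\\]+)\\"`)
	stages     = []string{
		"VerifyControllerAttachedVolume started",
		"MountVolume started",
		"MountVolume.SetUp succeeded",
	}
)

func main() {
	// Trimmed copies of two nearby entries (assumed representative of the full lines).
	lines := []string{
		`reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d54184ec-d869-498d-ae89-be4ca52e0087-cert\") ..."`,
		`operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/22d3af20-d89a-46a1-a8cc-82ca1b92e325-frr-conf\") ..."`,
	}

	progress := map[string][]string{} // UniqueName -> stages observed so far
	for _, line := range lines {
		m := uniqueName.FindStringSubmatch(line)
		if m == nil {
			continue
		}
		for _, stage := range stages {
			if strings.Contains(line, stage) {
				progress[m[1]] = append(progress[m[1]], stage)
			}
		}
	}
	for vol, seen := range progress {
		fmt.Printf("%s: %s\n", vol, strings.Join(seen, " -> "))
	}
}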
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j4ps\" (UniqueName: \"kubernetes.io/projected/d54184ec-d869-498d-ae89-be4ca52e0087-kube-api-access-7j4ps\") pod \"frr-k8s-webhook-server-7fcb986d4-59k6l\" (UID: \"d54184ec-d869-498d-ae89-be4ca52e0087\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-59k6l" Dec 05 12:59:56.568719 master-0 kubenswrapper[29936]: I1205 12:59:56.568563 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/22d3af20-d89a-46a1-a8cc-82ca1b92e325-reloader\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf" Dec 05 12:59:56.568719 master-0 kubenswrapper[29936]: I1205 12:59:56.568585 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/22d3af20-d89a-46a1-a8cc-82ca1b92e325-frr-startup\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf" Dec 05 12:59:56.568719 master-0 kubenswrapper[29936]: I1205 12:59:56.568618 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/22d3af20-d89a-46a1-a8cc-82ca1b92e325-frr-conf\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf" Dec 05 12:59:56.568719 master-0 kubenswrapper[29936]: I1205 12:59:56.568638 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/22d3af20-d89a-46a1-a8cc-82ca1b92e325-frr-sockets\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf" Dec 05 12:59:56.568719 master-0 kubenswrapper[29936]: I1205 12:59:56.568674 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/22d3af20-d89a-46a1-a8cc-82ca1b92e325-metrics\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf" Dec 05 12:59:56.579191 master-0 kubenswrapper[29936]: I1205 12:59:56.579114 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-mz4ld"] Dec 05 12:59:56.581706 master-0 kubenswrapper[29936]: I1205 12:59:56.581660 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-mz4ld" Dec 05 12:59:56.589084 master-0 kubenswrapper[29936]: I1205 12:59:56.589030 29936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 05 12:59:56.600802 master-0 kubenswrapper[29936]: I1205 12:59:56.600717 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-mz4ld"] Dec 05 12:59:56.670911 master-0 kubenswrapper[29936]: I1205 12:59:56.670826 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/22d3af20-d89a-46a1-a8cc-82ca1b92e325-frr-conf\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf" Dec 05 12:59:56.670911 master-0 kubenswrapper[29936]: I1205 12:59:56.670910 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/22d3af20-d89a-46a1-a8cc-82ca1b92e325-frr-sockets\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf" Dec 05 12:59:56.671272 master-0 kubenswrapper[29936]: I1205 12:59:56.670970 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj479\" (UniqueName: \"kubernetes.io/projected/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-kube-api-access-lj479\") pod \"speaker-jp5r2\" (UID: \"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca\") " pod="metallb-system/speaker-jp5r2" Dec 05 12:59:56.671272 master-0 kubenswrapper[29936]: I1205 12:59:56.671046 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-memberlist\") pod \"speaker-jp5r2\" (UID: \"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca\") " pod="metallb-system/speaker-jp5r2" Dec 05 12:59:56.671272 master-0 kubenswrapper[29936]: I1205 12:59:56.671081 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/22d3af20-d89a-46a1-a8cc-82ca1b92e325-metrics\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf" Dec 05 12:59:56.671272 master-0 kubenswrapper[29936]: I1205 12:59:56.671113 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5sqs\" (UniqueName: \"kubernetes.io/projected/1561fb44-afba-4a07-9718-f2e2b80b5770-kube-api-access-g5sqs\") pod \"controller-f8648f98b-mz4ld\" (UID: \"1561fb44-afba-4a07-9718-f2e2b80b5770\") " pod="metallb-system/controller-f8648f98b-mz4ld" Dec 05 12:59:56.671272 master-0 kubenswrapper[29936]: I1205 12:59:56.671134 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1561fb44-afba-4a07-9718-f2e2b80b5770-cert\") pod \"controller-f8648f98b-mz4ld\" (UID: \"1561fb44-afba-4a07-9718-f2e2b80b5770\") " pod="metallb-system/controller-f8648f98b-mz4ld" Dec 05 12:59:56.671502 master-0 kubenswrapper[29936]: I1205 12:59:56.671465 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/22d3af20-d89a-46a1-a8cc-82ca1b92e325-frr-conf\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf" Dec 05 
12:59:56.671543 master-0 kubenswrapper[29936]: I1205 12:59:56.671492 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1561fb44-afba-4a07-9718-f2e2b80b5770-metrics-certs\") pod \"controller-f8648f98b-mz4ld\" (UID: \"1561fb44-afba-4a07-9718-f2e2b80b5770\") " pod="metallb-system/controller-f8648f98b-mz4ld" Dec 05 12:59:56.671682 master-0 kubenswrapper[29936]: I1205 12:59:56.671625 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/22d3af20-d89a-46a1-a8cc-82ca1b92e325-metrics\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf" Dec 05 12:59:56.671682 master-0 kubenswrapper[29936]: I1205 12:59:56.671472 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/22d3af20-d89a-46a1-a8cc-82ca1b92e325-frr-sockets\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf" Dec 05 12:59:56.671682 master-0 kubenswrapper[29936]: I1205 12:59:56.671667 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d54184ec-d869-498d-ae89-be4ca52e0087-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-59k6l\" (UID: \"d54184ec-d869-498d-ae89-be4ca52e0087\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-59k6l" Dec 05 12:59:56.671798 master-0 kubenswrapper[29936]: I1205 12:59:56.671744 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22d3af20-d89a-46a1-a8cc-82ca1b92e325-metrics-certs\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf" Dec 05 12:59:56.671869 master-0 kubenswrapper[29936]: I1205 12:59:56.671844 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkh95\" (UniqueName: \"kubernetes.io/projected/22d3af20-d89a-46a1-a8cc-82ca1b92e325-kube-api-access-zkh95\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf" Dec 05 12:59:56.671932 master-0 kubenswrapper[29936]: I1205 12:59:56.671910 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-metrics-certs\") pod \"speaker-jp5r2\" (UID: \"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca\") " pod="metallb-system/speaker-jp5r2" Dec 05 12:59:56.672007 master-0 kubenswrapper[29936]: I1205 12:59:56.671980 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j4ps\" (UniqueName: \"kubernetes.io/projected/d54184ec-d869-498d-ae89-be4ca52e0087-kube-api-access-7j4ps\") pod \"frr-k8s-webhook-server-7fcb986d4-59k6l\" (UID: \"d54184ec-d869-498d-ae89-be4ca52e0087\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-59k6l" Dec 05 12:59:56.672234 master-0 kubenswrapper[29936]: I1205 12:59:56.672208 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/22d3af20-d89a-46a1-a8cc-82ca1b92e325-reloader\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf" Dec 05 12:59:56.674377 master-0 kubenswrapper[29936]: 
I1205 12:59:56.672704 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/22d3af20-d89a-46a1-a8cc-82ca1b92e325-reloader\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf"
Dec 05 12:59:56.674377 master-0 kubenswrapper[29936]: I1205 12:59:56.672797 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-metallb-excludel2\") pod \"speaker-jp5r2\" (UID: \"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca\") " pod="metallb-system/speaker-jp5r2"
Dec 05 12:59:56.674377 master-0 kubenswrapper[29936]: I1205 12:59:56.672824 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/22d3af20-d89a-46a1-a8cc-82ca1b92e325-frr-startup\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf"
Dec 05 12:59:56.674377 master-0 kubenswrapper[29936]: I1205 12:59:56.673721 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/22d3af20-d89a-46a1-a8cc-82ca1b92e325-frr-startup\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf"
Dec 05 12:59:56.676851 master-0 kubenswrapper[29936]: I1205 12:59:56.676800 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d54184ec-d869-498d-ae89-be4ca52e0087-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-59k6l\" (UID: \"d54184ec-d869-498d-ae89-be4ca52e0087\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-59k6l"
Dec 05 12:59:56.678380 master-0 kubenswrapper[29936]: I1205 12:59:56.678343 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22d3af20-d89a-46a1-a8cc-82ca1b92e325-metrics-certs\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf"
Dec 05 12:59:56.697284 master-0 kubenswrapper[29936]: I1205 12:59:56.697215 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j4ps\" (UniqueName: \"kubernetes.io/projected/d54184ec-d869-498d-ae89-be4ca52e0087-kube-api-access-7j4ps\") pod \"frr-k8s-webhook-server-7fcb986d4-59k6l\" (UID: \"d54184ec-d869-498d-ae89-be4ca52e0087\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-59k6l"
Dec 05 12:59:56.698232 master-0 kubenswrapper[29936]: I1205 12:59:56.698200 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkh95\" (UniqueName: \"kubernetes.io/projected/22d3af20-d89a-46a1-a8cc-82ca1b92e325-kube-api-access-zkh95\") pod \"frr-k8s-2fccf\" (UID: \"22d3af20-d89a-46a1-a8cc-82ca1b92e325\") " pod="metallb-system/frr-k8s-2fccf"
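
The reconciler_common.go and operation_generator.go records above are the kubelet volume manager walking every volume declared by the new metallb-system pods through the same lifecycle: operationExecutor.VerifyControllerAttachedVolume, then operationExecutor.MountVolume started, then MountVolume.SetUp succeeded once the contents are in place for the pod. A minimal sketch, assuming the official Kubernetes Python client and a kubeconfig with read access to the metallb-system namespace, of listing the volumes a pod declares so they can be matched against these records (the pod name is taken from the log and will differ on another cluster):

    from kubernetes import client, config

    config.load_kube_config()  # or config.load_incluster_config() when run on-cluster
    v1 = client.CoreV1Api()

    pod = v1.read_namespaced_pod("frr-k8s-2fccf", "metallb-system")
    for vol in pod.spec.volumes:
        # Each name should correspond to one VerifyControllerAttachedVolume /
        # MountVolume.SetUp pair in the kubelet log (frr-conf, frr-sockets,
        # frr-startup, reloader, metrics, metrics-certs, kube-api-access-zkh95).
        print(vol.name)
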
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-59k6l" Dec 05 12:59:56.774834 master-0 kubenswrapper[29936]: I1205 12:59:56.774720 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1561fb44-afba-4a07-9718-f2e2b80b5770-metrics-certs\") pod \"controller-f8648f98b-mz4ld\" (UID: \"1561fb44-afba-4a07-9718-f2e2b80b5770\") " pod="metallb-system/controller-f8648f98b-mz4ld" Dec 05 12:59:56.774834 master-0 kubenswrapper[29936]: I1205 12:59:56.774852 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-metrics-certs\") pod \"speaker-jp5r2\" (UID: \"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca\") " pod="metallb-system/speaker-jp5r2" Dec 05 12:59:56.775330 master-0 kubenswrapper[29936]: I1205 12:59:56.774902 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-metallb-excludel2\") pod \"speaker-jp5r2\" (UID: \"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca\") " pod="metallb-system/speaker-jp5r2" Dec 05 12:59:56.775330 master-0 kubenswrapper[29936]: I1205 12:59:56.774953 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj479\" (UniqueName: \"kubernetes.io/projected/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-kube-api-access-lj479\") pod \"speaker-jp5r2\" (UID: \"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca\") " pod="metallb-system/speaker-jp5r2" Dec 05 12:59:56.775330 master-0 kubenswrapper[29936]: I1205 12:59:56.774975 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-memberlist\") pod \"speaker-jp5r2\" (UID: \"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca\") " pod="metallb-system/speaker-jp5r2" Dec 05 12:59:56.775330 master-0 kubenswrapper[29936]: I1205 12:59:56.775009 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5sqs\" (UniqueName: \"kubernetes.io/projected/1561fb44-afba-4a07-9718-f2e2b80b5770-kube-api-access-g5sqs\") pod \"controller-f8648f98b-mz4ld\" (UID: \"1561fb44-afba-4a07-9718-f2e2b80b5770\") " pod="metallb-system/controller-f8648f98b-mz4ld" Dec 05 12:59:56.775330 master-0 kubenswrapper[29936]: I1205 12:59:56.775028 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1561fb44-afba-4a07-9718-f2e2b80b5770-cert\") pod \"controller-f8648f98b-mz4ld\" (UID: \"1561fb44-afba-4a07-9718-f2e2b80b5770\") " pod="metallb-system/controller-f8648f98b-mz4ld" Dec 05 12:59:56.776387 master-0 kubenswrapper[29936]: E1205 12:59:56.776069 29936 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 12:59:56.776387 master-0 kubenswrapper[29936]: E1205 12:59:56.776140 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-memberlist podName:22b3d3f6-98cb-4dee-833e-21f5b0bc97ca nodeName:}" failed. No retries permitted until 2025-12-05 12:59:57.276115562 +0000 UTC m=+594.408195243 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-memberlist") pod "speaker-jp5r2" (UID: "22b3d3f6-98cb-4dee-833e-21f5b0bc97ca") : secret "metallb-memberlist" not found Dec 05 12:59:56.783335 master-0 kubenswrapper[29936]: I1205 12:59:56.780034 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-metallb-excludel2\") pod \"speaker-jp5r2\" (UID: \"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca\") " pod="metallb-system/speaker-jp5r2" Dec 05 12:59:56.783335 master-0 kubenswrapper[29936]: I1205 12:59:56.780101 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-metrics-certs\") pod \"speaker-jp5r2\" (UID: \"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca\") " pod="metallb-system/speaker-jp5r2" Dec 05 12:59:56.783335 master-0 kubenswrapper[29936]: I1205 12:59:56.781847 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1561fb44-afba-4a07-9718-f2e2b80b5770-metrics-certs\") pod \"controller-f8648f98b-mz4ld\" (UID: \"1561fb44-afba-4a07-9718-f2e2b80b5770\") " pod="metallb-system/controller-f8648f98b-mz4ld" Dec 05 12:59:56.784391 master-0 kubenswrapper[29936]: I1205 12:59:56.784135 29936 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 12:59:56.792514 master-0 kubenswrapper[29936]: I1205 12:59:56.792450 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1561fb44-afba-4a07-9718-f2e2b80b5770-cert\") pod \"controller-f8648f98b-mz4ld\" (UID: \"1561fb44-afba-4a07-9718-f2e2b80b5770\") " pod="metallb-system/controller-f8648f98b-mz4ld" Dec 05 12:59:56.801280 master-0 kubenswrapper[29936]: I1205 12:59:56.800566 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-2fccf" Dec 05 12:59:56.818455 master-0 kubenswrapper[29936]: I1205 12:59:56.811925 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5sqs\" (UniqueName: \"kubernetes.io/projected/1561fb44-afba-4a07-9718-f2e2b80b5770-kube-api-access-g5sqs\") pod \"controller-f8648f98b-mz4ld\" (UID: \"1561fb44-afba-4a07-9718-f2e2b80b5770\") " pod="metallb-system/controller-f8648f98b-mz4ld" Dec 05 12:59:56.818894 master-0 kubenswrapper[29936]: I1205 12:59:56.818717 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj479\" (UniqueName: \"kubernetes.io/projected/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-kube-api-access-lj479\") pod \"speaker-jp5r2\" (UID: \"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca\") " pod="metallb-system/speaker-jp5r2" Dec 05 12:59:56.938229 master-0 kubenswrapper[29936]: I1205 12:59:56.937897 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-mz4ld" Dec 05 12:59:57.120232 master-0 kubenswrapper[29936]: I1205 12:59:57.114062 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2fccf" event={"ID":"22d3af20-d89a-46a1-a8cc-82ca1b92e325","Type":"ContainerStarted","Data":"62f5ec3c0b0ddd6029f3a086fbbbfe6360b92d662795bc54341776b62b1b8e2e"} Dec 05 12:59:57.316499 master-0 kubenswrapper[29936]: W1205 12:59:57.316435 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd54184ec_d869_498d_ae89_be4ca52e0087.slice/crio-786319bab249d05539cd48c3b61ef57bea918145054d8376cb7d085dbd4b8397 WatchSource:0}: Error finding container 786319bab249d05539cd48c3b61ef57bea918145054d8376cb7d085dbd4b8397: Status 404 returned error can't find the container with id 786319bab249d05539cd48c3b61ef57bea918145054d8376cb7d085dbd4b8397 Dec 05 12:59:57.319547 master-0 kubenswrapper[29936]: I1205 12:59:57.319458 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-59k6l"] Dec 05 12:59:57.322757 master-0 kubenswrapper[29936]: I1205 12:59:57.322703 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-memberlist\") pod \"speaker-jp5r2\" (UID: \"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca\") " pod="metallb-system/speaker-jp5r2" Dec 05 12:59:57.322940 master-0 kubenswrapper[29936]: E1205 12:59:57.322902 29936 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 12:59:57.323057 master-0 kubenswrapper[29936]: E1205 12:59:57.323014 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-memberlist podName:22b3d3f6-98cb-4dee-833e-21f5b0bc97ca nodeName:}" failed. No retries permitted until 2025-12-05 12:59:58.322992837 +0000 UTC m=+595.455072518 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-memberlist") pod "speaker-jp5r2" (UID: "22b3d3f6-98cb-4dee-833e-21f5b0bc97ca") : secret "metallb-memberlist" not found
Dec 05 12:59:57.526536 master-0 kubenswrapper[29936]: W1205 12:59:57.526465 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1561fb44_afba_4a07_9718_f2e2b80b5770.slice/crio-ff9713dfe7f75851beb42e6c707fec0beaa4c21c693a9ab2096007b8ce9545a6 WatchSource:0}: Error finding container ff9713dfe7f75851beb42e6c707fec0beaa4c21c693a9ab2096007b8ce9545a6: Status 404 returned error can't find the container with id ff9713dfe7f75851beb42e6c707fec0beaa4c21c693a9ab2096007b8ce9545a6
Dec 05 12:59:57.527141 master-0 kubenswrapper[29936]: I1205 12:59:57.527083 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-mz4ld"]
Dec 05 12:59:58.123923 master-0 kubenswrapper[29936]: I1205 12:59:58.123835 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-59k6l" event={"ID":"d54184ec-d869-498d-ae89-be4ca52e0087","Type":"ContainerStarted","Data":"786319bab249d05539cd48c3b61ef57bea918145054d8376cb7d085dbd4b8397"}
Dec 05 12:59:58.125740 master-0 kubenswrapper[29936]: I1205 12:59:58.125703 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-mz4ld" event={"ID":"1561fb44-afba-4a07-9718-f2e2b80b5770","Type":"ContainerStarted","Data":"a7a29ed4a1f55733f38dafeea0b3b4c4418e16cd7820f206f9dc61ca6fc5a74f"}
Dec 05 12:59:58.125740 master-0 kubenswrapper[29936]: I1205 12:59:58.125736 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-mz4ld" event={"ID":"1561fb44-afba-4a07-9718-f2e2b80b5770","Type":"ContainerStarted","Data":"ff9713dfe7f75851beb42e6c707fec0beaa4c21c693a9ab2096007b8ce9545a6"}
Dec 05 12:59:58.338879 master-0 kubenswrapper[29936]: I1205 12:59:58.338764 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-memberlist\") pod \"speaker-jp5r2\" (UID: \"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca\") " pod="metallb-system/speaker-jp5r2"
Dec 05 12:59:58.346268 master-0 kubenswrapper[29936]: I1205 12:59:58.346204 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/22b3d3f6-98cb-4dee-833e-21f5b0bc97ca-memberlist\") pod \"speaker-jp5r2\" (UID: \"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca\") " pod="metallb-system/speaker-jp5r2"
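
The memberlist failure above and its recovery show the kubelet's per-operation retry behaviour: the Secret mount for speaker-jp5r2 could not complete because metallb-memberlist did not exist yet, nestedpendingoperations blocked the retry for 500ms and then for 1s (durationBeforeRetry grows after each consecutive failure), and the mount succeeded at 12:59:58.346 once the Secret had been created, with no pod restart needed. A rough sketch of that pattern; the initial delay, growth factor and cap below are illustrative assumptions, not values read from this cluster's configuration:

    import time

    def mount_with_backoff(mount_fn, initial=0.5, factor=2.0, cap=120.0):
        # Retry a mount operation, growing the wait after each failure.
        delay = initial
        while True:
            try:
                return mount_fn()  # e.g. fetch the Secret and write it into the volume dir
            except Exception as err:  # here: secret "metallb-memberlist" not found
                print(f"mount failed ({err}); no retries permitted for {delay}s")
                time.sleep(delay)  # the durationBeforeRetry seen in the log
                delay = min(delay * factor, cap)  # 0.5s -> 1s -> 2s -> ...

    # Hypothetical usage: mount_with_backoff(lambda: fetch_secret("metallb-memberlist")),
    # where fetch_secret stands in for the real SetUp call.
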
Need to start a new one" pod="metallb-system/speaker-jp5r2" Dec 05 12:59:58.472521 master-0 kubenswrapper[29936]: W1205 12:59:58.472427 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22b3d3f6_98cb_4dee_833e_21f5b0bc97ca.slice/crio-3b0f150dbf97abecd2663e450957e26d49566e5c52603651a5aeb7a620304f95 WatchSource:0}: Error finding container 3b0f150dbf97abecd2663e450957e26d49566e5c52603651a5aeb7a620304f95: Status 404 returned error can't find the container with id 3b0f150dbf97abecd2663e450957e26d49566e5c52603651a5aeb7a620304f95 Dec 05 12:59:59.030470 master-0 kubenswrapper[29936]: I1205 12:59:59.030340 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k6dxn"] Dec 05 12:59:59.036317 master-0 kubenswrapper[29936]: I1205 12:59:59.032387 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k6dxn" Dec 05 12:59:59.046058 master-0 kubenswrapper[29936]: I1205 12:59:59.045998 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-whhk5"] Dec 05 12:59:59.047993 master-0 kubenswrapper[29936]: I1205 12:59:59.047951 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-whhk5" Dec 05 12:59:59.053207 master-0 kubenswrapper[29936]: I1205 12:59:59.052549 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 05 12:59:59.071625 master-0 kubenswrapper[29936]: I1205 12:59:59.069825 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-797p2"] Dec 05 12:59:59.071625 master-0 kubenswrapper[29936]: I1205 12:59:59.071287 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-797p2" Dec 05 12:59:59.113138 master-0 kubenswrapper[29936]: I1205 12:59:59.112355 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-whhk5"] Dec 05 12:59:59.163210 master-0 kubenswrapper[29936]: I1205 12:59:59.159422 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e8fbd044-a3c2-4968-9522-90877e268aae-nmstate-lock\") pod \"nmstate-handler-797p2\" (UID: \"e8fbd044-a3c2-4968-9522-90877e268aae\") " pod="openshift-nmstate/nmstate-handler-797p2" Dec 05 12:59:59.163210 master-0 kubenswrapper[29936]: I1205 12:59:59.159485 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsfzt\" (UniqueName: \"kubernetes.io/projected/430db01e-5ffc-4ef9-9b71-35b825a86bde-kube-api-access-lsfzt\") pod \"nmstate-webhook-5f6d4c5ccb-k6dxn\" (UID: \"430db01e-5ffc-4ef9-9b71-35b825a86bde\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k6dxn" Dec 05 12:59:59.163210 master-0 kubenswrapper[29936]: I1205 12:59:59.159529 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcfpf\" (UniqueName: \"kubernetes.io/projected/e8fbd044-a3c2-4968-9522-90877e268aae-kube-api-access-fcfpf\") pod \"nmstate-handler-797p2\" (UID: \"e8fbd044-a3c2-4968-9522-90877e268aae\") " pod="openshift-nmstate/nmstate-handler-797p2" Dec 05 12:59:59.163210 master-0 kubenswrapper[29936]: I1205 12:59:59.159571 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e8fbd044-a3c2-4968-9522-90877e268aae-dbus-socket\") pod \"nmstate-handler-797p2\" (UID: \"e8fbd044-a3c2-4968-9522-90877e268aae\") " pod="openshift-nmstate/nmstate-handler-797p2" Dec 05 12:59:59.163210 master-0 kubenswrapper[29936]: I1205 12:59:59.159605 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqh5w\" (UniqueName: \"kubernetes.io/projected/a8e8677b-4d23-49eb-ad0d-5b34cecdd56d-kube-api-access-zqh5w\") pod \"nmstate-metrics-7f946cbc9-whhk5\" (UID: \"a8e8677b-4d23-49eb-ad0d-5b34cecdd56d\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-whhk5" Dec 05 12:59:59.163210 master-0 kubenswrapper[29936]: I1205 12:59:59.159632 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/430db01e-5ffc-4ef9-9b71-35b825a86bde-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-k6dxn\" (UID: \"430db01e-5ffc-4ef9-9b71-35b825a86bde\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k6dxn" Dec 05 12:59:59.163210 master-0 kubenswrapper[29936]: I1205 12:59:59.159676 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e8fbd044-a3c2-4968-9522-90877e268aae-ovs-socket\") pod \"nmstate-handler-797p2\" (UID: \"e8fbd044-a3c2-4968-9522-90877e268aae\") " pod="openshift-nmstate/nmstate-handler-797p2" Dec 05 12:59:59.172459 master-0 kubenswrapper[29936]: I1205 12:59:59.172400 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jp5r2" 
event={"ID":"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca","Type":"ContainerStarted","Data":"191daf04cbb22ec65c5121f35ec5216e814e4da5cee756a0f9433453d3a53fb5"} Dec 05 12:59:59.172459 master-0 kubenswrapper[29936]: I1205 12:59:59.172468 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jp5r2" event={"ID":"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca","Type":"ContainerStarted","Data":"3b0f150dbf97abecd2663e450957e26d49566e5c52603651a5aeb7a620304f95"} Dec 05 12:59:59.186326 master-0 kubenswrapper[29936]: I1205 12:59:59.182503 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k6dxn"] Dec 05 12:59:59.269690 master-0 kubenswrapper[29936]: I1205 12:59:59.269471 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcfpf\" (UniqueName: \"kubernetes.io/projected/e8fbd044-a3c2-4968-9522-90877e268aae-kube-api-access-fcfpf\") pod \"nmstate-handler-797p2\" (UID: \"e8fbd044-a3c2-4968-9522-90877e268aae\") " pod="openshift-nmstate/nmstate-handler-797p2" Dec 05 12:59:59.269942 master-0 kubenswrapper[29936]: I1205 12:59:59.269790 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e8fbd044-a3c2-4968-9522-90877e268aae-dbus-socket\") pod \"nmstate-handler-797p2\" (UID: \"e8fbd044-a3c2-4968-9522-90877e268aae\") " pod="openshift-nmstate/nmstate-handler-797p2" Dec 05 12:59:59.269942 master-0 kubenswrapper[29936]: I1205 12:59:59.269924 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqh5w\" (UniqueName: \"kubernetes.io/projected/a8e8677b-4d23-49eb-ad0d-5b34cecdd56d-kube-api-access-zqh5w\") pod \"nmstate-metrics-7f946cbc9-whhk5\" (UID: \"a8e8677b-4d23-49eb-ad0d-5b34cecdd56d\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-whhk5" Dec 05 12:59:59.270054 master-0 kubenswrapper[29936]: I1205 12:59:59.270000 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/430db01e-5ffc-4ef9-9b71-35b825a86bde-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-k6dxn\" (UID: \"430db01e-5ffc-4ef9-9b71-35b825a86bde\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k6dxn" Dec 05 12:59:59.271631 master-0 kubenswrapper[29936]: I1205 12:59:59.270155 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e8fbd044-a3c2-4968-9522-90877e268aae-dbus-socket\") pod \"nmstate-handler-797p2\" (UID: \"e8fbd044-a3c2-4968-9522-90877e268aae\") " pod="openshift-nmstate/nmstate-handler-797p2" Dec 05 12:59:59.271631 master-0 kubenswrapper[29936]: I1205 12:59:59.270350 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e8fbd044-a3c2-4968-9522-90877e268aae-ovs-socket\") pod \"nmstate-handler-797p2\" (UID: \"e8fbd044-a3c2-4968-9522-90877e268aae\") " pod="openshift-nmstate/nmstate-handler-797p2" Dec 05 12:59:59.271631 master-0 kubenswrapper[29936]: I1205 12:59:59.270381 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e8fbd044-a3c2-4968-9522-90877e268aae-nmstate-lock\") pod \"nmstate-handler-797p2\" (UID: \"e8fbd044-a3c2-4968-9522-90877e268aae\") " pod="openshift-nmstate/nmstate-handler-797p2" Dec 05 12:59:59.271631 master-0 kubenswrapper[29936]: I1205 12:59:59.270467 29936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsfzt\" (UniqueName: \"kubernetes.io/projected/430db01e-5ffc-4ef9-9b71-35b825a86bde-kube-api-access-lsfzt\") pod \"nmstate-webhook-5f6d4c5ccb-k6dxn\" (UID: \"430db01e-5ffc-4ef9-9b71-35b825a86bde\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k6dxn" Dec 05 12:59:59.271631 master-0 kubenswrapper[29936]: I1205 12:59:59.271141 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e8fbd044-a3c2-4968-9522-90877e268aae-ovs-socket\") pod \"nmstate-handler-797p2\" (UID: \"e8fbd044-a3c2-4968-9522-90877e268aae\") " pod="openshift-nmstate/nmstate-handler-797p2" Dec 05 12:59:59.271631 master-0 kubenswrapper[29936]: I1205 12:59:59.271331 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e8fbd044-a3c2-4968-9522-90877e268aae-nmstate-lock\") pod \"nmstate-handler-797p2\" (UID: \"e8fbd044-a3c2-4968-9522-90877e268aae\") " pod="openshift-nmstate/nmstate-handler-797p2" Dec 05 12:59:59.286597 master-0 kubenswrapper[29936]: I1205 12:59:59.279513 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/430db01e-5ffc-4ef9-9b71-35b825a86bde-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-k6dxn\" (UID: \"430db01e-5ffc-4ef9-9b71-35b825a86bde\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k6dxn" Dec 05 12:59:59.295240 master-0 kubenswrapper[29936]: I1205 12:59:59.295023 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcfpf\" (UniqueName: \"kubernetes.io/projected/e8fbd044-a3c2-4968-9522-90877e268aae-kube-api-access-fcfpf\") pod \"nmstate-handler-797p2\" (UID: \"e8fbd044-a3c2-4968-9522-90877e268aae\") " pod="openshift-nmstate/nmstate-handler-797p2" Dec 05 12:59:59.302064 master-0 kubenswrapper[29936]: I1205 12:59:59.302017 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsfzt\" (UniqueName: \"kubernetes.io/projected/430db01e-5ffc-4ef9-9b71-35b825a86bde-kube-api-access-lsfzt\") pod \"nmstate-webhook-5f6d4c5ccb-k6dxn\" (UID: \"430db01e-5ffc-4ef9-9b71-35b825a86bde\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k6dxn" Dec 05 12:59:59.302439 master-0 kubenswrapper[29936]: I1205 12:59:59.302410 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqh5w\" (UniqueName: \"kubernetes.io/projected/a8e8677b-4d23-49eb-ad0d-5b34cecdd56d-kube-api-access-zqh5w\") pod \"nmstate-metrics-7f946cbc9-whhk5\" (UID: \"a8e8677b-4d23-49eb-ad0d-5b34cecdd56d\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-whhk5" Dec 05 12:59:59.310816 master-0 kubenswrapper[29936]: I1205 12:59:59.310746 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29"] Dec 05 12:59:59.315447 master-0 kubenswrapper[29936]: I1205 12:59:59.315403 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29" Dec 05 12:59:59.320475 master-0 kubenswrapper[29936]: I1205 12:59:59.320438 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 05 12:59:59.320614 master-0 kubenswrapper[29936]: I1205 12:59:59.320591 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 05 12:59:59.324359 master-0 kubenswrapper[29936]: I1205 12:59:59.324297 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29"] Dec 05 12:59:59.427950 master-0 kubenswrapper[29936]: I1205 12:59:59.427624 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k6dxn" Dec 05 12:59:59.457501 master-0 kubenswrapper[29936]: I1205 12:59:59.457437 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-whhk5" Dec 05 12:59:59.494717 master-0 kubenswrapper[29936]: I1205 12:59:59.494569 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-797p2" Dec 05 12:59:59.502870 master-0 kubenswrapper[29936]: I1205 12:59:59.502808 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq6m5\" (UniqueName: \"kubernetes.io/projected/ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2-kube-api-access-bq6m5\") pod \"nmstate-console-plugin-7fbb5f6569-gnk29\" (UID: \"ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29" Dec 05 12:59:59.503070 master-0 kubenswrapper[29936]: I1205 12:59:59.502920 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-gnk29\" (UID: \"ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29" Dec 05 12:59:59.503190 master-0 kubenswrapper[29936]: I1205 12:59:59.503152 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-gnk29\" (UID: \"ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29" Dec 05 12:59:59.619297 master-0 kubenswrapper[29936]: I1205 12:59:59.611329 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-gnk29\" (UID: \"ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29" Dec 05 12:59:59.619297 master-0 kubenswrapper[29936]: I1205 12:59:59.611459 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq6m5\" (UniqueName: \"kubernetes.io/projected/ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2-kube-api-access-bq6m5\") pod \"nmstate-console-plugin-7fbb5f6569-gnk29\" (UID: \"ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29" Dec 05 12:59:59.619297 master-0 
kubenswrapper[29936]: I1205 12:59:59.611498 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-gnk29\" (UID: \"ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29" Dec 05 12:59:59.619297 master-0 kubenswrapper[29936]: I1205 12:59:59.615163 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-gnk29\" (UID: \"ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29" Dec 05 12:59:59.630547 master-0 kubenswrapper[29936]: I1205 12:59:59.629984 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-gnk29\" (UID: \"ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29" Dec 05 12:59:59.644473 master-0 kubenswrapper[29936]: I1205 12:59:59.635558 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-69b56dd5fc-pp8vp"] Dec 05 12:59:59.644473 master-0 kubenswrapper[29936]: I1205 12:59:59.638443 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.663062 master-0 kubenswrapper[29936]: I1205 12:59:59.652302 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69b56dd5fc-pp8vp"] Dec 05 12:59:59.668841 master-0 kubenswrapper[29936]: I1205 12:59:59.668796 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq6m5\" (UniqueName: \"kubernetes.io/projected/ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2-kube-api-access-bq6m5\") pod \"nmstate-console-plugin-7fbb5f6569-gnk29\" (UID: \"ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29" Dec 05 12:59:59.706235 master-0 kubenswrapper[29936]: I1205 12:59:59.699612 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29" Dec 05 12:59:59.822208 master-0 kubenswrapper[29936]: I1205 12:59:59.816583 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33ad18ab-dbb3-4413-a750-e7e7cb589818-console-config\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.822208 master-0 kubenswrapper[29936]: I1205 12:59:59.816674 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x5tx\" (UniqueName: \"kubernetes.io/projected/33ad18ab-dbb3-4413-a750-e7e7cb589818-kube-api-access-6x5tx\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.822208 master-0 kubenswrapper[29936]: I1205 12:59:59.816718 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33ad18ab-dbb3-4413-a750-e7e7cb589818-trusted-ca-bundle\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.822208 master-0 kubenswrapper[29936]: I1205 12:59:59.816748 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33ad18ab-dbb3-4413-a750-e7e7cb589818-console-oauth-config\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.822208 master-0 kubenswrapper[29936]: I1205 12:59:59.816774 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/33ad18ab-dbb3-4413-a750-e7e7cb589818-service-ca\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.822208 master-0 kubenswrapper[29936]: I1205 12:59:59.816808 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33ad18ab-dbb3-4413-a750-e7e7cb589818-console-serving-cert\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.822208 master-0 kubenswrapper[29936]: I1205 12:59:59.816880 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33ad18ab-dbb3-4413-a750-e7e7cb589818-oauth-serving-cert\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.926805 master-0 kubenswrapper[29936]: I1205 12:59:59.923439 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33ad18ab-dbb3-4413-a750-e7e7cb589818-console-config\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.926805 master-0 kubenswrapper[29936]: I1205 
12:59:59.923529 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x5tx\" (UniqueName: \"kubernetes.io/projected/33ad18ab-dbb3-4413-a750-e7e7cb589818-kube-api-access-6x5tx\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.926805 master-0 kubenswrapper[29936]: I1205 12:59:59.923588 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33ad18ab-dbb3-4413-a750-e7e7cb589818-trusted-ca-bundle\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.926805 master-0 kubenswrapper[29936]: I1205 12:59:59.924779 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/33ad18ab-dbb3-4413-a750-e7e7cb589818-console-config\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.926805 master-0 kubenswrapper[29936]: I1205 12:59:59.924865 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33ad18ab-dbb3-4413-a750-e7e7cb589818-console-oauth-config\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.926805 master-0 kubenswrapper[29936]: I1205 12:59:59.924903 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/33ad18ab-dbb3-4413-a750-e7e7cb589818-service-ca\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.926805 master-0 kubenswrapper[29936]: I1205 12:59:59.924990 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33ad18ab-dbb3-4413-a750-e7e7cb589818-console-serving-cert\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.926805 master-0 kubenswrapper[29936]: I1205 12:59:59.925091 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33ad18ab-dbb3-4413-a750-e7e7cb589818-trusted-ca-bundle\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.926805 master-0 kubenswrapper[29936]: I1205 12:59:59.925635 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33ad18ab-dbb3-4413-a750-e7e7cb589818-oauth-serving-cert\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.927384 master-0 kubenswrapper[29936]: I1205 12:59:59.927069 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/33ad18ab-dbb3-4413-a750-e7e7cb589818-oauth-serving-cert\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " 
pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.927431 master-0 kubenswrapper[29936]: I1205 12:59:59.927379 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/33ad18ab-dbb3-4413-a750-e7e7cb589818-service-ca\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.933902 master-0 kubenswrapper[29936]: I1205 12:59:59.931416 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/33ad18ab-dbb3-4413-a750-e7e7cb589818-console-serving-cert\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 12:59:59.933902 master-0 kubenswrapper[29936]: I1205 12:59:59.932447 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/33ad18ab-dbb3-4413-a750-e7e7cb589818-console-oauth-config\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 13:00:00.153236 master-0 kubenswrapper[29936]: I1205 13:00:00.148574 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x5tx\" (UniqueName: \"kubernetes.io/projected/33ad18ab-dbb3-4413-a750-e7e7cb589818-kube-api-access-6x5tx\") pod \"console-69b56dd5fc-pp8vp\" (UID: \"33ad18ab-dbb3-4413-a750-e7e7cb589818\") " pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 13:00:00.174049 master-0 kubenswrapper[29936]: I1205 13:00:00.168310 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k6dxn"] Dec 05 13:00:00.187612 master-0 kubenswrapper[29936]: I1205 13:00:00.187544 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-mz4ld" event={"ID":"1561fb44-afba-4a07-9718-f2e2b80b5770","Type":"ContainerStarted","Data":"bbf84cec51ac307acb4c24a88b1892184b50f2001bbcc1a24c76d17c18e0c6e5"} Dec 05 13:00:00.187723 master-0 kubenswrapper[29936]: I1205 13:00:00.187706 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-mz4ld" Dec 05 13:00:00.190709 master-0 kubenswrapper[29936]: I1205 13:00:00.190506 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jp5r2" event={"ID":"22b3d3f6-98cb-4dee-833e-21f5b0bc97ca","Type":"ContainerStarted","Data":"5345878d59aa21addf834fbd066e0e364504be93956466cbb1b9f744a5f3ccdc"} Dec 05 13:00:00.190785 master-0 kubenswrapper[29936]: I1205 13:00:00.190771 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-jp5r2" Dec 05 13:00:00.192291 master-0 kubenswrapper[29936]: I1205 13:00:00.192114 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-797p2" event={"ID":"e8fbd044-a3c2-4968-9522-90877e268aae","Type":"ContainerStarted","Data":"c4b254382c82a138047b0bcd9528c306ff7234896623be9504d996e940ab26cf"} Dec 05 13:00:00.271618 master-0 kubenswrapper[29936]: I1205 13:00:00.271446 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 13:00:00.362874 master-0 kubenswrapper[29936]: I1205 13:00:00.362329 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-whhk5"] Dec 05 13:00:00.406455 master-0 kubenswrapper[29936]: I1205 13:00:00.406154 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29"] Dec 05 13:00:00.426752 master-0 kubenswrapper[29936]: I1205 13:00:00.426630 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-mz4ld" podStartSLOduration=3.308876716 podStartE2EDuration="4.426597321s" podCreationTimestamp="2025-12-05 12:59:56 +0000 UTC" firstStartedPulling="2025-12-05 12:59:57.675361712 +0000 UTC m=+594.807441393" lastFinishedPulling="2025-12-05 12:59:58.793082317 +0000 UTC m=+595.925161998" observedRunningTime="2025-12-05 13:00:00.346866887 +0000 UTC m=+597.478946578" watchObservedRunningTime="2025-12-05 13:00:00.426597321 +0000 UTC m=+597.558677002" Dec 05 13:00:00.446087 master-0 kubenswrapper[29936]: I1205 13:00:00.445975 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz"] Dec 05 13:00:00.451868 master-0 kubenswrapper[29936]: I1205 13:00:00.451811 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz" Dec 05 13:00:00.454417 master-0 kubenswrapper[29936]: I1205 13:00:00.453353 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz"] Dec 05 13:00:00.455329 master-0 kubenswrapper[29936]: I1205 13:00:00.455269 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-jp5r2" podStartSLOduration=4.455251771 podStartE2EDuration="4.455251771s" podCreationTimestamp="2025-12-05 12:59:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:00:00.410572263 +0000 UTC m=+597.542651944" watchObservedRunningTime="2025-12-05 13:00:00.455251771 +0000 UTC m=+597.587331462" Dec 05 13:00:00.462359 master-0 kubenswrapper[29936]: I1205 13:00:00.462282 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-rdxkm" Dec 05 13:00:00.464100 master-0 kubenswrapper[29936]: I1205 13:00:00.464063 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 13:00:00.585450 master-0 kubenswrapper[29936]: I1205 13:00:00.585364 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-secret-volume\") pod \"collect-profiles-29415660-dfpmz\" (UID: \"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz" Dec 05 13:00:00.585750 master-0 kubenswrapper[29936]: I1205 13:00:00.585477 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlx55\" (UniqueName: \"kubernetes.io/projected/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-kube-api-access-zlx55\") pod \"collect-profiles-29415660-dfpmz\" (UID: 
\"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz" Dec 05 13:00:00.585750 master-0 kubenswrapper[29936]: I1205 13:00:00.585532 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-config-volume\") pod \"collect-profiles-29415660-dfpmz\" (UID: \"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz" Dec 05 13:00:00.687829 master-0 kubenswrapper[29936]: I1205 13:00:00.687708 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-secret-volume\") pod \"collect-profiles-29415660-dfpmz\" (UID: \"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz" Dec 05 13:00:00.688129 master-0 kubenswrapper[29936]: I1205 13:00:00.687865 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlx55\" (UniqueName: \"kubernetes.io/projected/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-kube-api-access-zlx55\") pod \"collect-profiles-29415660-dfpmz\" (UID: \"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz" Dec 05 13:00:00.688129 master-0 kubenswrapper[29936]: I1205 13:00:00.687935 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-config-volume\") pod \"collect-profiles-29415660-dfpmz\" (UID: \"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz" Dec 05 13:00:00.689852 master-0 kubenswrapper[29936]: I1205 13:00:00.689525 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-config-volume\") pod \"collect-profiles-29415660-dfpmz\" (UID: \"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz" Dec 05 13:00:00.692577 master-0 kubenswrapper[29936]: I1205 13:00:00.692532 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-secret-volume\") pod \"collect-profiles-29415660-dfpmz\" (UID: \"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz" Dec 05 13:00:00.721341 master-0 kubenswrapper[29936]: I1205 13:00:00.721097 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlx55\" (UniqueName: \"kubernetes.io/projected/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-kube-api-access-zlx55\") pod \"collect-profiles-29415660-dfpmz\" (UID: \"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz" Dec 05 13:00:00.845040 master-0 kubenswrapper[29936]: I1205 13:00:00.844818 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69b56dd5fc-pp8vp"] Dec 05 13:00:00.854609 master-0 kubenswrapper[29936]: W1205 13:00:00.854525 29936 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33ad18ab_dbb3_4413_a750_e7e7cb589818.slice/crio-58f9de99e159f15a8a2ef09f9a23fe53f01071fa200432663e99cf6b4dc0c199 WatchSource:0}: Error finding container 58f9de99e159f15a8a2ef09f9a23fe53f01071fa200432663e99cf6b4dc0c199: Status 404 returned error can't find the container with id 58f9de99e159f15a8a2ef09f9a23fe53f01071fa200432663e99cf6b4dc0c199 Dec 05 13:00:00.858603 master-0 kubenswrapper[29936]: I1205 13:00:00.858544 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz" Dec 05 13:00:01.202961 master-0 kubenswrapper[29936]: I1205 13:00:01.202883 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-whhk5" event={"ID":"a8e8677b-4d23-49eb-ad0d-5b34cecdd56d","Type":"ContainerStarted","Data":"3b4b71278b87b6142506fa8aa5e9e3516e51cb77f9bbc4c5af4011683365b277"} Dec 05 13:00:01.204120 master-0 kubenswrapper[29936]: I1205 13:00:01.204092 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k6dxn" event={"ID":"430db01e-5ffc-4ef9-9b71-35b825a86bde","Type":"ContainerStarted","Data":"921d0b6b9e29eb442d80f222c0caf50e35fab0708b2a83fe462e6cff9d0282d4"} Dec 05 13:00:01.208340 master-0 kubenswrapper[29936]: I1205 13:00:01.208308 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b56dd5fc-pp8vp" event={"ID":"33ad18ab-dbb3-4413-a750-e7e7cb589818","Type":"ContainerStarted","Data":"ee05971b3a435b512bd3df8409624052066dc9e4283302a97d09a689ba626e29"} Dec 05 13:00:01.208340 master-0 kubenswrapper[29936]: I1205 13:00:01.208337 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b56dd5fc-pp8vp" event={"ID":"33ad18ab-dbb3-4413-a750-e7e7cb589818","Type":"ContainerStarted","Data":"58f9de99e159f15a8a2ef09f9a23fe53f01071fa200432663e99cf6b4dc0c199"} Dec 05 13:00:01.211485 master-0 kubenswrapper[29936]: I1205 13:00:01.211383 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29" event={"ID":"ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2","Type":"ContainerStarted","Data":"4281b0c2cb516a1e2148370d84abf96e5da9660be9b7fa9869208399e74cb128"} Dec 05 13:00:01.237384 master-0 kubenswrapper[29936]: I1205 13:00:01.237010 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69b56dd5fc-pp8vp" podStartSLOduration=2.236960618 podStartE2EDuration="2.236960618s" podCreationTimestamp="2025-12-05 12:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:00:01.234724817 +0000 UTC m=+598.366804518" watchObservedRunningTime="2025-12-05 13:00:01.236960618 +0000 UTC m=+598.369040299" Dec 05 13:00:01.368113 master-0 kubenswrapper[29936]: I1205 13:00:01.368052 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz"] Dec 05 13:00:02.226581 master-0 kubenswrapper[29936]: I1205 13:00:02.226506 29936 generic.go:334] "Generic (PLEG): container finished" podID="1043a2e5-6dbc-42aa-96ed-1b46a6b5484f" containerID="dc0fca244ceb67d41e3f8bc84f55bf3d37a32dd711c9991168b5d3972cd3da3d" exitCode=0 Dec 05 13:00:02.227639 master-0 kubenswrapper[29936]: I1205 13:00:02.226701 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz" event={"ID":"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f","Type":"ContainerDied","Data":"dc0fca244ceb67d41e3f8bc84f55bf3d37a32dd711c9991168b5d3972cd3da3d"} Dec 05 13:00:02.227639 master-0 kubenswrapper[29936]: I1205 13:00:02.226829 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz" event={"ID":"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f","Type":"ContainerStarted","Data":"144554468a845abda4ab9e02152bfaf8a4a5a489208ae016caeb22ef9da5d831"} Dec 05 13:00:03.828911 master-0 kubenswrapper[29936]: I1205 13:00:03.828835 29936 scope.go:117] "RemoveContainer" containerID="e8917c3711bbe1adfa1dc4fa6befd9275e69d1180a7505f4e499700e3290a159" Dec 05 13:00:04.575703 master-0 kubenswrapper[29936]: I1205 13:00:04.575639 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz" Dec 05 13:00:04.716105 master-0 kubenswrapper[29936]: I1205 13:00:04.715991 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-config-volume\") pod \"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f\" (UID: \"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f\") " Dec 05 13:00:04.716360 master-0 kubenswrapper[29936]: I1205 13:00:04.716148 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlx55\" (UniqueName: \"kubernetes.io/projected/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-kube-api-access-zlx55\") pod \"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f\" (UID: \"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f\") " Dec 05 13:00:04.716360 master-0 kubenswrapper[29936]: I1205 13:00:04.716221 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-secret-volume\") pod \"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f\" (UID: \"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f\") " Dec 05 13:00:04.716534 master-0 kubenswrapper[29936]: I1205 13:00:04.716486 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-config-volume" (OuterVolumeSpecName: "config-volume") pod "1043a2e5-6dbc-42aa-96ed-1b46a6b5484f" (UID: "1043a2e5-6dbc-42aa-96ed-1b46a6b5484f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:00:04.716710 master-0 kubenswrapper[29936]: I1205 13:00:04.716681 29936 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 05 13:00:04.719157 master-0 kubenswrapper[29936]: I1205 13:00:04.719084 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-kube-api-access-zlx55" (OuterVolumeSpecName: "kube-api-access-zlx55") pod "1043a2e5-6dbc-42aa-96ed-1b46a6b5484f" (UID: "1043a2e5-6dbc-42aa-96ed-1b46a6b5484f"). InnerVolumeSpecName "kube-api-access-zlx55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:00:04.720139 master-0 kubenswrapper[29936]: I1205 13:00:04.720058 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1043a2e5-6dbc-42aa-96ed-1b46a6b5484f" (UID: "1043a2e5-6dbc-42aa-96ed-1b46a6b5484f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:00:04.818902 master-0 kubenswrapper[29936]: I1205 13:00:04.818742 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlx55\" (UniqueName: \"kubernetes.io/projected/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-kube-api-access-zlx55\") on node \"master-0\" DevicePath \"\"" Dec 05 13:00:04.818902 master-0 kubenswrapper[29936]: I1205 13:00:04.818797 29936 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 05 13:00:05.466395 master-0 kubenswrapper[29936]: I1205 13:00:05.466324 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz" event={"ID":"1043a2e5-6dbc-42aa-96ed-1b46a6b5484f","Type":"ContainerDied","Data":"144554468a845abda4ab9e02152bfaf8a4a5a489208ae016caeb22ef9da5d831"} Dec 05 13:00:05.466395 master-0 kubenswrapper[29936]: I1205 13:00:05.466397 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="144554468a845abda4ab9e02152bfaf8a4a5a489208ae016caeb22ef9da5d831" Dec 05 13:00:05.467309 master-0 kubenswrapper[29936]: I1205 13:00:05.466418 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz" Dec 05 13:00:07.487536 master-0 kubenswrapper[29936]: I1205 13:00:07.487362 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-59k6l" event={"ID":"d54184ec-d869-498d-ae89-be4ca52e0087","Type":"ContainerStarted","Data":"b5c091fb0a91095c5ec21bcdcf606d39d06d27b31dcaab56959110cb10101736"} Dec 05 13:00:07.490530 master-0 kubenswrapper[29936]: I1205 13:00:07.490446 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k6dxn" event={"ID":"430db01e-5ffc-4ef9-9b71-35b825a86bde","Type":"ContainerStarted","Data":"c718b2874287f15db45b861ebdc4edb0174b7737c54c935de44ca242d66aeb47"} Dec 05 13:00:08.421601 master-0 kubenswrapper[29936]: I1205 13:00:08.421549 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-jp5r2" Dec 05 13:00:08.517778 master-0 kubenswrapper[29936]: I1205 13:00:08.517691 29936 generic.go:334] "Generic (PLEG): container finished" podID="22d3af20-d89a-46a1-a8cc-82ca1b92e325" containerID="567cfe825b14753b5ff48a998631864bfcf52d353dd90dc5d2f2f21c700e69ca" exitCode=0 Dec 05 13:00:08.518432 master-0 kubenswrapper[29936]: I1205 13:00:08.517830 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2fccf" event={"ID":"22d3af20-d89a-46a1-a8cc-82ca1b92e325","Type":"ContainerDied","Data":"567cfe825b14753b5ff48a998631864bfcf52d353dd90dc5d2f2f21c700e69ca"} Dec 05 13:00:08.522138 master-0 kubenswrapper[29936]: I1205 13:00:08.522045 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-797p2" event={"ID":"e8fbd044-a3c2-4968-9522-90877e268aae","Type":"ContainerStarted","Data":"2f5e881e663bb6de796c2a88016031becd7f0c9f3e1d4b90277fc4ea094cac6c"} Dec 05 13:00:08.522451 master-0 kubenswrapper[29936]: I1205 13:00:08.522409 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-59k6l" Dec 05 13:00:08.522560 master-0 kubenswrapper[29936]: I1205 13:00:08.522537 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-797p2" Dec 05 13:00:08.576982 master-0 kubenswrapper[29936]: I1205 13:00:08.576850 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-797p2" podStartSLOduration=3.15593466 podStartE2EDuration="10.576824946s" podCreationTimestamp="2025-12-05 12:59:58 +0000 UTC" firstStartedPulling="2025-12-05 12:59:59.564731329 +0000 UTC m=+596.696811010" lastFinishedPulling="2025-12-05 13:00:06.985621615 +0000 UTC m=+604.117701296" observedRunningTime="2025-12-05 13:00:08.573637999 +0000 UTC m=+605.705717690" watchObservedRunningTime="2025-12-05 13:00:08.576824946 +0000 UTC m=+605.708904627" Dec 05 13:00:08.612898 master-0 kubenswrapper[29936]: I1205 13:00:08.611647 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k6dxn" podStartSLOduration=3.750378201 podStartE2EDuration="10.611618334s" podCreationTimestamp="2025-12-05 12:59:58 +0000 UTC" firstStartedPulling="2025-12-05 13:00:00.184982975 +0000 UTC m=+597.317062646" lastFinishedPulling="2025-12-05 13:00:07.046223098 +0000 UTC m=+604.178302779" observedRunningTime="2025-12-05 13:00:08.597406507 +0000 UTC m=+605.729486198" watchObservedRunningTime="2025-12-05 13:00:08.611618334 +0000 UTC 
m=+605.743698015" Dec 05 13:00:08.629774 master-0 kubenswrapper[29936]: I1205 13:00:08.629678 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-59k6l" podStartSLOduration=2.960487769 podStartE2EDuration="12.629655376s" podCreationTimestamp="2025-12-05 12:59:56 +0000 UTC" firstStartedPulling="2025-12-05 12:59:57.318730401 +0000 UTC m=+594.450810082" lastFinishedPulling="2025-12-05 13:00:06.987898008 +0000 UTC m=+604.119977689" observedRunningTime="2025-12-05 13:00:08.617497834 +0000 UTC m=+605.749577515" watchObservedRunningTime="2025-12-05 13:00:08.629655376 +0000 UTC m=+605.761735057" Dec 05 13:00:09.429314 master-0 kubenswrapper[29936]: I1205 13:00:09.429167 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k6dxn" Dec 05 13:00:09.532972 master-0 kubenswrapper[29936]: I1205 13:00:09.532895 29936 generic.go:334] "Generic (PLEG): container finished" podID="22d3af20-d89a-46a1-a8cc-82ca1b92e325" containerID="d4d9a81e542d6b7437987e4c18f9597a353811620fb73d5d5f7c564110d30932" exitCode=0 Dec 05 13:00:09.533728 master-0 kubenswrapper[29936]: I1205 13:00:09.532989 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2fccf" event={"ID":"22d3af20-d89a-46a1-a8cc-82ca1b92e325","Type":"ContainerDied","Data":"d4d9a81e542d6b7437987e4c18f9597a353811620fb73d5d5f7c564110d30932"} Dec 05 13:00:10.272348 master-0 kubenswrapper[29936]: I1205 13:00:10.272269 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 13:00:10.272725 master-0 kubenswrapper[29936]: I1205 13:00:10.272646 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 13:00:10.280139 master-0 kubenswrapper[29936]: I1205 13:00:10.280082 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 13:00:10.546746 master-0 kubenswrapper[29936]: I1205 13:00:10.546660 29936 generic.go:334] "Generic (PLEG): container finished" podID="22d3af20-d89a-46a1-a8cc-82ca1b92e325" containerID="a228c31fa594e4b5202e1e1e15e8b60bb197cec90c54b1a04d28b3fe3a7bb713" exitCode=0 Dec 05 13:00:10.548222 master-0 kubenswrapper[29936]: I1205 13:00:10.546757 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2fccf" event={"ID":"22d3af20-d89a-46a1-a8cc-82ca1b92e325","Type":"ContainerDied","Data":"a228c31fa594e4b5202e1e1e15e8b60bb197cec90c54b1a04d28b3fe3a7bb713"} Dec 05 13:00:10.557082 master-0 kubenswrapper[29936]: I1205 13:00:10.557025 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69b56dd5fc-pp8vp" Dec 05 13:00:10.673453 master-0 kubenswrapper[29936]: I1205 13:00:10.673351 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c4c944c57-cbgzt"] Dec 05 13:00:11.603438 master-0 kubenswrapper[29936]: I1205 13:00:11.603373 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2fccf" event={"ID":"22d3af20-d89a-46a1-a8cc-82ca1b92e325","Type":"ContainerStarted","Data":"8757fa079a81cbd0fc7d25938dc22d52a87bfe15e0e6b11067160268ab4734b7"} Dec 05 13:00:11.604211 master-0 kubenswrapper[29936]: I1205 13:00:11.604120 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2fccf" 
event={"ID":"22d3af20-d89a-46a1-a8cc-82ca1b92e325","Type":"ContainerStarted","Data":"9b7f3fb3e9072662dd76acd0ab09eab680fb2877c56298277f00f8a1764aabf0"} Dec 05 13:00:11.604211 master-0 kubenswrapper[29936]: I1205 13:00:11.604145 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2fccf" event={"ID":"22d3af20-d89a-46a1-a8cc-82ca1b92e325","Type":"ContainerStarted","Data":"51007c62df894ce9c05871ec4e0497e50c10b6be0316aa3ca6ec557098571ba0"} Dec 05 13:00:11.604211 master-0 kubenswrapper[29936]: I1205 13:00:11.604157 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2fccf" event={"ID":"22d3af20-d89a-46a1-a8cc-82ca1b92e325","Type":"ContainerStarted","Data":"daf7466a4cb64f376fdadf2784e9fb54de99ea1937f5355070d7f7fac95bd42f"} Dec 05 13:00:11.606758 master-0 kubenswrapper[29936]: I1205 13:00:11.606689 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-whhk5" event={"ID":"a8e8677b-4d23-49eb-ad0d-5b34cecdd56d","Type":"ContainerStarted","Data":"b68a68c3c3be0d234150a5dc5d5a1f931593e3b8b052d5ab956497311a1cb932"} Dec 05 13:00:11.606831 master-0 kubenswrapper[29936]: I1205 13:00:11.606770 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-whhk5" event={"ID":"a8e8677b-4d23-49eb-ad0d-5b34cecdd56d","Type":"ContainerStarted","Data":"a9a34e5981c26e9e26ec91ccd995d89eed11c5438cb844ae242ecca1675b8e53"} Dec 05 13:00:11.636129 master-0 kubenswrapper[29936]: I1205 13:00:11.636037 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-whhk5" podStartSLOduration=3.477962637 podStartE2EDuration="13.636012028s" podCreationTimestamp="2025-12-05 12:59:58 +0000 UTC" firstStartedPulling="2025-12-05 13:00:00.3631052 +0000 UTC m=+597.495184881" lastFinishedPulling="2025-12-05 13:00:10.521154581 +0000 UTC m=+607.653234272" observedRunningTime="2025-12-05 13:00:11.632916784 +0000 UTC m=+608.764996465" watchObservedRunningTime="2025-12-05 13:00:11.636012028 +0000 UTC m=+608.768091709" Dec 05 13:00:12.620555 master-0 kubenswrapper[29936]: I1205 13:00:12.620475 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2fccf" event={"ID":"22d3af20-d89a-46a1-a8cc-82ca1b92e325","Type":"ContainerStarted","Data":"c33ac88ccb6410cdf2ee819e66c294282235dd5a5eab97c9d6ab45fcbe20c1f7"} Dec 05 13:00:12.620555 master-0 kubenswrapper[29936]: I1205 13:00:12.620554 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2fccf" event={"ID":"22d3af20-d89a-46a1-a8cc-82ca1b92e325","Type":"ContainerStarted","Data":"39f5453b60759e9ec3701db0a1ae159311f5e1a17374041ea3f09292b7e119cf"} Dec 05 13:00:12.652116 master-0 kubenswrapper[29936]: I1205 13:00:12.651999 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-2fccf" podStartSLOduration=6.603248538 podStartE2EDuration="16.651966519s" podCreationTimestamp="2025-12-05 12:59:56 +0000 UTC" firstStartedPulling="2025-12-05 12:59:56.996775076 +0000 UTC m=+594.128854757" lastFinishedPulling="2025-12-05 13:00:07.045493057 +0000 UTC m=+604.177572738" observedRunningTime="2025-12-05 13:00:12.649945674 +0000 UTC m=+609.782025355" watchObservedRunningTime="2025-12-05 13:00:12.651966519 +0000 UTC m=+609.784046220" Dec 05 13:00:13.628752 master-0 kubenswrapper[29936]: I1205 13:00:13.628640 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/frr-k8s-2fccf" Dec 05 13:00:14.519956 master-0 kubenswrapper[29936]: I1205 13:00:14.519876 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-797p2" Dec 05 13:00:14.643374 master-0 kubenswrapper[29936]: I1205 13:00:14.642422 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29" event={"ID":"ddaa0ccc-4670-461b-a9f3-2eb8412dc7d2","Type":"ContainerStarted","Data":"f31861f9b22829ec7cf93a387e3ee7bc1bcf422213d0f897e7173f6a8bceef61"} Dec 05 13:00:14.661917 master-0 kubenswrapper[29936]: I1205 13:00:14.661818 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-gnk29" podStartSLOduration=2.477101534 podStartE2EDuration="15.66179564s" podCreationTimestamp="2025-12-05 12:59:59 +0000 UTC" firstStartedPulling="2025-12-05 13:00:00.398952407 +0000 UTC m=+597.531032078" lastFinishedPulling="2025-12-05 13:00:13.583646493 +0000 UTC m=+610.715726184" observedRunningTime="2025-12-05 13:00:14.659412454 +0000 UTC m=+611.791492135" watchObservedRunningTime="2025-12-05 13:00:14.66179564 +0000 UTC m=+611.793875341" Dec 05 13:00:16.802253 master-0 kubenswrapper[29936]: I1205 13:00:16.802071 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-2fccf" Dec 05 13:00:16.853123 master-0 kubenswrapper[29936]: I1205 13:00:16.853001 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-2fccf" Dec 05 13:00:16.945101 master-0 kubenswrapper[29936]: I1205 13:00:16.945038 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-mz4ld" Dec 05 13:00:19.436886 master-0 kubenswrapper[29936]: I1205 13:00:19.436773 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-k6dxn" Dec 05 13:00:25.097433 master-0 kubenswrapper[29936]: I1205 13:00:25.097379 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-gtxbw"] Dec 05 13:00:25.098466 master-0 kubenswrapper[29936]: E1205 13:00:25.098449 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1043a2e5-6dbc-42aa-96ed-1b46a6b5484f" containerName="collect-profiles" Dec 05 13:00:25.098561 master-0 kubenswrapper[29936]: I1205 13:00:25.098550 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="1043a2e5-6dbc-42aa-96ed-1b46a6b5484f" containerName="collect-profiles" Dec 05 13:00:25.098850 master-0 kubenswrapper[29936]: I1205 13:00:25.098836 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="1043a2e5-6dbc-42aa-96ed-1b46a6b5484f" containerName="collect-profiles" Dec 05 13:00:25.099495 master-0 kubenswrapper[29936]: I1205 13:00:25.099479 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.101882 master-0 kubenswrapper[29936]: I1205 13:00:25.101867 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert" Dec 05 13:00:25.114939 master-0 kubenswrapper[29936]: I1205 13:00:25.114872 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-gtxbw"] Dec 05 13:00:25.228945 master-0 kubenswrapper[29936]: I1205 13:00:25.228723 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-sys\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.229564 master-0 kubenswrapper[29936]: I1205 13:00:25.229532 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-file-lock-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.229776 master-0 kubenswrapper[29936]: I1205 13:00:25.229757 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-device-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.229929 master-0 kubenswrapper[29936]: I1205 13:00:25.229912 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-lvmd-config\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.230084 master-0 kubenswrapper[29936]: I1205 13:00:25.230056 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-registration-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.230186 master-0 kubenswrapper[29936]: I1205 13:00:25.230173 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-csi-plugin-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.230333 master-0 kubenswrapper[29936]: I1205 13:00:25.230311 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/9254ac6f-79be-49ca-bf5c-f17df052ff24-metrics-cert\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.230512 master-0 kubenswrapper[29936]: I1205 13:00:25.230468 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-run-udev\") pod \"vg-manager-gtxbw\" (UID: 
\"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.230564 master-0 kubenswrapper[29936]: I1205 13:00:25.230541 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fkbl\" (UniqueName: \"kubernetes.io/projected/9254ac6f-79be-49ca-bf5c-f17df052ff24-kube-api-access-7fkbl\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.230618 master-0 kubenswrapper[29936]: I1205 13:00:25.230594 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-pod-volumes-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.230673 master-0 kubenswrapper[29936]: I1205 13:00:25.230650 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-node-plugin-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.332355 master-0 kubenswrapper[29936]: I1205 13:00:25.332277 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-run-udev\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.332355 master-0 kubenswrapper[29936]: I1205 13:00:25.332341 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fkbl\" (UniqueName: \"kubernetes.io/projected/9254ac6f-79be-49ca-bf5c-f17df052ff24-kube-api-access-7fkbl\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.332355 master-0 kubenswrapper[29936]: I1205 13:00:25.332367 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-pod-volumes-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.332692 master-0 kubenswrapper[29936]: I1205 13:00:25.332395 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-node-plugin-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.332692 master-0 kubenswrapper[29936]: I1205 13:00:25.332439 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-sys\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.332692 master-0 kubenswrapper[29936]: I1205 13:00:25.332473 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-file-lock-dir\") pod \"vg-manager-gtxbw\" (UID: 
\"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.332692 master-0 kubenswrapper[29936]: I1205 13:00:25.332529 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-device-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.332692 master-0 kubenswrapper[29936]: I1205 13:00:25.332554 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-lvmd-config\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.332692 master-0 kubenswrapper[29936]: I1205 13:00:25.332578 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-registration-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.332692 master-0 kubenswrapper[29936]: I1205 13:00:25.332595 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-csi-plugin-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.332692 master-0 kubenswrapper[29936]: I1205 13:00:25.332613 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/9254ac6f-79be-49ca-bf5c-f17df052ff24-metrics-cert\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.333291 master-0 kubenswrapper[29936]: I1205 13:00:25.333235 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-sys\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.333487 master-0 kubenswrapper[29936]: I1205 13:00:25.333427 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-csi-plugin-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.333554 master-0 kubenswrapper[29936]: I1205 13:00:25.333495 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-file-lock-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.333554 master-0 kubenswrapper[29936]: I1205 13:00:25.333446 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-device-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.333707 master-0 kubenswrapper[29936]: I1205 
13:00:25.333676 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-registration-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.333764 master-0 kubenswrapper[29936]: I1205 13:00:25.333733 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-pod-volumes-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.333804 master-0 kubenswrapper[29936]: I1205 13:00:25.333774 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-run-udev\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.333957 master-0 kubenswrapper[29936]: I1205 13:00:25.333927 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-lvmd-config\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.334198 master-0 kubenswrapper[29936]: I1205 13:00:25.334170 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/9254ac6f-79be-49ca-bf5c-f17df052ff24-node-plugin-dir\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.336542 master-0 kubenswrapper[29936]: I1205 13:00:25.336502 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/9254ac6f-79be-49ca-bf5c-f17df052ff24-metrics-cert\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.354952 master-0 kubenswrapper[29936]: I1205 13:00:25.354853 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fkbl\" (UniqueName: \"kubernetes.io/projected/9254ac6f-79be-49ca-bf5c-f17df052ff24-kube-api-access-7fkbl\") pod \"vg-manager-gtxbw\" (UID: \"9254ac6f-79be-49ca-bf5c-f17df052ff24\") " pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:25.441339 master-0 kubenswrapper[29936]: I1205 13:00:25.441271 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:26.020398 master-0 kubenswrapper[29936]: W1205 13:00:26.016907 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9254ac6f_79be_49ca_bf5c_f17df052ff24.slice/crio-618d495ba60ad6425e1b01440e987bd7c0dee8206cd332801e57aa1ae8f846e8 WatchSource:0}: Error finding container 618d495ba60ad6425e1b01440e987bd7c0dee8206cd332801e57aa1ae8f846e8: Status 404 returned error can't find the container with id 618d495ba60ad6425e1b01440e987bd7c0dee8206cd332801e57aa1ae8f846e8 Dec 05 13:00:26.020398 master-0 kubenswrapper[29936]: I1205 13:00:26.018311 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-gtxbw"] Dec 05 13:00:26.768900 master-0 kubenswrapper[29936]: I1205 13:00:26.768842 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-59k6l" Dec 05 13:00:26.772164 master-0 kubenswrapper[29936]: I1205 13:00:26.772126 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-gtxbw" event={"ID":"9254ac6f-79be-49ca-bf5c-f17df052ff24","Type":"ContainerStarted","Data":"c8b1392ed11acf679bda3109899f3dd94795a79a169d66377600911fe454d268"} Dec 05 13:00:26.772164 master-0 kubenswrapper[29936]: I1205 13:00:26.772166 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-gtxbw" event={"ID":"9254ac6f-79be-49ca-bf5c-f17df052ff24","Type":"ContainerStarted","Data":"618d495ba60ad6425e1b01440e987bd7c0dee8206cd332801e57aa1ae8f846e8"} Dec 05 13:00:26.810993 master-0 kubenswrapper[29936]: I1205 13:00:26.810921 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-2fccf" Dec 05 13:00:26.823232 master-0 kubenswrapper[29936]: I1205 13:00:26.817474 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-gtxbw" podStartSLOduration=1.817450224 podStartE2EDuration="1.817450224s" podCreationTimestamp="2025-12-05 13:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:00:26.813612949 +0000 UTC m=+623.945692650" watchObservedRunningTime="2025-12-05 13:00:26.817450224 +0000 UTC m=+623.949529905" Dec 05 13:00:28.112602 master-0 kubenswrapper[29936]: E1205 13:00:28.112510 29936 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9254ac6f_79be_49ca_bf5c_f17df052ff24.slice/crio-c8b1392ed11acf679bda3109899f3dd94795a79a169d66377600911fe454d268.scope\": RecentStats: unable to find data in memory cache]" Dec 05 13:00:28.805672 master-0 kubenswrapper[29936]: I1205 13:00:28.805597 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-gtxbw_9254ac6f-79be-49ca-bf5c-f17df052ff24/vg-manager/0.log" Dec 05 13:00:28.805672 master-0 kubenswrapper[29936]: I1205 13:00:28.805664 29936 generic.go:334] "Generic (PLEG): container finished" podID="9254ac6f-79be-49ca-bf5c-f17df052ff24" containerID="c8b1392ed11acf679bda3109899f3dd94795a79a169d66377600911fe454d268" exitCode=1 Dec 05 13:00:28.806064 master-0 kubenswrapper[29936]: I1205 13:00:28.805704 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-gtxbw" 
event={"ID":"9254ac6f-79be-49ca-bf5c-f17df052ff24","Type":"ContainerDied","Data":"c8b1392ed11acf679bda3109899f3dd94795a79a169d66377600911fe454d268"} Dec 05 13:00:28.806457 master-0 kubenswrapper[29936]: I1205 13:00:28.806419 29936 scope.go:117] "RemoveContainer" containerID="c8b1392ed11acf679bda3109899f3dd94795a79a169d66377600911fe454d268" Dec 05 13:00:29.128636 master-0 kubenswrapper[29936]: I1205 13:00:29.128502 29936 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock" Dec 05 13:00:29.653317 master-0 kubenswrapper[29936]: I1205 13:00:29.653067 29936 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2025-12-05T13:00:29.128549305Z","Handler":null,"Name":""} Dec 05 13:00:29.656632 master-0 kubenswrapper[29936]: I1205 13:00:29.656592 29936 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0 Dec 05 13:00:29.656818 master-0 kubenswrapper[29936]: I1205 13:00:29.656654 29936 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock Dec 05 13:00:29.823857 master-0 kubenswrapper[29936]: I1205 13:00:29.823788 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-gtxbw_9254ac6f-79be-49ca-bf5c-f17df052ff24/vg-manager/0.log" Dec 05 13:00:29.824282 master-0 kubenswrapper[29936]: I1205 13:00:29.823913 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-gtxbw" event={"ID":"9254ac6f-79be-49ca-bf5c-f17df052ff24","Type":"ContainerStarted","Data":"07993dc87f0edc3b3c1df988e2bb893012edd7ba73bee6a231e9b38e4223079c"} Dec 05 13:00:32.229844 master-0 kubenswrapper[29936]: I1205 13:00:32.229749 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7jz5q"] Dec 05 13:00:32.233940 master-0 kubenswrapper[29936]: I1205 13:00:32.233893 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7jz5q" Dec 05 13:00:32.239628 master-0 kubenswrapper[29936]: I1205 13:00:32.235843 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 05 13:00:32.239628 master-0 kubenswrapper[29936]: I1205 13:00:32.236248 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 05 13:00:32.251586 master-0 kubenswrapper[29936]: I1205 13:00:32.251329 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7jz5q"] Dec 05 13:00:32.322831 master-0 kubenswrapper[29936]: I1205 13:00:32.322753 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw7sh\" (UniqueName: \"kubernetes.io/projected/9bd37371-5c42-4675-8ce7-03bdf12c535c-kube-api-access-cw7sh\") pod \"openstack-operator-index-7jz5q\" (UID: \"9bd37371-5c42-4675-8ce7-03bdf12c535c\") " pod="openstack-operators/openstack-operator-index-7jz5q" Dec 05 13:00:32.425632 master-0 kubenswrapper[29936]: I1205 13:00:32.425520 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw7sh\" (UniqueName: \"kubernetes.io/projected/9bd37371-5c42-4675-8ce7-03bdf12c535c-kube-api-access-cw7sh\") pod \"openstack-operator-index-7jz5q\" (UID: \"9bd37371-5c42-4675-8ce7-03bdf12c535c\") " pod="openstack-operators/openstack-operator-index-7jz5q" Dec 05 13:00:32.441977 master-0 kubenswrapper[29936]: I1205 13:00:32.441866 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw7sh\" (UniqueName: \"kubernetes.io/projected/9bd37371-5c42-4675-8ce7-03bdf12c535c-kube-api-access-cw7sh\") pod \"openstack-operator-index-7jz5q\" (UID: \"9bd37371-5c42-4675-8ce7-03bdf12c535c\") " pod="openstack-operators/openstack-operator-index-7jz5q" Dec 05 13:00:32.566809 master-0 kubenswrapper[29936]: I1205 13:00:32.566660 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7jz5q" Dec 05 13:00:33.050112 master-0 kubenswrapper[29936]: I1205 13:00:33.049240 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7jz5q"] Dec 05 13:00:33.871484 master-0 kubenswrapper[29936]: I1205 13:00:33.871406 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7jz5q" event={"ID":"9bd37371-5c42-4675-8ce7-03bdf12c535c","Type":"ContainerStarted","Data":"060232cd072300655be32e1e18821cdaab539b9a54c219903d8e8ceab1830bcc"} Dec 05 13:00:34.885968 master-0 kubenswrapper[29936]: I1205 13:00:34.885878 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7jz5q" event={"ID":"9bd37371-5c42-4675-8ce7-03bdf12c535c","Type":"ContainerStarted","Data":"54a30dd02c0117c107e462b215792358a59c0f3f3af5f3739eef32df4eff22ef"} Dec 05 13:00:34.913111 master-0 kubenswrapper[29936]: I1205 13:00:34.913019 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7jz5q" podStartSLOduration=2.133968466 podStartE2EDuration="2.913000399s" podCreationTimestamp="2025-12-05 13:00:32 +0000 UTC" firstStartedPulling="2025-12-05 13:00:33.066573452 +0000 UTC m=+630.198653133" lastFinishedPulling="2025-12-05 13:00:33.845605395 +0000 UTC m=+630.977685066" observedRunningTime="2025-12-05 13:00:34.910750017 +0000 UTC m=+632.042829718" watchObservedRunningTime="2025-12-05 13:00:34.913000399 +0000 UTC m=+632.045080070" Dec 05 13:00:35.388684 master-0 kubenswrapper[29936]: I1205 13:00:35.388609 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7jz5q"] Dec 05 13:00:35.442134 master-0 kubenswrapper[29936]: I1205 13:00:35.442010 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:35.444771 master-0 kubenswrapper[29936]: I1205 13:00:35.444690 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:35.740689 master-0 kubenswrapper[29936]: I1205 13:00:35.740607 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-5c4c944c57-cbgzt" podUID="5c779b4b-b368-4573-9502-17ea8fc60aac" containerName="console" containerID="cri-o://2b70f89cbf491ef83a5dfaec4f3b7f047c49a3a50041133350f4e7c8b8132f0e" gracePeriod=15 Dec 05 13:00:35.908679 master-0 kubenswrapper[29936]: I1205 13:00:35.907712 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c4c944c57-cbgzt_5c779b4b-b368-4573-9502-17ea8fc60aac/console/0.log" Dec 05 13:00:35.908679 master-0 kubenswrapper[29936]: I1205 13:00:35.907842 29936 generic.go:334] "Generic (PLEG): container finished" podID="5c779b4b-b368-4573-9502-17ea8fc60aac" containerID="2b70f89cbf491ef83a5dfaec4f3b7f047c49a3a50041133350f4e7c8b8132f0e" exitCode=2 Dec 05 13:00:35.908679 master-0 kubenswrapper[29936]: I1205 13:00:35.907969 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c4c944c57-cbgzt" event={"ID":"5c779b4b-b368-4573-9502-17ea8fc60aac","Type":"ContainerDied","Data":"2b70f89cbf491ef83a5dfaec4f3b7f047c49a3a50041133350f4e7c8b8132f0e"} Dec 05 13:00:35.908679 master-0 kubenswrapper[29936]: I1205 13:00:35.908390 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:35.909928 master-0 kubenswrapper[29936]: I1205 13:00:35.909869 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-gtxbw" Dec 05 13:00:36.008716 master-0 kubenswrapper[29936]: I1205 13:00:36.007091 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-scnjw"] Dec 05 13:00:36.008976 master-0 kubenswrapper[29936]: I1205 13:00:36.008913 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-scnjw" Dec 05 13:00:36.026220 master-0 kubenswrapper[29936]: I1205 13:00:36.015442 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-scnjw"] Dec 05 13:00:36.103671 master-0 kubenswrapper[29936]: I1205 13:00:36.102971 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9mk2\" (UniqueName: \"kubernetes.io/projected/b6724238-88e3-4a4d-98c9-b9344e1b9d0d-kube-api-access-n9mk2\") pod \"openstack-operator-index-scnjw\" (UID: \"b6724238-88e3-4a4d-98c9-b9344e1b9d0d\") " pod="openstack-operators/openstack-operator-index-scnjw" Dec 05 13:00:36.208552 master-0 kubenswrapper[29936]: I1205 13:00:36.205375 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9mk2\" (UniqueName: \"kubernetes.io/projected/b6724238-88e3-4a4d-98c9-b9344e1b9d0d-kube-api-access-n9mk2\") pod \"openstack-operator-index-scnjw\" (UID: \"b6724238-88e3-4a4d-98c9-b9344e1b9d0d\") " pod="openstack-operators/openstack-operator-index-scnjw" Dec 05 13:00:36.237740 master-0 kubenswrapper[29936]: I1205 13:00:36.229497 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9mk2\" (UniqueName: \"kubernetes.io/projected/b6724238-88e3-4a4d-98c9-b9344e1b9d0d-kube-api-access-n9mk2\") pod \"openstack-operator-index-scnjw\" (UID: \"b6724238-88e3-4a4d-98c9-b9344e1b9d0d\") " pod="openstack-operators/openstack-operator-index-scnjw" Dec 05 13:00:36.293828 master-0 kubenswrapper[29936]: I1205 13:00:36.293762 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c4c944c57-cbgzt_5c779b4b-b368-4573-9502-17ea8fc60aac/console/0.log" Dec 05 13:00:36.294097 master-0 kubenswrapper[29936]: I1205 13:00:36.293954 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 13:00:36.348152 master-0 kubenswrapper[29936]: I1205 13:00:36.348079 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-scnjw" Dec 05 13:00:36.410481 master-0 kubenswrapper[29936]: I1205 13:00:36.410390 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-console-config\") pod \"5c779b4b-b368-4573-9502-17ea8fc60aac\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " Dec 05 13:00:36.410731 master-0 kubenswrapper[29936]: I1205 13:00:36.410520 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rf24\" (UniqueName: \"kubernetes.io/projected/5c779b4b-b368-4573-9502-17ea8fc60aac-kube-api-access-6rf24\") pod \"5c779b4b-b368-4573-9502-17ea8fc60aac\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " Dec 05 13:00:36.410731 master-0 kubenswrapper[29936]: I1205 13:00:36.410605 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-service-ca\") pod \"5c779b4b-b368-4573-9502-17ea8fc60aac\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " Dec 05 13:00:36.410925 master-0 kubenswrapper[29936]: I1205 13:00:36.410890 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-trusted-ca-bundle\") pod \"5c779b4b-b368-4573-9502-17ea8fc60aac\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " Dec 05 13:00:36.410995 master-0 kubenswrapper[29936]: I1205 13:00:36.410932 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c779b4b-b368-4573-9502-17ea8fc60aac-console-oauth-config\") pod \"5c779b4b-b368-4573-9502-17ea8fc60aac\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " Dec 05 13:00:36.411392 master-0 kubenswrapper[29936]: I1205 13:00:36.411365 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c779b4b-b368-4573-9502-17ea8fc60aac-console-serving-cert\") pod \"5c779b4b-b368-4573-9502-17ea8fc60aac\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " Dec 05 13:00:36.411569 master-0 kubenswrapper[29936]: I1205 13:00:36.411540 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-oauth-serving-cert\") pod \"5c779b4b-b368-4573-9502-17ea8fc60aac\" (UID: \"5c779b4b-b368-4573-9502-17ea8fc60aac\") " Dec 05 13:00:36.413034 master-0 kubenswrapper[29936]: I1205 13:00:36.412996 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5c779b4b-b368-4573-9502-17ea8fc60aac" (UID: "5c779b4b-b368-4573-9502-17ea8fc60aac"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:00:36.413459 master-0 kubenswrapper[29936]: I1205 13:00:36.413423 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-service-ca" (OuterVolumeSpecName: "service-ca") pod "5c779b4b-b368-4573-9502-17ea8fc60aac" (UID: "5c779b4b-b368-4573-9502-17ea8fc60aac"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:00:36.413901 master-0 kubenswrapper[29936]: I1205 13:00:36.413865 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5c779b4b-b368-4573-9502-17ea8fc60aac" (UID: "5c779b4b-b368-4573-9502-17ea8fc60aac"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:00:36.415406 master-0 kubenswrapper[29936]: I1205 13:00:36.415363 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c779b4b-b368-4573-9502-17ea8fc60aac-kube-api-access-6rf24" (OuterVolumeSpecName: "kube-api-access-6rf24") pod "5c779b4b-b368-4573-9502-17ea8fc60aac" (UID: "5c779b4b-b368-4573-9502-17ea8fc60aac"). InnerVolumeSpecName "kube-api-access-6rf24". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:00:36.415942 master-0 kubenswrapper[29936]: I1205 13:00:36.415922 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-console-config" (OuterVolumeSpecName: "console-config") pod "5c779b4b-b368-4573-9502-17ea8fc60aac" (UID: "5c779b4b-b368-4573-9502-17ea8fc60aac"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:00:36.416896 master-0 kubenswrapper[29936]: I1205 13:00:36.416850 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c779b4b-b368-4573-9502-17ea8fc60aac-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5c779b4b-b368-4573-9502-17ea8fc60aac" (UID: "5c779b4b-b368-4573-9502-17ea8fc60aac"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:00:36.419257 master-0 kubenswrapper[29936]: I1205 13:00:36.419232 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c779b4b-b368-4573-9502-17ea8fc60aac-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5c779b4b-b368-4573-9502-17ea8fc60aac" (UID: "5c779b4b-b368-4573-9502-17ea8fc60aac"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:00:36.519286 master-0 kubenswrapper[29936]: I1205 13:00:36.517129 29936 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:00:36.519286 master-0 kubenswrapper[29936]: I1205 13:00:36.517199 29936 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5c779b4b-b368-4573-9502-17ea8fc60aac-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:00:36.519286 master-0 kubenswrapper[29936]: I1205 13:00:36.517214 29936 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5c779b4b-b368-4573-9502-17ea8fc60aac-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 13:00:36.519286 master-0 kubenswrapper[29936]: I1205 13:00:36.517227 29936 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 05 13:00:36.519286 master-0 kubenswrapper[29936]: I1205 13:00:36.517241 29936 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-console-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:00:36.519286 master-0 kubenswrapper[29936]: I1205 13:00:36.517256 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rf24\" (UniqueName: \"kubernetes.io/projected/5c779b4b-b368-4573-9502-17ea8fc60aac-kube-api-access-6rf24\") on node \"master-0\" DevicePath \"\"" Dec 05 13:00:36.519286 master-0 kubenswrapper[29936]: I1205 13:00:36.517273 29936 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5c779b4b-b368-4573-9502-17ea8fc60aac-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 05 13:00:36.841470 master-0 kubenswrapper[29936]: I1205 13:00:36.840788 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-scnjw"] Dec 05 13:00:36.850301 master-0 kubenswrapper[29936]: W1205 13:00:36.849092 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6724238_88e3_4a4d_98c9_b9344e1b9d0d.slice/crio-eb2c6ade767bb08501f174fea3c83acd9e20d48d1b25f2057d6ae808f4e4445f WatchSource:0}: Error finding container eb2c6ade767bb08501f174fea3c83acd9e20d48d1b25f2057d6ae808f4e4445f: Status 404 returned error can't find the container with id eb2c6ade767bb08501f174fea3c83acd9e20d48d1b25f2057d6ae808f4e4445f Dec 05 13:00:36.926315 master-0 kubenswrapper[29936]: I1205 13:00:36.926139 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-scnjw" event={"ID":"b6724238-88e3-4a4d-98c9-b9344e1b9d0d","Type":"ContainerStarted","Data":"eb2c6ade767bb08501f174fea3c83acd9e20d48d1b25f2057d6ae808f4e4445f"} Dec 05 13:00:36.928520 master-0 kubenswrapper[29936]: I1205 13:00:36.928463 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c4c944c57-cbgzt_5c779b4b-b368-4573-9502-17ea8fc60aac/console/0.log" Dec 05 13:00:36.928707 master-0 kubenswrapper[29936]: I1205 13:00:36.928599 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-5c4c944c57-cbgzt" event={"ID":"5c779b4b-b368-4573-9502-17ea8fc60aac","Type":"ContainerDied","Data":"e2fb0775164b3ff160de6c0f50b7f54076b8ce3a94bff44fa5df6930dfa34de1"} Dec 05 13:00:36.928771 master-0 kubenswrapper[29936]: I1205 13:00:36.928747 29936 scope.go:117] "RemoveContainer" containerID="2b70f89cbf491ef83a5dfaec4f3b7f047c49a3a50041133350f4e7c8b8132f0e" Dec 05 13:00:36.930580 master-0 kubenswrapper[29936]: I1205 13:00:36.928915 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c4c944c57-cbgzt" Dec 05 13:00:36.930580 master-0 kubenswrapper[29936]: I1205 13:00:36.929125 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-7jz5q" podUID="9bd37371-5c42-4675-8ce7-03bdf12c535c" containerName="registry-server" containerID="cri-o://54a30dd02c0117c107e462b215792358a59c0f3f3af5f3739eef32df4eff22ef" gracePeriod=2 Dec 05 13:00:36.978664 master-0 kubenswrapper[29936]: I1205 13:00:36.978358 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c4c944c57-cbgzt"] Dec 05 13:00:36.985981 master-0 kubenswrapper[29936]: I1205 13:00:36.985888 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c4c944c57-cbgzt"] Dec 05 13:00:37.209829 master-0 kubenswrapper[29936]: I1205 13:00:37.209758 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c779b4b-b368-4573-9502-17ea8fc60aac" path="/var/lib/kubelet/pods/5c779b4b-b368-4573-9502-17ea8fc60aac/volumes" Dec 05 13:00:37.347962 master-0 kubenswrapper[29936]: I1205 13:00:37.347896 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7jz5q" Dec 05 13:00:37.437144 master-0 kubenswrapper[29936]: I1205 13:00:37.437061 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw7sh\" (UniqueName: \"kubernetes.io/projected/9bd37371-5c42-4675-8ce7-03bdf12c535c-kube-api-access-cw7sh\") pod \"9bd37371-5c42-4675-8ce7-03bdf12c535c\" (UID: \"9bd37371-5c42-4675-8ce7-03bdf12c535c\") " Dec 05 13:00:37.444916 master-0 kubenswrapper[29936]: I1205 13:00:37.441858 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bd37371-5c42-4675-8ce7-03bdf12c535c-kube-api-access-cw7sh" (OuterVolumeSpecName: "kube-api-access-cw7sh") pod "9bd37371-5c42-4675-8ce7-03bdf12c535c" (UID: "9bd37371-5c42-4675-8ce7-03bdf12c535c"). InnerVolumeSpecName "kube-api-access-cw7sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:00:37.540803 master-0 kubenswrapper[29936]: I1205 13:00:37.540716 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw7sh\" (UniqueName: \"kubernetes.io/projected/9bd37371-5c42-4675-8ce7-03bdf12c535c-kube-api-access-cw7sh\") on node \"master-0\" DevicePath \"\"" Dec 05 13:00:37.940622 master-0 kubenswrapper[29936]: I1205 13:00:37.940542 29936 generic.go:334] "Generic (PLEG): container finished" podID="9bd37371-5c42-4675-8ce7-03bdf12c535c" containerID="54a30dd02c0117c107e462b215792358a59c0f3f3af5f3739eef32df4eff22ef" exitCode=0 Dec 05 13:00:37.941391 master-0 kubenswrapper[29936]: I1205 13:00:37.940628 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-7jz5q" Dec 05 13:00:37.941391 master-0 kubenswrapper[29936]: I1205 13:00:37.940641 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7jz5q" event={"ID":"9bd37371-5c42-4675-8ce7-03bdf12c535c","Type":"ContainerDied","Data":"54a30dd02c0117c107e462b215792358a59c0f3f3af5f3739eef32df4eff22ef"} Dec 05 13:00:37.941391 master-0 kubenswrapper[29936]: I1205 13:00:37.940828 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7jz5q" event={"ID":"9bd37371-5c42-4675-8ce7-03bdf12c535c","Type":"ContainerDied","Data":"060232cd072300655be32e1e18821cdaab539b9a54c219903d8e8ceab1830bcc"} Dec 05 13:00:37.941391 master-0 kubenswrapper[29936]: I1205 13:00:37.940879 29936 scope.go:117] "RemoveContainer" containerID="54a30dd02c0117c107e462b215792358a59c0f3f3af5f3739eef32df4eff22ef" Dec 05 13:00:37.943012 master-0 kubenswrapper[29936]: I1205 13:00:37.942970 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-scnjw" event={"ID":"b6724238-88e3-4a4d-98c9-b9344e1b9d0d","Type":"ContainerStarted","Data":"262b01c78f3d463893506db743da70afce6687bd3c50f15996b837d3bca66ec3"} Dec 05 13:00:37.961605 master-0 kubenswrapper[29936]: I1205 13:00:37.961549 29936 scope.go:117] "RemoveContainer" containerID="54a30dd02c0117c107e462b215792358a59c0f3f3af5f3739eef32df4eff22ef" Dec 05 13:00:37.962059 master-0 kubenswrapper[29936]: E1205 13:00:37.962014 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54a30dd02c0117c107e462b215792358a59c0f3f3af5f3739eef32df4eff22ef\": container with ID starting with 54a30dd02c0117c107e462b215792358a59c0f3f3af5f3739eef32df4eff22ef not found: ID does not exist" containerID="54a30dd02c0117c107e462b215792358a59c0f3f3af5f3739eef32df4eff22ef" Dec 05 13:00:37.962125 master-0 kubenswrapper[29936]: I1205 13:00:37.962060 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54a30dd02c0117c107e462b215792358a59c0f3f3af5f3739eef32df4eff22ef"} err="failed to get container status \"54a30dd02c0117c107e462b215792358a59c0f3f3af5f3739eef32df4eff22ef\": rpc error: code = NotFound desc = could not find container \"54a30dd02c0117c107e462b215792358a59c0f3f3af5f3739eef32df4eff22ef\": container with ID starting with 54a30dd02c0117c107e462b215792358a59c0f3f3af5f3739eef32df4eff22ef not found: ID does not exist" Dec 05 13:00:37.975872 master-0 kubenswrapper[29936]: I1205 13:00:37.975721 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-scnjw" podStartSLOduration=2.41047312 podStartE2EDuration="2.975668086s" podCreationTimestamp="2025-12-05 13:00:35 +0000 UTC" firstStartedPulling="2025-12-05 13:00:36.855469263 +0000 UTC m=+633.987548944" lastFinishedPulling="2025-12-05 13:00:37.420664229 +0000 UTC m=+634.552743910" observedRunningTime="2025-12-05 13:00:37.966465814 +0000 UTC m=+635.098545515" watchObservedRunningTime="2025-12-05 13:00:37.975668086 +0000 UTC m=+635.107747787" Dec 05 13:00:38.003708 master-0 kubenswrapper[29936]: I1205 13:00:38.003635 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7jz5q"] Dec 05 13:00:38.013658 master-0 kubenswrapper[29936]: I1205 13:00:38.013575 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-operators/openstack-operator-index-7jz5q"] Dec 05 13:00:39.205139 master-0 kubenswrapper[29936]: I1205 13:00:39.205038 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bd37371-5c42-4675-8ce7-03bdf12c535c" path="/var/lib/kubelet/pods/9bd37371-5c42-4675-8ce7-03bdf12c535c/volumes" Dec 05 13:00:46.349209 master-0 kubenswrapper[29936]: I1205 13:00:46.349119 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-scnjw" Dec 05 13:00:46.349879 master-0 kubenswrapper[29936]: I1205 13:00:46.349257 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-scnjw" Dec 05 13:00:46.396008 master-0 kubenswrapper[29936]: I1205 13:00:46.395928 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-scnjw" Dec 05 13:00:47.060301 master-0 kubenswrapper[29936]: I1205 13:00:47.060217 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-scnjw" Dec 05 13:00:53.396601 master-0 kubenswrapper[29936]: I1205 13:00:53.396518 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb"] Dec 05 13:00:53.397363 master-0 kubenswrapper[29936]: E1205 13:00:53.396978 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd37371-5c42-4675-8ce7-03bdf12c535c" containerName="registry-server" Dec 05 13:00:53.397363 master-0 kubenswrapper[29936]: I1205 13:00:53.396994 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd37371-5c42-4675-8ce7-03bdf12c535c" containerName="registry-server" Dec 05 13:00:53.397363 master-0 kubenswrapper[29936]: E1205 13:00:53.397023 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c779b4b-b368-4573-9502-17ea8fc60aac" containerName="console" Dec 05 13:00:53.397363 master-0 kubenswrapper[29936]: I1205 13:00:53.397030 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c779b4b-b368-4573-9502-17ea8fc60aac" containerName="console" Dec 05 13:00:53.397363 master-0 kubenswrapper[29936]: I1205 13:00:53.397227 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c779b4b-b368-4573-9502-17ea8fc60aac" containerName="console" Dec 05 13:00:53.397363 master-0 kubenswrapper[29936]: I1205 13:00:53.397250 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd37371-5c42-4675-8ce7-03bdf12c535c" containerName="registry-server" Dec 05 13:00:53.398548 master-0 kubenswrapper[29936]: I1205 13:00:53.398516 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" Dec 05 13:00:53.408729 master-0 kubenswrapper[29936]: I1205 13:00:53.408657 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb"] Dec 05 13:00:53.495598 master-0 kubenswrapper[29936]: I1205 13:00:53.495485 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb\" (UID: \"149a37ba-8d0a-4ebb-80b0-7d97c13412ad\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" Dec 05 13:00:53.495861 master-0 kubenswrapper[29936]: I1205 13:00:53.495713 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdvs4\" (UniqueName: \"kubernetes.io/projected/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-kube-api-access-mdvs4\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb\" (UID: \"149a37ba-8d0a-4ebb-80b0-7d97c13412ad\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" Dec 05 13:00:53.495861 master-0 kubenswrapper[29936]: I1205 13:00:53.495778 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb\" (UID: \"149a37ba-8d0a-4ebb-80b0-7d97c13412ad\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" Dec 05 13:00:53.598272 master-0 kubenswrapper[29936]: I1205 13:00:53.598172 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdvs4\" (UniqueName: \"kubernetes.io/projected/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-kube-api-access-mdvs4\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb\" (UID: \"149a37ba-8d0a-4ebb-80b0-7d97c13412ad\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" Dec 05 13:00:53.598272 master-0 kubenswrapper[29936]: I1205 13:00:53.598258 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb\" (UID: \"149a37ba-8d0a-4ebb-80b0-7d97c13412ad\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" Dec 05 13:00:53.598707 master-0 kubenswrapper[29936]: I1205 13:00:53.598361 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-util\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb\" (UID: \"149a37ba-8d0a-4ebb-80b0-7d97c13412ad\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" Dec 05 13:00:53.599004 master-0 kubenswrapper[29936]: I1205 13:00:53.598981 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-util\") pod 
\"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb\" (UID: \"149a37ba-8d0a-4ebb-80b0-7d97c13412ad\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" Dec 05 13:00:53.599061 master-0 kubenswrapper[29936]: I1205 13:00:53.599012 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-bundle\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb\" (UID: \"149a37ba-8d0a-4ebb-80b0-7d97c13412ad\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" Dec 05 13:00:53.620211 master-0 kubenswrapper[29936]: I1205 13:00:53.620125 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdvs4\" (UniqueName: \"kubernetes.io/projected/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-kube-api-access-mdvs4\") pod \"917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb\" (UID: \"149a37ba-8d0a-4ebb-80b0-7d97c13412ad\") " pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" Dec 05 13:00:53.722788 master-0 kubenswrapper[29936]: I1205 13:00:53.722684 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" Dec 05 13:00:54.220926 master-0 kubenswrapper[29936]: I1205 13:00:54.220563 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb"] Dec 05 13:00:55.127955 master-0 kubenswrapper[29936]: I1205 13:00:55.126085 29936 generic.go:334] "Generic (PLEG): container finished" podID="149a37ba-8d0a-4ebb-80b0-7d97c13412ad" containerID="1554aa71a411cb6033097158a0fa8263b3dab733bd38f8bad6fa4b98e0726b1c" exitCode=0 Dec 05 13:00:55.127955 master-0 kubenswrapper[29936]: I1205 13:00:55.126190 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" event={"ID":"149a37ba-8d0a-4ebb-80b0-7d97c13412ad","Type":"ContainerDied","Data":"1554aa71a411cb6033097158a0fa8263b3dab733bd38f8bad6fa4b98e0726b1c"} Dec 05 13:00:55.127955 master-0 kubenswrapper[29936]: I1205 13:00:55.126238 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" event={"ID":"149a37ba-8d0a-4ebb-80b0-7d97c13412ad","Type":"ContainerStarted","Data":"858533d4609c30ef43d9f476dac9a5b0fd1bd73dd986c0197f967cdf06bd7e3d"} Dec 05 13:00:56.141209 master-0 kubenswrapper[29936]: I1205 13:00:56.140949 29936 generic.go:334] "Generic (PLEG): container finished" podID="149a37ba-8d0a-4ebb-80b0-7d97c13412ad" containerID="8b6e3104b701eb8a5254d06697f66ccace8b48b4fd8f33535dad76e9bc00d7b2" exitCode=0 Dec 05 13:00:56.141209 master-0 kubenswrapper[29936]: I1205 13:00:56.141028 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" event={"ID":"149a37ba-8d0a-4ebb-80b0-7d97c13412ad","Type":"ContainerDied","Data":"8b6e3104b701eb8a5254d06697f66ccace8b48b4fd8f33535dad76e9bc00d7b2"} Dec 05 13:00:57.157066 master-0 kubenswrapper[29936]: I1205 13:00:57.156970 29936 generic.go:334] "Generic (PLEG): container finished" podID="149a37ba-8d0a-4ebb-80b0-7d97c13412ad" containerID="1f809b568e1afc95a55f441df11f8d911ce5e551a354e85f4dd030c45ffde08e" exitCode=0 Dec 05 
13:00:57.157066 master-0 kubenswrapper[29936]: I1205 13:00:57.157045 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" event={"ID":"149a37ba-8d0a-4ebb-80b0-7d97c13412ad","Type":"ContainerDied","Data":"1f809b568e1afc95a55f441df11f8d911ce5e551a354e85f4dd030c45ffde08e"} Dec 05 13:00:58.552956 master-0 kubenswrapper[29936]: I1205 13:00:58.552875 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" Dec 05 13:00:58.623555 master-0 kubenswrapper[29936]: I1205 13:00:58.623428 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-util\") pod \"149a37ba-8d0a-4ebb-80b0-7d97c13412ad\" (UID: \"149a37ba-8d0a-4ebb-80b0-7d97c13412ad\") " Dec 05 13:00:58.623899 master-0 kubenswrapper[29936]: I1205 13:00:58.623709 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-bundle\") pod \"149a37ba-8d0a-4ebb-80b0-7d97c13412ad\" (UID: \"149a37ba-8d0a-4ebb-80b0-7d97c13412ad\") " Dec 05 13:00:58.623899 master-0 kubenswrapper[29936]: I1205 13:00:58.623797 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdvs4\" (UniqueName: \"kubernetes.io/projected/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-kube-api-access-mdvs4\") pod \"149a37ba-8d0a-4ebb-80b0-7d97c13412ad\" (UID: \"149a37ba-8d0a-4ebb-80b0-7d97c13412ad\") " Dec 05 13:00:58.625755 master-0 kubenswrapper[29936]: I1205 13:00:58.625639 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-bundle" (OuterVolumeSpecName: "bundle") pod "149a37ba-8d0a-4ebb-80b0-7d97c13412ad" (UID: "149a37ba-8d0a-4ebb-80b0-7d97c13412ad"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:00:58.626458 master-0 kubenswrapper[29936]: I1205 13:00:58.626415 29936 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:00:58.627481 master-0 kubenswrapper[29936]: I1205 13:00:58.627449 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-kube-api-access-mdvs4" (OuterVolumeSpecName: "kube-api-access-mdvs4") pod "149a37ba-8d0a-4ebb-80b0-7d97c13412ad" (UID: "149a37ba-8d0a-4ebb-80b0-7d97c13412ad"). InnerVolumeSpecName "kube-api-access-mdvs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:00:58.647525 master-0 kubenswrapper[29936]: I1205 13:00:58.647397 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-util" (OuterVolumeSpecName: "util") pod "149a37ba-8d0a-4ebb-80b0-7d97c13412ad" (UID: "149a37ba-8d0a-4ebb-80b0-7d97c13412ad"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:00:58.728625 master-0 kubenswrapper[29936]: I1205 13:00:58.728523 29936 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-util\") on node \"master-0\" DevicePath \"\"" Dec 05 13:00:58.728625 master-0 kubenswrapper[29936]: I1205 13:00:58.728589 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdvs4\" (UniqueName: \"kubernetes.io/projected/149a37ba-8d0a-4ebb-80b0-7d97c13412ad-kube-api-access-mdvs4\") on node \"master-0\" DevicePath \"\"" Dec 05 13:00:59.187084 master-0 kubenswrapper[29936]: I1205 13:00:59.186987 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" Dec 05 13:00:59.197761 master-0 kubenswrapper[29936]: I1205 13:00:59.197689 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/917aae072417a6c2fc5ddd97ca05bfedb9fc1cad89a3b1c4d989b78eafkbmvb" event={"ID":"149a37ba-8d0a-4ebb-80b0-7d97c13412ad","Type":"ContainerDied","Data":"858533d4609c30ef43d9f476dac9a5b0fd1bd73dd986c0197f967cdf06bd7e3d"} Dec 05 13:00:59.197761 master-0 kubenswrapper[29936]: I1205 13:00:59.197748 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="858533d4609c30ef43d9f476dac9a5b0fd1bd73dd986c0197f967cdf06bd7e3d" Dec 05 13:01:06.065720 master-0 kubenswrapper[29936]: I1205 13:01:06.065627 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m"] Dec 05 13:01:06.066752 master-0 kubenswrapper[29936]: E1205 13:01:06.066261 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149a37ba-8d0a-4ebb-80b0-7d97c13412ad" containerName="util" Dec 05 13:01:06.066752 master-0 kubenswrapper[29936]: I1205 13:01:06.066283 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="149a37ba-8d0a-4ebb-80b0-7d97c13412ad" containerName="util" Dec 05 13:01:06.066752 master-0 kubenswrapper[29936]: E1205 13:01:06.066308 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149a37ba-8d0a-4ebb-80b0-7d97c13412ad" containerName="extract" Dec 05 13:01:06.066752 master-0 kubenswrapper[29936]: I1205 13:01:06.066316 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="149a37ba-8d0a-4ebb-80b0-7d97c13412ad" containerName="extract" Dec 05 13:01:06.066752 master-0 kubenswrapper[29936]: E1205 13:01:06.066350 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149a37ba-8d0a-4ebb-80b0-7d97c13412ad" containerName="pull" Dec 05 13:01:06.066752 master-0 kubenswrapper[29936]: I1205 13:01:06.066359 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="149a37ba-8d0a-4ebb-80b0-7d97c13412ad" containerName="pull" Dec 05 13:01:06.066752 master-0 kubenswrapper[29936]: I1205 13:01:06.066607 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="149a37ba-8d0a-4ebb-80b0-7d97c13412ad" containerName="extract" Dec 05 13:01:06.068660 master-0 kubenswrapper[29936]: I1205 13:01:06.067440 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m" Dec 05 13:01:06.134219 master-0 kubenswrapper[29936]: I1205 13:01:06.126144 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m"] Dec 05 13:01:06.179213 master-0 kubenswrapper[29936]: I1205 13:01:06.178714 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84zhb\" (UniqueName: \"kubernetes.io/projected/217c5772-7464-4ea6-b289-6111bf084d30-kube-api-access-84zhb\") pod \"openstack-operator-controller-operator-55b6fb9447-wjd5m\" (UID: \"217c5772-7464-4ea6-b289-6111bf084d30\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m" Dec 05 13:01:06.283568 master-0 kubenswrapper[29936]: I1205 13:01:06.281138 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84zhb\" (UniqueName: \"kubernetes.io/projected/217c5772-7464-4ea6-b289-6111bf084d30-kube-api-access-84zhb\") pod \"openstack-operator-controller-operator-55b6fb9447-wjd5m\" (UID: \"217c5772-7464-4ea6-b289-6111bf084d30\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m" Dec 05 13:01:06.558818 master-0 kubenswrapper[29936]: I1205 13:01:06.558753 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84zhb\" (UniqueName: \"kubernetes.io/projected/217c5772-7464-4ea6-b289-6111bf084d30-kube-api-access-84zhb\") pod \"openstack-operator-controller-operator-55b6fb9447-wjd5m\" (UID: \"217c5772-7464-4ea6-b289-6111bf084d30\") " pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m" Dec 05 13:01:06.689502 master-0 kubenswrapper[29936]: I1205 13:01:06.689427 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m" Dec 05 13:01:07.147861 master-0 kubenswrapper[29936]: I1205 13:01:07.147792 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m"] Dec 05 13:01:07.158085 master-0 kubenswrapper[29936]: W1205 13:01:07.157983 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod217c5772_7464_4ea6_b289_6111bf084d30.slice/crio-e227d5430384c15a7b144997dcc58ffc571516c5b111de52b053134ad43dd9ba WatchSource:0}: Error finding container e227d5430384c15a7b144997dcc58ffc571516c5b111de52b053134ad43dd9ba: Status 404 returned error can't find the container with id e227d5430384c15a7b144997dcc58ffc571516c5b111de52b053134ad43dd9ba Dec 05 13:01:07.271421 master-0 kubenswrapper[29936]: I1205 13:01:07.271333 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m" event={"ID":"217c5772-7464-4ea6-b289-6111bf084d30","Type":"ContainerStarted","Data":"e227d5430384c15a7b144997dcc58ffc571516c5b111de52b053134ad43dd9ba"} Dec 05 13:01:13.379150 master-0 kubenswrapper[29936]: I1205 13:01:13.379060 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m" event={"ID":"217c5772-7464-4ea6-b289-6111bf084d30","Type":"ContainerStarted","Data":"ecedf441cfc17d08b448a619937810a104d29c06f8cffccfc52d974671910c19"} Dec 05 13:01:13.379843 master-0 kubenswrapper[29936]: I1205 13:01:13.379441 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m" Dec 05 13:01:13.452205 master-0 kubenswrapper[29936]: I1205 13:01:13.449984 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m" podStartSLOduration=2.284068189 podStartE2EDuration="7.449954203s" podCreationTimestamp="2025-12-05 13:01:06 +0000 UTC" firstStartedPulling="2025-12-05 13:01:07.160947737 +0000 UTC m=+664.293027428" lastFinishedPulling="2025-12-05 13:01:12.326833761 +0000 UTC m=+669.458913442" observedRunningTime="2025-12-05 13:01:13.43002136 +0000 UTC m=+670.562101041" watchObservedRunningTime="2025-12-05 13:01:13.449954203 +0000 UTC m=+670.582033894" Dec 05 13:01:26.694626 master-0 kubenswrapper[29936]: I1205 13:01:26.693984 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m" Dec 05 13:01:29.056402 master-0 kubenswrapper[29936]: I1205 13:01:29.056304 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-589d7b4556-5m5lt"] Dec 05 13:01:29.057549 master-0 kubenswrapper[29936]: I1205 13:01:29.057521 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-5m5lt" Dec 05 13:01:29.167213 master-0 kubenswrapper[29936]: I1205 13:01:29.153097 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-589d7b4556-5m5lt"] Dec 05 13:01:29.216221 master-0 kubenswrapper[29936]: I1205 13:01:29.215733 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvlrr\" (UniqueName: \"kubernetes.io/projected/065a5c8a-03af-481b-a078-f9982d510c48-kube-api-access-wvlrr\") pod \"openstack-operator-controller-operator-589d7b4556-5m5lt\" (UID: \"065a5c8a-03af-481b-a078-f9982d510c48\") " pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-5m5lt" Dec 05 13:01:29.319318 master-0 kubenswrapper[29936]: I1205 13:01:29.318422 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvlrr\" (UniqueName: \"kubernetes.io/projected/065a5c8a-03af-481b-a078-f9982d510c48-kube-api-access-wvlrr\") pod \"openstack-operator-controller-operator-589d7b4556-5m5lt\" (UID: \"065a5c8a-03af-481b-a078-f9982d510c48\") " pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-5m5lt" Dec 05 13:01:29.361105 master-0 kubenswrapper[29936]: I1205 13:01:29.361039 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvlrr\" (UniqueName: \"kubernetes.io/projected/065a5c8a-03af-481b-a078-f9982d510c48-kube-api-access-wvlrr\") pod \"openstack-operator-controller-operator-589d7b4556-5m5lt\" (UID: \"065a5c8a-03af-481b-a078-f9982d510c48\") " pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-5m5lt" Dec 05 13:01:29.377020 master-0 kubenswrapper[29936]: I1205 13:01:29.376968 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-5m5lt" Dec 05 13:01:29.875721 master-0 kubenswrapper[29936]: I1205 13:01:29.875663 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-589d7b4556-5m5lt"] Dec 05 13:01:30.544427 master-0 kubenswrapper[29936]: I1205 13:01:30.544321 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-5m5lt" event={"ID":"065a5c8a-03af-481b-a078-f9982d510c48","Type":"ContainerStarted","Data":"1600a5f8121f59f5bb43783dc8d79659e5b52acaaebe923084e72421d053cdfc"} Dec 05 13:01:30.545041 master-0 kubenswrapper[29936]: I1205 13:01:30.544452 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-5m5lt" event={"ID":"065a5c8a-03af-481b-a078-f9982d510c48","Type":"ContainerStarted","Data":"22c598d8923ff9e174d01644b66eeebede30c75d5be9770f1358e1d27bcf5a35"} Dec 05 13:01:30.545041 master-0 kubenswrapper[29936]: I1205 13:01:30.544522 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-5m5lt" Dec 05 13:01:30.587463 master-0 kubenswrapper[29936]: I1205 13:01:30.587338 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-5m5lt" podStartSLOduration=1.587312114 podStartE2EDuration="1.587312114s" podCreationTimestamp="2025-12-05 13:01:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:01:30.578151874 +0000 UTC m=+687.710231605" watchObservedRunningTime="2025-12-05 13:01:30.587312114 +0000 UTC m=+687.719391795" Dec 05 13:01:39.381223 master-0 kubenswrapper[29936]: I1205 13:01:39.381119 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-589d7b4556-5m5lt" Dec 05 13:01:39.562042 master-0 kubenswrapper[29936]: I1205 13:01:39.561948 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m"] Dec 05 13:01:39.562485 master-0 kubenswrapper[29936]: I1205 13:01:39.562259 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m" podUID="217c5772-7464-4ea6-b289-6111bf084d30" containerName="operator" containerID="cri-o://ecedf441cfc17d08b448a619937810a104d29c06f8cffccfc52d974671910c19" gracePeriod=10 Dec 05 13:01:40.070305 master-0 kubenswrapper[29936]: I1205 13:01:40.066908 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m" Dec 05 13:01:40.096697 master-0 kubenswrapper[29936]: I1205 13:01:40.096621 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84zhb\" (UniqueName: \"kubernetes.io/projected/217c5772-7464-4ea6-b289-6111bf084d30-kube-api-access-84zhb\") pod \"217c5772-7464-4ea6-b289-6111bf084d30\" (UID: \"217c5772-7464-4ea6-b289-6111bf084d30\") " Dec 05 13:01:40.102894 master-0 kubenswrapper[29936]: I1205 13:01:40.102797 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/217c5772-7464-4ea6-b289-6111bf084d30-kube-api-access-84zhb" (OuterVolumeSpecName: "kube-api-access-84zhb") pod "217c5772-7464-4ea6-b289-6111bf084d30" (UID: "217c5772-7464-4ea6-b289-6111bf084d30"). InnerVolumeSpecName "kube-api-access-84zhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:01:40.198964 master-0 kubenswrapper[29936]: I1205 13:01:40.198866 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84zhb\" (UniqueName: \"kubernetes.io/projected/217c5772-7464-4ea6-b289-6111bf084d30-kube-api-access-84zhb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:01:40.673844 master-0 kubenswrapper[29936]: I1205 13:01:40.673741 29936 generic.go:334] "Generic (PLEG): container finished" podID="217c5772-7464-4ea6-b289-6111bf084d30" containerID="ecedf441cfc17d08b448a619937810a104d29c06f8cffccfc52d974671910c19" exitCode=0 Dec 05 13:01:40.673844 master-0 kubenswrapper[29936]: I1205 13:01:40.673834 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m" event={"ID":"217c5772-7464-4ea6-b289-6111bf084d30","Type":"ContainerDied","Data":"ecedf441cfc17d08b448a619937810a104d29c06f8cffccfc52d974671910c19"} Dec 05 13:01:40.674694 master-0 kubenswrapper[29936]: I1205 13:01:40.673882 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m" event={"ID":"217c5772-7464-4ea6-b289-6111bf084d30","Type":"ContainerDied","Data":"e227d5430384c15a7b144997dcc58ffc571516c5b111de52b053134ad43dd9ba"} Dec 05 13:01:40.674694 master-0 kubenswrapper[29936]: I1205 13:01:40.673908 29936 scope.go:117] "RemoveContainer" containerID="ecedf441cfc17d08b448a619937810a104d29c06f8cffccfc52d974671910c19" Dec 05 13:01:40.674694 master-0 kubenswrapper[29936]: I1205 13:01:40.674058 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m" Dec 05 13:01:40.707550 master-0 kubenswrapper[29936]: I1205 13:01:40.707479 29936 scope.go:117] "RemoveContainer" containerID="ecedf441cfc17d08b448a619937810a104d29c06f8cffccfc52d974671910c19" Dec 05 13:01:40.708295 master-0 kubenswrapper[29936]: E1205 13:01:40.708239 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecedf441cfc17d08b448a619937810a104d29c06f8cffccfc52d974671910c19\": container with ID starting with ecedf441cfc17d08b448a619937810a104d29c06f8cffccfc52d974671910c19 not found: ID does not exist" containerID="ecedf441cfc17d08b448a619937810a104d29c06f8cffccfc52d974671910c19" Dec 05 13:01:40.708375 master-0 kubenswrapper[29936]: I1205 13:01:40.708303 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecedf441cfc17d08b448a619937810a104d29c06f8cffccfc52d974671910c19"} err="failed to get container status \"ecedf441cfc17d08b448a619937810a104d29c06f8cffccfc52d974671910c19\": rpc error: code = NotFound desc = could not find container \"ecedf441cfc17d08b448a619937810a104d29c06f8cffccfc52d974671910c19\": container with ID starting with ecedf441cfc17d08b448a619937810a104d29c06f8cffccfc52d974671910c19 not found: ID does not exist" Dec 05 13:01:40.726147 master-0 kubenswrapper[29936]: I1205 13:01:40.726056 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m"] Dec 05 13:01:40.738264 master-0 kubenswrapper[29936]: I1205 13:01:40.738157 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-55b6fb9447-wjd5m"] Dec 05 13:01:41.200323 master-0 kubenswrapper[29936]: I1205 13:01:41.200235 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="217c5772-7464-4ea6-b289-6111bf084d30" path="/var/lib/kubelet/pods/217c5772-7464-4ea6-b289-6111bf084d30/volumes" Dec 05 13:02:35.136214 master-0 kubenswrapper[29936]: I1205 13:02:35.135271 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cd89994b5-z5xtp"] Dec 05 13:02:35.136214 master-0 kubenswrapper[29936]: E1205 13:02:35.135965 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217c5772-7464-4ea6-b289-6111bf084d30" containerName="operator" Dec 05 13:02:35.136214 master-0 kubenswrapper[29936]: I1205 13:02:35.135997 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="217c5772-7464-4ea6-b289-6111bf084d30" containerName="operator" Dec 05 13:02:35.137135 master-0 kubenswrapper[29936]: I1205 13:02:35.136306 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="217c5772-7464-4ea6-b289-6111bf084d30" containerName="operator" Dec 05 13:02:35.145118 master-0 kubenswrapper[29936]: I1205 13:02:35.137815 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-z5xtp" Dec 05 13:02:35.153660 master-0 kubenswrapper[29936]: I1205 13:02:35.153459 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f8856dd79-52767"] Dec 05 13:02:35.157296 master-0 kubenswrapper[29936]: I1205 13:02:35.155389 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-52767" Dec 05 13:02:35.157296 master-0 kubenswrapper[29936]: I1205 13:02:35.157053 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25bn2\" (UniqueName: \"kubernetes.io/projected/c19b3073-31ec-46d5-9534-61bedaf6de73-kube-api-access-25bn2\") pod \"barbican-operator-controller-manager-5cd89994b5-z5xtp\" (UID: \"c19b3073-31ec-46d5-9534-61bedaf6de73\") " pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-z5xtp" Dec 05 13:02:35.233595 master-0 kubenswrapper[29936]: I1205 13:02:35.233513 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cd89994b5-z5xtp"] Dec 05 13:02:35.233595 master-0 kubenswrapper[29936]: I1205 13:02:35.233595 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb"] Dec 05 13:02:35.237855 master-0 kubenswrapper[29936]: I1205 13:02:35.237724 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb" Dec 05 13:02:35.253750 master-0 kubenswrapper[29936]: I1205 13:02:35.240887 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f8856dd79-52767"] Dec 05 13:02:35.261213 master-0 kubenswrapper[29936]: I1205 13:02:35.258649 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k64q4\" (UniqueName: \"kubernetes.io/projected/72464fc8-9970-443f-98ac-1c40ada39742-kube-api-access-k64q4\") pod \"cinder-operator-controller-manager-f8856dd79-52767\" (UID: \"72464fc8-9970-443f-98ac-1c40ada39742\") " pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-52767" Dec 05 13:02:35.261213 master-0 kubenswrapper[29936]: I1205 13:02:35.258780 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25bn2\" (UniqueName: \"kubernetes.io/projected/c19b3073-31ec-46d5-9534-61bedaf6de73-kube-api-access-25bn2\") pod \"barbican-operator-controller-manager-5cd89994b5-z5xtp\" (UID: \"c19b3073-31ec-46d5-9534-61bedaf6de73\") " pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-z5xtp" Dec 05 13:02:35.261213 master-0 kubenswrapper[29936]: I1205 13:02:35.258947 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bsfj\" (UniqueName: \"kubernetes.io/projected/7c7c4bb7-eb69-4370-8539-34ba8c63383e-kube-api-access-4bsfj\") pod \"designate-operator-controller-manager-84bc9f68f5-sgjwb\" (UID: \"7c7c4bb7-eb69-4370-8539-34ba8c63383e\") " pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb" Dec 05 13:02:35.261213 master-0 kubenswrapper[29936]: I1205 13:02:35.260259 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl"] Dec 05 13:02:35.265250 master-0 kubenswrapper[29936]: I1205 13:02:35.263422 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl" Dec 05 13:02:35.291138 master-0 kubenswrapper[29936]: I1205 13:02:35.291082 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb"] Dec 05 13:02:35.316201 master-0 kubenswrapper[29936]: I1205 13:02:35.310913 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25bn2\" (UniqueName: \"kubernetes.io/projected/c19b3073-31ec-46d5-9534-61bedaf6de73-kube-api-access-25bn2\") pod \"barbican-operator-controller-manager-5cd89994b5-z5xtp\" (UID: \"c19b3073-31ec-46d5-9534-61bedaf6de73\") " pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-z5xtp" Dec 05 13:02:35.354618 master-0 kubenswrapper[29936]: I1205 13:02:35.354408 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl"] Dec 05 13:02:35.360684 master-0 kubenswrapper[29936]: I1205 13:02:35.360407 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k64q4\" (UniqueName: \"kubernetes.io/projected/72464fc8-9970-443f-98ac-1c40ada39742-kube-api-access-k64q4\") pod \"cinder-operator-controller-manager-f8856dd79-52767\" (UID: \"72464fc8-9970-443f-98ac-1c40ada39742\") " pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-52767" Dec 05 13:02:35.360684 master-0 kubenswrapper[29936]: I1205 13:02:35.360572 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrq8r\" (UniqueName: \"kubernetes.io/projected/cd1c763d-2aee-4791-a9ea-87299af8efc4-kube-api-access-xrq8r\") pod \"glance-operator-controller-manager-78cd4f7769-wdgwl\" (UID: \"cd1c763d-2aee-4791-a9ea-87299af8efc4\") " pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl" Dec 05 13:02:35.360684 master-0 kubenswrapper[29936]: I1205 13:02:35.360637 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bsfj\" (UniqueName: \"kubernetes.io/projected/7c7c4bb7-eb69-4370-8539-34ba8c63383e-kube-api-access-4bsfj\") pod \"designate-operator-controller-manager-84bc9f68f5-sgjwb\" (UID: \"7c7c4bb7-eb69-4370-8539-34ba8c63383e\") " pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb" Dec 05 13:02:35.389663 master-0 kubenswrapper[29936]: I1205 13:02:35.389489 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-7fd96594c7-bmxm9"] Dec 05 13:02:35.391493 master-0 kubenswrapper[29936]: I1205 13:02:35.391436 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bmxm9" Dec 05 13:02:35.400309 master-0 kubenswrapper[29936]: I1205 13:02:35.396318 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k64q4\" (UniqueName: \"kubernetes.io/projected/72464fc8-9970-443f-98ac-1c40ada39742-kube-api-access-k64q4\") pod \"cinder-operator-controller-manager-f8856dd79-52767\" (UID: \"72464fc8-9970-443f-98ac-1c40ada39742\") " pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-52767" Dec 05 13:02:35.403006 master-0 kubenswrapper[29936]: I1205 13:02:35.400936 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bsfj\" (UniqueName: \"kubernetes.io/projected/7c7c4bb7-eb69-4370-8539-34ba8c63383e-kube-api-access-4bsfj\") pod \"designate-operator-controller-manager-84bc9f68f5-sgjwb\" (UID: \"7c7c4bb7-eb69-4370-8539-34ba8c63383e\") " pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb" Dec 05 13:02:35.423477 master-0 kubenswrapper[29936]: I1205 13:02:35.422848 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-f6cc97788-vhbn6"] Dec 05 13:02:35.436783 master-0 kubenswrapper[29936]: I1205 13:02:35.434744 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-vhbn6" Dec 05 13:02:35.437805 master-0 kubenswrapper[29936]: I1205 13:02:35.437706 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-7fd96594c7-bmxm9"] Dec 05 13:02:35.461466 master-0 kubenswrapper[29936]: I1205 13:02:35.459273 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-f6cc97788-vhbn6"] Dec 05 13:02:35.471215 master-0 kubenswrapper[29936]: I1205 13:02:35.470196 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkqxt\" (UniqueName: \"kubernetes.io/projected/77fe28e2-671e-4a0d-b065-0746abf83306-kube-api-access-zkqxt\") pod \"horizon-operator-controller-manager-f6cc97788-vhbn6\" (UID: \"77fe28e2-671e-4a0d-b065-0746abf83306\") " pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-vhbn6" Dec 05 13:02:35.471215 master-0 kubenswrapper[29936]: I1205 13:02:35.470281 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrq8r\" (UniqueName: \"kubernetes.io/projected/cd1c763d-2aee-4791-a9ea-87299af8efc4-kube-api-access-xrq8r\") pod \"glance-operator-controller-manager-78cd4f7769-wdgwl\" (UID: \"cd1c763d-2aee-4791-a9ea-87299af8efc4\") " pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl" Dec 05 13:02:35.482125 master-0 kubenswrapper[29936]: I1205 13:02:35.482065 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt"] Dec 05 13:02:35.485755 master-0 kubenswrapper[29936]: I1205 13:02:35.485686 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" Dec 05 13:02:35.522362 master-0 kubenswrapper[29936]: I1205 13:02:35.521561 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 05 13:02:35.533377 master-0 kubenswrapper[29936]: I1205 13:02:35.533308 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrq8r\" (UniqueName: \"kubernetes.io/projected/cd1c763d-2aee-4791-a9ea-87299af8efc4-kube-api-access-xrq8r\") pod \"glance-operator-controller-manager-78cd4f7769-wdgwl\" (UID: \"cd1c763d-2aee-4791-a9ea-87299af8efc4\") " pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl" Dec 05 13:02:35.538532 master-0 kubenswrapper[29936]: I1205 13:02:35.538370 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-z5xtp" Dec 05 13:02:35.599155 master-0 kubenswrapper[29936]: I1205 13:02:35.546835 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt"] Dec 05 13:02:35.599155 master-0 kubenswrapper[29936]: I1205 13:02:35.558028 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-52767" Dec 05 13:02:35.599155 master-0 kubenswrapper[29936]: I1205 13:02:35.566771 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95"] Dec 05 13:02:35.599155 master-0 kubenswrapper[29936]: I1205 13:02:35.575294 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" Dec 05 13:02:35.599155 master-0 kubenswrapper[29936]: I1205 13:02:35.579201 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb" Dec 05 13:02:35.603814 master-0 kubenswrapper[29936]: I1205 13:02:35.603746 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkqxt\" (UniqueName: \"kubernetes.io/projected/77fe28e2-671e-4a0d-b065-0746abf83306-kube-api-access-zkqxt\") pod \"horizon-operator-controller-manager-f6cc97788-vhbn6\" (UID: \"77fe28e2-671e-4a0d-b065-0746abf83306\") " pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-vhbn6" Dec 05 13:02:35.604065 master-0 kubenswrapper[29936]: I1205 13:02:35.604001 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgvfr\" (UniqueName: \"kubernetes.io/projected/11364ea4-cb4b-4398-9692-aa5dd12f8a9f-kube-api-access-mgvfr\") pod \"heat-operator-controller-manager-7fd96594c7-bmxm9\" (UID: \"11364ea4-cb4b-4398-9692-aa5dd12f8a9f\") " pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bmxm9" Dec 05 13:02:35.604163 master-0 kubenswrapper[29936]: I1205 13:02:35.604075 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9426\" (UniqueName: \"kubernetes.io/projected/ee9e12a6-a899-4e44-b4e1-d975493b6b9c-kube-api-access-k9426\") pod \"ironic-operator-controller-manager-7c9bfd6967-c6t95\" (UID: \"ee9e12a6-a899-4e44-b4e1-d975493b6b9c\") " pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" Dec 05 13:02:35.604339 master-0 kubenswrapper[29936]: I1205 13:02:35.604257 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt27g\" (UniqueName: \"kubernetes.io/projected/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-kube-api-access-nt27g\") pod \"infra-operator-controller-manager-7d9c9d7fd8-rprkt\" (UID: \"e64b46f2-d1e1-462f-952d-a7a7f8c663e9\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" Dec 05 13:02:35.604339 master-0 kubenswrapper[29936]: I1205 13:02:35.604323 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-rprkt\" (UID: \"e64b46f2-d1e1-462f-952d-a7a7f8c663e9\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" Dec 05 13:02:35.621358 master-0 kubenswrapper[29936]: I1205 13:02:35.621270 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95"] Dec 05 13:02:35.622537 master-0 kubenswrapper[29936]: I1205 13:02:35.622028 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl" Dec 05 13:02:35.635801 master-0 kubenswrapper[29936]: I1205 13:02:35.632961 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd"] Dec 05 13:02:35.635801 master-0 kubenswrapper[29936]: I1205 13:02:35.634895 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd" Dec 05 13:02:35.679296 master-0 kubenswrapper[29936]: I1205 13:02:35.677565 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-56f9fbf74b-sxfv7"] Dec 05 13:02:35.697535 master-0 kubenswrapper[29936]: I1205 13:02:35.688562 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkqxt\" (UniqueName: \"kubernetes.io/projected/77fe28e2-671e-4a0d-b065-0746abf83306-kube-api-access-zkqxt\") pod \"horizon-operator-controller-manager-f6cc97788-vhbn6\" (UID: \"77fe28e2-671e-4a0d-b065-0746abf83306\") " pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-vhbn6" Dec 05 13:02:35.700370 master-0 kubenswrapper[29936]: I1205 13:02:35.699080 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-sxfv7" Dec 05 13:02:35.771224 master-0 kubenswrapper[29936]: I1205 13:02:35.742455 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgvfr\" (UniqueName: \"kubernetes.io/projected/11364ea4-cb4b-4398-9692-aa5dd12f8a9f-kube-api-access-mgvfr\") pod \"heat-operator-controller-manager-7fd96594c7-bmxm9\" (UID: \"11364ea4-cb4b-4398-9692-aa5dd12f8a9f\") " pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bmxm9" Dec 05 13:02:35.771224 master-0 kubenswrapper[29936]: I1205 13:02:35.742522 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9426\" (UniqueName: \"kubernetes.io/projected/ee9e12a6-a899-4e44-b4e1-d975493b6b9c-kube-api-access-k9426\") pod \"ironic-operator-controller-manager-7c9bfd6967-c6t95\" (UID: \"ee9e12a6-a899-4e44-b4e1-d975493b6b9c\") " pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" Dec 05 13:02:35.771224 master-0 kubenswrapper[29936]: I1205 13:02:35.742577 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt27g\" (UniqueName: \"kubernetes.io/projected/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-kube-api-access-nt27g\") pod \"infra-operator-controller-manager-7d9c9d7fd8-rprkt\" (UID: \"e64b46f2-d1e1-462f-952d-a7a7f8c663e9\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" Dec 05 13:02:35.771224 master-0 kubenswrapper[29936]: I1205 13:02:35.742609 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-rprkt\" (UID: \"e64b46f2-d1e1-462f-952d-a7a7f8c663e9\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" Dec 05 13:02:35.771224 master-0 kubenswrapper[29936]: E1205 13:02:35.742800 29936 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 13:02:35.771224 master-0 kubenswrapper[29936]: E1205 13:02:35.742866 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert podName:e64b46f2-d1e1-462f-952d-a7a7f8c663e9 nodeName:}" failed. No retries permitted until 2025-12-05 13:02:36.242841122 +0000 UTC m=+753.374920803 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-rprkt" (UID: "e64b46f2-d1e1-462f-952d-a7a7f8c663e9") : secret "infra-operator-webhook-server-cert" not found Dec 05 13:02:35.804762 master-0 kubenswrapper[29936]: I1205 13:02:35.797077 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt27g\" (UniqueName: \"kubernetes.io/projected/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-kube-api-access-nt27g\") pod \"infra-operator-controller-manager-7d9c9d7fd8-rprkt\" (UID: \"e64b46f2-d1e1-462f-952d-a7a7f8c663e9\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" Dec 05 13:02:35.813224 master-0 kubenswrapper[29936]: I1205 13:02:35.806082 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd"] Dec 05 13:02:35.813224 master-0 kubenswrapper[29936]: I1205 13:02:35.808344 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgvfr\" (UniqueName: \"kubernetes.io/projected/11364ea4-cb4b-4398-9692-aa5dd12f8a9f-kube-api-access-mgvfr\") pod \"heat-operator-controller-manager-7fd96594c7-bmxm9\" (UID: \"11364ea4-cb4b-4398-9692-aa5dd12f8a9f\") " pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bmxm9" Dec 05 13:02:35.813224 master-0 kubenswrapper[29936]: I1205 13:02:35.808592 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9426\" (UniqueName: \"kubernetes.io/projected/ee9e12a6-a899-4e44-b4e1-d975493b6b9c-kube-api-access-k9426\") pod \"ironic-operator-controller-manager-7c9bfd6967-c6t95\" (UID: \"ee9e12a6-a899-4e44-b4e1-d975493b6b9c\") " pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" Dec 05 13:02:35.830505 master-0 kubenswrapper[29936]: I1205 13:02:35.825479 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bmxm9" Dec 05 13:02:35.845395 master-0 kubenswrapper[29936]: I1205 13:02:35.845292 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbcmx\" (UniqueName: \"kubernetes.io/projected/6662695e-b061-402f-935b-c08c7eea8265-kube-api-access-wbcmx\") pod \"manila-operator-controller-manager-56f9fbf74b-sxfv7\" (UID: \"6662695e-b061-402f-935b-c08c7eea8265\") " pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-sxfv7" Dec 05 13:02:35.845922 master-0 kubenswrapper[29936]: I1205 13:02:35.845891 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7mpf\" (UniqueName: \"kubernetes.io/projected/42a27f5b-3b94-49e5-902d-abd26cb141d8-kube-api-access-f7mpf\") pod \"keystone-operator-controller-manager-58b8dcc5fb-bwmdd\" (UID: \"42a27f5b-3b94-49e5-902d-abd26cb141d8\") " pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd" Dec 05 13:02:35.850809 master-0 kubenswrapper[29936]: I1205 13:02:35.850733 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff"] Dec 05 13:02:35.854317 master-0 kubenswrapper[29936]: I1205 13:02:35.853911 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff" Dec 05 13:02:35.914324 master-0 kubenswrapper[29936]: I1205 13:02:35.908163 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-vhbn6" Dec 05 13:02:35.941283 master-0 kubenswrapper[29936]: I1205 13:02:35.932970 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-56f9fbf74b-sxfv7"] Dec 05 13:02:35.946935 master-0 kubenswrapper[29936]: I1205 13:02:35.944110 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff"] Dec 05 13:02:35.962627 master-0 kubenswrapper[29936]: I1205 13:02:35.953889 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7mpf\" (UniqueName: \"kubernetes.io/projected/42a27f5b-3b94-49e5-902d-abd26cb141d8-kube-api-access-f7mpf\") pod \"keystone-operator-controller-manager-58b8dcc5fb-bwmdd\" (UID: \"42a27f5b-3b94-49e5-902d-abd26cb141d8\") " pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd" Dec 05 13:02:35.962627 master-0 kubenswrapper[29936]: I1205 13:02:35.954145 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8k72\" (UniqueName: \"kubernetes.io/projected/e371fb46-339c-440c-b53a-31daf18710ef-kube-api-access-s8k72\") pod \"mariadb-operator-controller-manager-647d75769b-dbbff\" (UID: \"e371fb46-339c-440c-b53a-31daf18710ef\") " pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff" Dec 05 13:02:35.962627 master-0 kubenswrapper[29936]: I1205 13:02:35.955080 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbcmx\" (UniqueName: \"kubernetes.io/projected/6662695e-b061-402f-935b-c08c7eea8265-kube-api-access-wbcmx\") pod \"manila-operator-controller-manager-56f9fbf74b-sxfv7\" (UID: \"6662695e-b061-402f-935b-c08c7eea8265\") " pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-sxfv7" Dec 05 13:02:35.962627 master-0 kubenswrapper[29936]: I1205 13:02:35.955262 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6"] Dec 05 13:02:35.962627 master-0 kubenswrapper[29936]: I1205 13:02:35.957297 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6" Dec 05 13:02:35.991216 master-0 kubenswrapper[29936]: I1205 13:02:35.983448 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6"] Dec 05 13:02:35.991216 master-0 kubenswrapper[29936]: I1205 13:02:35.990778 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7mpf\" (UniqueName: \"kubernetes.io/projected/42a27f5b-3b94-49e5-902d-abd26cb141d8-kube-api-access-f7mpf\") pod \"keystone-operator-controller-manager-58b8dcc5fb-bwmdd\" (UID: \"42a27f5b-3b94-49e5-902d-abd26cb141d8\") " pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd" Dec 05 13:02:35.995972 master-0 kubenswrapper[29936]: I1205 13:02:35.995367 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbcmx\" (UniqueName: \"kubernetes.io/projected/6662695e-b061-402f-935b-c08c7eea8265-kube-api-access-wbcmx\") pod \"manila-operator-controller-manager-56f9fbf74b-sxfv7\" (UID: \"6662695e-b061-402f-935b-c08c7eea8265\") " pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-sxfv7" Dec 05 13:02:35.996617 master-0 kubenswrapper[29936]: I1205 13:02:35.996578 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-865fc86d5b-xgptw"] Dec 05 13:02:35.998425 master-0 kubenswrapper[29936]: I1205 13:02:35.998391 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-xgptw" Dec 05 13:02:36.031578 master-0 kubenswrapper[29936]: I1205 13:02:36.028481 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-865fc86d5b-xgptw"] Dec 05 13:02:36.057534 master-0 kubenswrapper[29936]: I1205 13:02:36.057481 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr"] Dec 05 13:02:36.059805 master-0 kubenswrapper[29936]: I1205 13:02:36.059779 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" Dec 05 13:02:36.059958 master-0 kubenswrapper[29936]: I1205 13:02:36.059898 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8k72\" (UniqueName: \"kubernetes.io/projected/e371fb46-339c-440c-b53a-31daf18710ef-kube-api-access-s8k72\") pod \"mariadb-operator-controller-manager-647d75769b-dbbff\" (UID: \"e371fb46-339c-440c-b53a-31daf18710ef\") " pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff" Dec 05 13:02:36.060145 master-0 kubenswrapper[29936]: I1205 13:02:36.060014 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdbcv\" (UniqueName: \"kubernetes.io/projected/214a3e2a-d306-45aa-8b2c-966a9953028c-kube-api-access-qdbcv\") pod \"nova-operator-controller-manager-865fc86d5b-xgptw\" (UID: \"214a3e2a-d306-45aa-8b2c-966a9953028c\") " pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-xgptw" Dec 05 13:02:36.060145 master-0 kubenswrapper[29936]: I1205 13:02:36.060053 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf9x4\" (UniqueName: \"kubernetes.io/projected/ef3f1077-b60d-4960-95ae-418df7695587-kube-api-access-nf9x4\") pod \"neutron-operator-controller-manager-7cdd6b54fb-n8fr6\" (UID: \"ef3f1077-b60d-4960-95ae-418df7695587\") " pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6" Dec 05 13:02:36.103869 master-0 kubenswrapper[29936]: I1205 13:02:36.096976 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8k72\" (UniqueName: \"kubernetes.io/projected/e371fb46-339c-440c-b53a-31daf18710ef-kube-api-access-s8k72\") pod \"mariadb-operator-controller-manager-647d75769b-dbbff\" (UID: \"e371fb46-339c-440c-b53a-31daf18710ef\") " pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff" Dec 05 13:02:36.103869 master-0 kubenswrapper[29936]: I1205 13:02:36.099288 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" Dec 05 13:02:36.154398 master-0 kubenswrapper[29936]: I1205 13:02:36.150574 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd" Dec 05 13:02:36.179326 master-0 kubenswrapper[29936]: I1205 13:02:36.177313 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzwbn\" (UniqueName: \"kubernetes.io/projected/49bd6523-c715-46b2-8112-070019badeed-kube-api-access-xzwbn\") pod \"octavia-operator-controller-manager-845b79dc4f-pt7jr\" (UID: \"49bd6523-c715-46b2-8112-070019badeed\") " pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" Dec 05 13:02:36.179326 master-0 kubenswrapper[29936]: I1205 13:02:36.177459 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdbcv\" (UniqueName: \"kubernetes.io/projected/214a3e2a-d306-45aa-8b2c-966a9953028c-kube-api-access-qdbcv\") pod \"nova-operator-controller-manager-865fc86d5b-xgptw\" (UID: \"214a3e2a-d306-45aa-8b2c-966a9953028c\") " pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-xgptw" Dec 05 13:02:36.179326 master-0 kubenswrapper[29936]: I1205 13:02:36.177497 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf9x4\" (UniqueName: \"kubernetes.io/projected/ef3f1077-b60d-4960-95ae-418df7695587-kube-api-access-nf9x4\") pod \"neutron-operator-controller-manager-7cdd6b54fb-n8fr6\" (UID: \"ef3f1077-b60d-4960-95ae-418df7695587\") " pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6" Dec 05 13:02:36.183254 master-0 kubenswrapper[29936]: I1205 13:02:36.180755 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-sxfv7" Dec 05 13:02:36.225256 master-0 kubenswrapper[29936]: I1205 13:02:36.221832 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff" Dec 05 13:02:36.238203 master-0 kubenswrapper[29936]: I1205 13:02:36.235674 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf9x4\" (UniqueName: \"kubernetes.io/projected/ef3f1077-b60d-4960-95ae-418df7695587-kube-api-access-nf9x4\") pod \"neutron-operator-controller-manager-7cdd6b54fb-n8fr6\" (UID: \"ef3f1077-b60d-4960-95ae-418df7695587\") " pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6" Dec 05 13:02:36.238203 master-0 kubenswrapper[29936]: I1205 13:02:36.236512 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdbcv\" (UniqueName: \"kubernetes.io/projected/214a3e2a-d306-45aa-8b2c-966a9953028c-kube-api-access-qdbcv\") pod \"nova-operator-controller-manager-865fc86d5b-xgptw\" (UID: \"214a3e2a-d306-45aa-8b2c-966a9953028c\") " pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-xgptw" Dec 05 13:02:36.246939 master-0 kubenswrapper[29936]: I1205 13:02:36.246827 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr"] Dec 05 13:02:36.258346 master-0 kubenswrapper[29936]: I1205 13:02:36.255511 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-647f96877-mm5qc"] Dec 05 13:02:36.258346 master-0 kubenswrapper[29936]: I1205 13:02:36.257371 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-647f96877-mm5qc" Dec 05 13:02:36.290681 master-0 kubenswrapper[29936]: I1205 13:02:36.273740 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9"] Dec 05 13:02:36.290681 master-0 kubenswrapper[29936]: I1205 13:02:36.275553 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:02:36.290681 master-0 kubenswrapper[29936]: I1205 13:02:36.287534 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-647f96877-mm5qc"] Dec 05 13:02:36.290681 master-0 kubenswrapper[29936]: I1205 13:02:36.288748 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-rprkt\" (UID: \"e64b46f2-d1e1-462f-952d-a7a7f8c663e9\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" Dec 05 13:02:36.290681 master-0 kubenswrapper[29936]: I1205 13:02:36.288816 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8cmn\" (UniqueName: \"kubernetes.io/projected/dae79856-19f7-42a9-9eb2-d4263c763a58-kube-api-access-c8cmn\") pod \"ovn-operator-controller-manager-647f96877-mm5qc\" (UID: \"dae79856-19f7-42a9-9eb2-d4263c763a58\") " pod="openstack-operators/ovn-operator-controller-manager-647f96877-mm5qc" Dec 05 13:02:36.290681 master-0 kubenswrapper[29936]: I1205 13:02:36.288901 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzwbn\" (UniqueName: \"kubernetes.io/projected/49bd6523-c715-46b2-8112-070019badeed-kube-api-access-xzwbn\") pod \"octavia-operator-controller-manager-845b79dc4f-pt7jr\" (UID: \"49bd6523-c715-46b2-8112-070019badeed\") " pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" Dec 05 13:02:36.290681 master-0 kubenswrapper[29936]: E1205 13:02:36.289499 29936 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 13:02:36.290681 master-0 kubenswrapper[29936]: E1205 13:02:36.289549 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert podName:e64b46f2-d1e1-462f-952d-a7a7f8c663e9 nodeName:}" failed. No retries permitted until 2025-12-05 13:02:37.289531416 +0000 UTC m=+754.421611097 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-rprkt" (UID: "e64b46f2-d1e1-462f-952d-a7a7f8c663e9") : secret "infra-operator-webhook-server-cert" not found Dec 05 13:02:36.345349 master-0 kubenswrapper[29936]: I1205 13:02:36.343138 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-6b64f6f645-bdv2j"] Dec 05 13:02:36.352203 master-0 kubenswrapper[29936]: I1205 13:02:36.346322 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-bdv2j" Dec 05 13:02:36.356254 master-0 kubenswrapper[29936]: I1205 13:02:36.354376 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6" Dec 05 13:02:36.382893 master-0 kubenswrapper[29936]: I1205 13:02:36.381718 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9"] Dec 05 13:02:36.388825 master-0 kubenswrapper[29936]: I1205 13:02:36.387857 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 05 13:02:36.392202 master-0 kubenswrapper[29936]: I1205 13:02:36.391031 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfg99\" (UniqueName: \"kubernetes.io/projected/e083a41a-8cd3-458d-a55d-cf3be20f0618-kube-api-access-kfg99\") pod \"placement-operator-controller-manager-6b64f6f645-bdv2j\" (UID: \"e083a41a-8cd3-458d-a55d-cf3be20f0618\") " pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-bdv2j" Dec 05 13:02:36.392202 master-0 kubenswrapper[29936]: I1205 13:02:36.391078 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8cmn\" (UniqueName: \"kubernetes.io/projected/dae79856-19f7-42a9-9eb2-d4263c763a58-kube-api-access-c8cmn\") pod \"ovn-operator-controller-manager-647f96877-mm5qc\" (UID: \"dae79856-19f7-42a9-9eb2-d4263c763a58\") " pod="openstack-operators/ovn-operator-controller-manager-647f96877-mm5qc" Dec 05 13:02:36.392202 master-0 kubenswrapper[29936]: I1205 13:02:36.391158 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdg5t\" (UniqueName: \"kubernetes.io/projected/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-kube-api-access-kdg5t\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746gr6c9\" (UID: \"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:02:36.400593 master-0 kubenswrapper[29936]: I1205 13:02:36.398542 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746gr6c9\" (UID: \"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:02:36.437647 master-0 kubenswrapper[29936]: I1205 13:02:36.437460 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-696b999796-8wd8k"] Dec 05 13:02:36.439950 master-0 kubenswrapper[29936]: I1205 13:02:36.439902 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-696b999796-8wd8k" Dec 05 13:02:36.447922 master-0 kubenswrapper[29936]: I1205 13:02:36.447827 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-6b64f6f645-bdv2j"] Dec 05 13:02:36.477268 master-0 kubenswrapper[29936]: I1205 13:02:36.475157 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-jmm5j"] Dec 05 13:02:36.478674 master-0 kubenswrapper[29936]: I1205 13:02:36.478632 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzwbn\" (UniqueName: \"kubernetes.io/projected/49bd6523-c715-46b2-8112-070019badeed-kube-api-access-xzwbn\") pod \"octavia-operator-controller-manager-845b79dc4f-pt7jr\" (UID: \"49bd6523-c715-46b2-8112-070019badeed\") " pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" Dec 05 13:02:36.479916 master-0 kubenswrapper[29936]: I1205 13:02:36.479857 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8cmn\" (UniqueName: \"kubernetes.io/projected/dae79856-19f7-42a9-9eb2-d4263c763a58-kube-api-access-c8cmn\") pod \"ovn-operator-controller-manager-647f96877-mm5qc\" (UID: \"dae79856-19f7-42a9-9eb2-d4263c763a58\") " pod="openstack-operators/ovn-operator-controller-manager-647f96877-mm5qc" Dec 05 13:02:36.480038 master-0 kubenswrapper[29936]: I1205 13:02:36.479980 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-jmm5j" Dec 05 13:02:36.488699 master-0 kubenswrapper[29936]: I1205 13:02:36.488593 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-696b999796-8wd8k"] Dec 05 13:02:36.501013 master-0 kubenswrapper[29936]: I1205 13:02:36.500928 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfg99\" (UniqueName: \"kubernetes.io/projected/e083a41a-8cd3-458d-a55d-cf3be20f0618-kube-api-access-kfg99\") pod \"placement-operator-controller-manager-6b64f6f645-bdv2j\" (UID: \"e083a41a-8cd3-458d-a55d-cf3be20f0618\") " pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-bdv2j" Dec 05 13:02:36.509655 master-0 kubenswrapper[29936]: I1205 13:02:36.505588 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-jmm5j"] Dec 05 13:02:36.520274 master-0 kubenswrapper[29936]: I1205 13:02:36.519671 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2"] Dec 05 13:02:36.522989 master-0 kubenswrapper[29936]: I1205 13:02:36.522883 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" Dec 05 13:02:36.523603 master-0 kubenswrapper[29936]: I1205 13:02:36.523510 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-xgptw" Dec 05 13:02:36.525208 master-0 kubenswrapper[29936]: I1205 13:02:36.525100 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdnmd\" (UniqueName: \"kubernetes.io/projected/97d30d8c-ba5e-42e3-b5ee-8bf38b7bce02-kube-api-access-tdnmd\") pod \"swift-operator-controller-manager-696b999796-8wd8k\" (UID: \"97d30d8c-ba5e-42e3-b5ee-8bf38b7bce02\") " pod="openstack-operators/swift-operator-controller-manager-696b999796-8wd8k" Dec 05 13:02:36.525322 master-0 kubenswrapper[29936]: I1205 13:02:36.525297 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdg5t\" (UniqueName: \"kubernetes.io/projected/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-kube-api-access-kdg5t\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746gr6c9\" (UID: \"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:02:36.525465 master-0 kubenswrapper[29936]: I1205 13:02:36.525439 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746gr6c9\" (UID: \"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:02:36.525754 master-0 kubenswrapper[29936]: E1205 13:02:36.525724 29936 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 13:02:36.525813 master-0 kubenswrapper[29936]: E1205 13:02:36.525799 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert podName:f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9 nodeName:}" failed. No retries permitted until 2025-12-05 13:02:37.025773985 +0000 UTC m=+754.157853656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert") pod "openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" (UID: "f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 13:02:36.537633 master-0 kubenswrapper[29936]: I1205 13:02:36.537529 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfg99\" (UniqueName: \"kubernetes.io/projected/e083a41a-8cd3-458d-a55d-cf3be20f0618-kube-api-access-kfg99\") pod \"placement-operator-controller-manager-6b64f6f645-bdv2j\" (UID: \"e083a41a-8cd3-458d-a55d-cf3be20f0618\") " pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-bdv2j" Dec 05 13:02:36.539913 master-0 kubenswrapper[29936]: I1205 13:02:36.539826 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2"] Dec 05 13:02:36.550556 master-0 kubenswrapper[29936]: I1205 13:02:36.550144 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9b669fdb-2d5g7"] Dec 05 13:02:36.554267 master-0 kubenswrapper[29936]: I1205 13:02:36.554171 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-2d5g7" Dec 05 13:02:36.564824 master-0 kubenswrapper[29936]: I1205 13:02:36.564599 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdg5t\" (UniqueName: \"kubernetes.io/projected/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-kube-api-access-kdg5t\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746gr6c9\" (UID: \"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:02:36.579466 master-0 kubenswrapper[29936]: I1205 13:02:36.578204 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9b669fdb-2d5g7"] Dec 05 13:02:36.627370 master-0 kubenswrapper[29936]: I1205 13:02:36.627310 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-647f96877-mm5qc" Dec 05 13:02:36.631234 master-0 kubenswrapper[29936]: I1205 13:02:36.630818 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcrfr\" (UniqueName: \"kubernetes.io/projected/d74e954d-e55b-4280-a96f-14f2775fa484-kube-api-access-dcrfr\") pod \"telemetry-operator-controller-manager-7b5867bfc7-jmm5j\" (UID: \"d74e954d-e55b-4280-a96f-14f2775fa484\") " pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-jmm5j" Dec 05 13:02:36.631234 master-0 kubenswrapper[29936]: I1205 13:02:36.630949 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85xws\" (UniqueName: \"kubernetes.io/projected/6e8afa75-0149-45e2-8015-1c519267961c-kube-api-access-85xws\") pod \"test-operator-controller-manager-57dfcdd5b8-t6nq2\" (UID: \"6e8afa75-0149-45e2-8015-1c519267961c\") " pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" Dec 05 13:02:36.631234 master-0 kubenswrapper[29936]: I1205 13:02:36.630999 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk797\" (UniqueName: \"kubernetes.io/projected/960abe54-24ff-4b94-b5a6-7a4cb11a5466-kube-api-access-pk797\") pod \"watcher-operator-controller-manager-6b9b669fdb-2d5g7\" (UID: \"960abe54-24ff-4b94-b5a6-7a4cb11a5466\") " pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-2d5g7" Dec 05 13:02:36.631457 master-0 kubenswrapper[29936]: I1205 13:02:36.631352 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdnmd\" (UniqueName: \"kubernetes.io/projected/97d30d8c-ba5e-42e3-b5ee-8bf38b7bce02-kube-api-access-tdnmd\") pod \"swift-operator-controller-manager-696b999796-8wd8k\" (UID: \"97d30d8c-ba5e-42e3-b5ee-8bf38b7bce02\") " pod="openstack-operators/swift-operator-controller-manager-696b999796-8wd8k" Dec 05 13:02:36.648556 master-0 kubenswrapper[29936]: I1205 13:02:36.648493 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph"] Dec 05 13:02:36.652428 master-0 kubenswrapper[29936]: I1205 13:02:36.652374 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:36.664751 master-0 kubenswrapper[29936]: W1205 13:02:36.664673 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc19b3073_31ec_46d5_9534_61bedaf6de73.slice/crio-8ebe534c4a3ed009dec7383b9a8842e14bbf1a26b09d03183686e955e09be5e6 WatchSource:0}: Error finding container 8ebe534c4a3ed009dec7383b9a8842e14bbf1a26b09d03183686e955e09be5e6: Status 404 returned error can't find the container with id 8ebe534c4a3ed009dec7383b9a8842e14bbf1a26b09d03183686e955e09be5e6 Dec 05 13:02:36.668268 master-0 kubenswrapper[29936]: I1205 13:02:36.668026 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 05 13:02:36.668673 master-0 kubenswrapper[29936]: I1205 13:02:36.668241 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 05 13:02:36.674689 master-0 kubenswrapper[29936]: I1205 13:02:36.674592 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph"] Dec 05 13:02:36.679798 master-0 kubenswrapper[29936]: I1205 13:02:36.679745 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdnmd\" (UniqueName: \"kubernetes.io/projected/97d30d8c-ba5e-42e3-b5ee-8bf38b7bce02-kube-api-access-tdnmd\") pod \"swift-operator-controller-manager-696b999796-8wd8k\" (UID: \"97d30d8c-ba5e-42e3-b5ee-8bf38b7bce02\") " pod="openstack-operators/swift-operator-controller-manager-696b999796-8wd8k" Dec 05 13:02:36.683693 master-0 kubenswrapper[29936]: I1205 13:02:36.683605 29936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 13:02:36.698813 master-0 kubenswrapper[29936]: I1205 13:02:36.698732 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-bdv2j" Dec 05 13:02:36.744876 master-0 kubenswrapper[29936]: I1205 13:02:36.732199 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" Dec 05 13:02:36.744876 master-0 kubenswrapper[29936]: I1205 13:02:36.733583 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:36.744876 master-0 kubenswrapper[29936]: I1205 13:02:36.734166 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcrfr\" (UniqueName: \"kubernetes.io/projected/d74e954d-e55b-4280-a96f-14f2775fa484-kube-api-access-dcrfr\") pod \"telemetry-operator-controller-manager-7b5867bfc7-jmm5j\" (UID: \"d74e954d-e55b-4280-a96f-14f2775fa484\") " pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-jmm5j" Dec 05 13:02:36.744876 master-0 kubenswrapper[29936]: I1205 13:02:36.734443 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85xws\" (UniqueName: \"kubernetes.io/projected/6e8afa75-0149-45e2-8015-1c519267961c-kube-api-access-85xws\") pod \"test-operator-controller-manager-57dfcdd5b8-t6nq2\" (UID: \"6e8afa75-0149-45e2-8015-1c519267961c\") " pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" Dec 05 13:02:36.744876 master-0 kubenswrapper[29936]: I1205 13:02:36.734514 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t47wj\" (UniqueName: \"kubernetes.io/projected/19e3b314-b614-4428-85c6-9e6b9f5e01bc-kube-api-access-t47wj\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:36.744876 master-0 kubenswrapper[29936]: I1205 13:02:36.734565 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk797\" (UniqueName: \"kubernetes.io/projected/960abe54-24ff-4b94-b5a6-7a4cb11a5466-kube-api-access-pk797\") pod \"watcher-operator-controller-manager-6b9b669fdb-2d5g7\" (UID: \"960abe54-24ff-4b94-b5a6-7a4cb11a5466\") " pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-2d5g7" Dec 05 13:02:36.744876 master-0 kubenswrapper[29936]: I1205 13:02:36.734822 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:36.766021 master-0 kubenswrapper[29936]: I1205 13:02:36.765953 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85xws\" (UniqueName: \"kubernetes.io/projected/6e8afa75-0149-45e2-8015-1c519267961c-kube-api-access-85xws\") pod \"test-operator-controller-manager-57dfcdd5b8-t6nq2\" (UID: \"6e8afa75-0149-45e2-8015-1c519267961c\") " pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" Dec 05 13:02:36.769010 master-0 kubenswrapper[29936]: I1205 13:02:36.768964 29936 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-696b999796-8wd8k" Dec 05 13:02:36.788210 master-0 kubenswrapper[29936]: I1205 13:02:36.776597 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcrfr\" (UniqueName: \"kubernetes.io/projected/d74e954d-e55b-4280-a96f-14f2775fa484-kube-api-access-dcrfr\") pod \"telemetry-operator-controller-manager-7b5867bfc7-jmm5j\" (UID: \"d74e954d-e55b-4280-a96f-14f2775fa484\") " pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-jmm5j" Dec 05 13:02:36.789765 master-0 kubenswrapper[29936]: I1205 13:02:36.789471 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk797\" (UniqueName: \"kubernetes.io/projected/960abe54-24ff-4b94-b5a6-7a4cb11a5466-kube-api-access-pk797\") pod \"watcher-operator-controller-manager-6b9b669fdb-2d5g7\" (UID: \"960abe54-24ff-4b94-b5a6-7a4cb11a5466\") " pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-2d5g7" Dec 05 13:02:36.789941 master-0 kubenswrapper[29936]: I1205 13:02:36.789799 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" Dec 05 13:02:36.799626 master-0 kubenswrapper[29936]: I1205 13:02:36.799113 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-wnccs"] Dec 05 13:02:36.800891 master-0 kubenswrapper[29936]: I1205 13:02:36.800844 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-wnccs" Dec 05 13:02:36.811566 master-0 kubenswrapper[29936]: I1205 13:02:36.811505 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-wnccs"] Dec 05 13:02:36.814334 master-0 kubenswrapper[29936]: I1205 13:02:36.814296 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-2d5g7" Dec 05 13:02:36.833218 master-0 kubenswrapper[29936]: I1205 13:02:36.833132 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cd89994b5-z5xtp"] Dec 05 13:02:36.837252 master-0 kubenswrapper[29936]: I1205 13:02:36.837163 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:36.837371 master-0 kubenswrapper[29936]: I1205 13:02:36.837298 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:36.837696 master-0 kubenswrapper[29936]: E1205 13:02:36.837591 29936 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 13:02:36.837696 master-0 kubenswrapper[29936]: E1205 13:02:36.837685 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs podName:19e3b314-b614-4428-85c6-9e6b9f5e01bc nodeName:}" failed. No retries permitted until 2025-12-05 13:02:37.33765947 +0000 UTC m=+754.469739151 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs") pod "openstack-operator-controller-manager-599cfccd85-6klph" (UID: "19e3b314-b614-4428-85c6-9e6b9f5e01bc") : secret "webhook-server-cert" not found Dec 05 13:02:36.837845 master-0 kubenswrapper[29936]: I1205 13:02:36.837726 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t47wj\" (UniqueName: \"kubernetes.io/projected/19e3b314-b614-4428-85c6-9e6b9f5e01bc-kube-api-access-t47wj\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:36.837845 master-0 kubenswrapper[29936]: E1205 13:02:36.837600 29936 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 13:02:36.838112 master-0 kubenswrapper[29936]: E1205 13:02:36.838072 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs podName:19e3b314-b614-4428-85c6-9e6b9f5e01bc nodeName:}" failed. No retries permitted until 2025-12-05 13:02:37.33804752 +0000 UTC m=+754.470127201 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs") pod "openstack-operator-controller-manager-599cfccd85-6klph" (UID: "19e3b314-b614-4428-85c6-9e6b9f5e01bc") : secret "metrics-server-cert" not found Dec 05 13:02:36.906708 master-0 kubenswrapper[29936]: I1205 13:02:36.906621 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t47wj\" (UniqueName: \"kubernetes.io/projected/19e3b314-b614-4428-85c6-9e6b9f5e01bc-kube-api-access-t47wj\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:36.940628 master-0 kubenswrapper[29936]: I1205 13:02:36.940444 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfwj4\" (UniqueName: \"kubernetes.io/projected/19f12de2-a4f5-42b7-bce4-2bc439b9f61a-kube-api-access-lfwj4\") pod \"rabbitmq-cluster-operator-manager-78955d896f-wnccs\" (UID: \"19f12de2-a4f5-42b7-bce4-2bc439b9f61a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-wnccs" Dec 05 13:02:37.010996 master-0 kubenswrapper[29936]: I1205 13:02:37.010619 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f8856dd79-52767"] Dec 05 13:02:37.043947 master-0 kubenswrapper[29936]: I1205 13:02:37.043682 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746gr6c9\" (UID: \"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:02:37.043947 master-0 kubenswrapper[29936]: I1205 13:02:37.043763 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfwj4\" (UniqueName: \"kubernetes.io/projected/19f12de2-a4f5-42b7-bce4-2bc439b9f61a-kube-api-access-lfwj4\") pod \"rabbitmq-cluster-operator-manager-78955d896f-wnccs\" (UID: \"19f12de2-a4f5-42b7-bce4-2bc439b9f61a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-wnccs" Dec 05 13:02:37.045217 master-0 kubenswrapper[29936]: E1205 13:02:37.044318 29936 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 13:02:37.045217 master-0 kubenswrapper[29936]: E1205 13:02:37.044428 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert podName:f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9 nodeName:}" failed. No retries permitted until 2025-12-05 13:02:38.044399937 +0000 UTC m=+755.176479618 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert") pod "openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" (UID: "f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 13:02:37.068255 master-0 kubenswrapper[29936]: I1205 13:02:37.068155 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-jmm5j" Dec 05 13:02:37.088609 master-0 kubenswrapper[29936]: I1205 13:02:37.088522 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfwj4\" (UniqueName: \"kubernetes.io/projected/19f12de2-a4f5-42b7-bce4-2bc439b9f61a-kube-api-access-lfwj4\") pod \"rabbitmq-cluster-operator-manager-78955d896f-wnccs\" (UID: \"19f12de2-a4f5-42b7-bce4-2bc439b9f61a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-wnccs" Dec 05 13:02:37.094504 master-0 kubenswrapper[29936]: I1205 13:02:37.094448 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl"] Dec 05 13:02:37.140646 master-0 kubenswrapper[29936]: I1205 13:02:37.138371 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-wnccs" Dec 05 13:02:37.386968 master-0 kubenswrapper[29936]: I1205 13:02:37.386894 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-rprkt\" (UID: \"e64b46f2-d1e1-462f-952d-a7a7f8c663e9\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" Dec 05 13:02:37.387572 master-0 kubenswrapper[29936]: I1205 13:02:37.387052 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:37.387572 master-0 kubenswrapper[29936]: I1205 13:02:37.387158 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:37.387572 master-0 kubenswrapper[29936]: E1205 13:02:37.387362 29936 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 13:02:37.387572 master-0 kubenswrapper[29936]: E1205 13:02:37.387430 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs podName:19e3b314-b614-4428-85c6-9e6b9f5e01bc nodeName:}" failed. No retries permitted until 2025-12-05 13:02:38.387408607 +0000 UTC m=+755.519488288 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs") pod "openstack-operator-controller-manager-599cfccd85-6klph" (UID: "19e3b314-b614-4428-85c6-9e6b9f5e01bc") : secret "webhook-server-cert" not found Dec 05 13:02:37.387934 master-0 kubenswrapper[29936]: E1205 13:02:37.387829 29936 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 13:02:37.387934 master-0 kubenswrapper[29936]: E1205 13:02:37.387880 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs podName:19e3b314-b614-4428-85c6-9e6b9f5e01bc nodeName:}" failed. No retries permitted until 2025-12-05 13:02:38.38786634 +0000 UTC m=+755.519946021 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs") pod "openstack-operator-controller-manager-599cfccd85-6klph" (UID: "19e3b314-b614-4428-85c6-9e6b9f5e01bc") : secret "metrics-server-cert" not found Dec 05 13:02:37.388310 master-0 kubenswrapper[29936]: E1205 13:02:37.388225 29936 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 13:02:37.388477 master-0 kubenswrapper[29936]: E1205 13:02:37.388422 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert podName:e64b46f2-d1e1-462f-952d-a7a7f8c663e9 nodeName:}" failed. No retries permitted until 2025-12-05 13:02:39.388381554 +0000 UTC m=+756.520461405 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-rprkt" (UID: "e64b46f2-d1e1-462f-952d-a7a7f8c663e9") : secret "infra-operator-webhook-server-cert" not found Dec 05 13:02:37.430443 master-0 kubenswrapper[29936]: I1205 13:02:37.429997 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-z5xtp" event={"ID":"c19b3073-31ec-46d5-9534-61bedaf6de73","Type":"ContainerStarted","Data":"8ebe534c4a3ed009dec7383b9a8842e14bbf1a26b09d03183686e955e09be5e6"} Dec 05 13:02:37.462711 master-0 kubenswrapper[29936]: I1205 13:02:37.458001 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-52767" event={"ID":"72464fc8-9970-443f-98ac-1c40ada39742","Type":"ContainerStarted","Data":"69737f9035450a5fc0406819698f44b27d0f7297136bd5b4d81dfe32edb3a0d3"} Dec 05 13:02:37.494263 master-0 kubenswrapper[29936]: I1205 13:02:37.488789 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-f6cc97788-vhbn6"] Dec 05 13:02:37.494263 master-0 kubenswrapper[29936]: I1205 13:02:37.493107 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl" event={"ID":"cd1c763d-2aee-4791-a9ea-87299af8efc4","Type":"ContainerStarted","Data":"674aa53ac15dda82cc621ceee91cfd196b4135a44b78ed39f6fba75c878b46ee"} Dec 05 13:02:37.505022 master-0 kubenswrapper[29936]: I1205 13:02:37.499363 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-7fd96594c7-bmxm9"] Dec 05 13:02:37.512621 master-0 kubenswrapper[29936]: I1205 13:02:37.509410 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb"] Dec 05 13:02:37.569691 master-0 kubenswrapper[29936]: I1205 13:02:37.563492 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95"] Dec 05 13:02:37.595896 master-0 kubenswrapper[29936]: I1205 13:02:37.585001 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-56f9fbf74b-sxfv7"] Dec 05 13:02:37.633097 master-0 kubenswrapper[29936]: W1205 13:02:37.633030 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6662695e_b061_402f_935b_c08c7eea8265.slice/crio-4190398af7f3c2050fdf4dff95aebff8011930a94f0b0275ccef7140e8519290 WatchSource:0}: Error finding container 4190398af7f3c2050fdf4dff95aebff8011930a94f0b0275ccef7140e8519290: Status 404 returned error can't find the container with id 4190398af7f3c2050fdf4dff95aebff8011930a94f0b0275ccef7140e8519290 Dec 05 13:02:37.649963 master-0 kubenswrapper[29936]: I1205 13:02:37.643843 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-865fc86d5b-xgptw"] Dec 05 13:02:37.751668 master-0 kubenswrapper[29936]: I1205 13:02:37.751574 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd"] Dec 05 13:02:37.846346 master-0 kubenswrapper[29936]: I1205 13:02:37.846268 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
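
The failed cert mounts are retried with a doubling delay: for the same volume, durationBeforeRetry grows from 500ms to 1s to 2s across the attempts logged above, which matches the per-operation exponential backoff in nestedpendingoperations (the file named in the error lines); the delay keeps doubling up to a fixed cap until the secret appears. The schedule visible in the timestamps can be reproduced with apimachinery's wait.Backoff; this is an illustration of the doubling pattern, not the kubelet's actual code:

    package main

    import (
        "fmt"
        "time"

        "k8s.io/apimachinery/pkg/util/wait"
    )

    func main() {
        // Same starting delay and growth factor as the retries logged above.
        b := wait.Backoff{Duration: 500 * time.Millisecond, Factor: 2.0, Steps: 5}
        for i := 0; i < 5; i++ {
            fmt.Println(b.Step()) // 500ms, 1s, 2s, 4s, 8s
        }
    }
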
pods=["openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff"] Dec 05 13:02:38.047939 master-0 kubenswrapper[29936]: I1205 13:02:38.047773 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746gr6c9\" (UID: \"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:02:38.048168 master-0 kubenswrapper[29936]: E1205 13:02:38.048072 29936 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 13:02:38.048258 master-0 kubenswrapper[29936]: E1205 13:02:38.048238 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert podName:f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9 nodeName:}" failed. No retries permitted until 2025-12-05 13:02:40.048207642 +0000 UTC m=+757.180287323 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert") pod "openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" (UID: "f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 13:02:38.260922 master-0 kubenswrapper[29936]: I1205 13:02:38.260859 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-6b64f6f645-bdv2j"] Dec 05 13:02:38.278908 master-0 kubenswrapper[29936]: W1205 13:02:38.278808 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode083a41a_8cd3_458d_a55d_cf3be20f0618.slice/crio-58636dd49849bd467c731b0ce4637673283bf6c5bc716ac9810d0ae62f3cde64 WatchSource:0}: Error finding container 58636dd49849bd467c731b0ce4637673283bf6c5bc716ac9810d0ae62f3cde64: Status 404 returned error can't find the container with id 58636dd49849bd467c731b0ce4637673283bf6c5bc716ac9810d0ae62f3cde64 Dec 05 13:02:38.285808 master-0 kubenswrapper[29936]: W1205 13:02:38.285721 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddae79856_19f7_42a9_9eb2_d4263c763a58.slice/crio-677835dd6d04d3076a720af2577fa52727fe55c15e616ca7b1956e1a3e2deeda WatchSource:0}: Error finding container 677835dd6d04d3076a720af2577fa52727fe55c15e616ca7b1956e1a3e2deeda: Status 404 returned error can't find the container with id 677835dd6d04d3076a720af2577fa52727fe55c15e616ca7b1956e1a3e2deeda Dec 05 13:02:38.298557 master-0 kubenswrapper[29936]: I1205 13:02:38.298430 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-647f96877-mm5qc"] Dec 05 13:02:38.328167 master-0 kubenswrapper[29936]: I1205 13:02:38.328121 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6"] Dec 05 13:02:38.462359 master-0 kubenswrapper[29936]: I1205 13:02:38.462282 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs\") pod 
\"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:38.462920 master-0 kubenswrapper[29936]: I1205 13:02:38.462501 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:38.462920 master-0 kubenswrapper[29936]: E1205 13:02:38.462669 29936 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 13:02:38.462920 master-0 kubenswrapper[29936]: E1205 13:02:38.462731 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs podName:19e3b314-b614-4428-85c6-9e6b9f5e01bc nodeName:}" failed. No retries permitted until 2025-12-05 13:02:40.462708924 +0000 UTC m=+757.594788605 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs") pod "openstack-operator-controller-manager-599cfccd85-6klph" (UID: "19e3b314-b614-4428-85c6-9e6b9f5e01bc") : secret "metrics-server-cert" not found Dec 05 13:02:38.466132 master-0 kubenswrapper[29936]: E1205 13:02:38.464948 29936 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 13:02:38.466132 master-0 kubenswrapper[29936]: E1205 13:02:38.465068 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs podName:19e3b314-b614-4428-85c6-9e6b9f5e01bc nodeName:}" failed. No retries permitted until 2025-12-05 13:02:40.465041748 +0000 UTC m=+757.597121429 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs") pod "openstack-operator-controller-manager-599cfccd85-6klph" (UID: "19e3b314-b614-4428-85c6-9e6b9f5e01bc") : secret "webhook-server-cert" not found Dec 05 13:02:38.518996 master-0 kubenswrapper[29936]: I1205 13:02:38.518926 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bmxm9" event={"ID":"11364ea4-cb4b-4398-9692-aa5dd12f8a9f","Type":"ContainerStarted","Data":"21afab08f1583e4bb04b9a96a2fea7942635cd7b86c55b4abaf6898825e8744a"} Dec 05 13:02:38.528503 master-0 kubenswrapper[29936]: I1205 13:02:38.528152 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-wnccs"] Dec 05 13:02:38.543467 master-0 kubenswrapper[29936]: I1205 13:02:38.543382 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9b669fdb-2d5g7"] Dec 05 13:02:38.547350 master-0 kubenswrapper[29936]: I1205 13:02:38.547305 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-jmm5j"] Dec 05 13:02:38.557895 master-0 kubenswrapper[29936]: I1205 13:02:38.557753 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-696b999796-8wd8k"] Dec 05 13:02:38.558560 master-0 kubenswrapper[29936]: I1205 13:02:38.558493 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6" event={"ID":"ef3f1077-b60d-4960-95ae-418df7695587","Type":"ContainerStarted","Data":"4e52c5718516feb7eba30917515da1b43f91d5f19a84668d4c99a3834e5d6a5b"} Dec 05 13:02:38.566105 master-0 kubenswrapper[29936]: I1205 13:02:38.566015 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr"] Dec 05 13:02:38.580303 master-0 kubenswrapper[29936]: W1205 13:02:38.570420 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod960abe54_24ff_4b94_b5a6_7a4cb11a5466.slice/crio-a8f61328794db95cc02cb624e856842f5127f662af88b453c4f8ef928adf3e50 WatchSource:0}: Error finding container a8f61328794db95cc02cb624e856842f5127f662af88b453c4f8ef928adf3e50: Status 404 returned error can't find the container with id a8f61328794db95cc02cb624e856842f5127f662af88b453c4f8ef928adf3e50 Dec 05 13:02:38.584262 master-0 kubenswrapper[29936]: I1205 13:02:38.583507 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-xgptw" event={"ID":"214a3e2a-d306-45aa-8b2c-966a9953028c","Type":"ContainerStarted","Data":"c082cbc9b688556c0b3fe0e8f3babae3c665b0c9da2c214698da6bf89fd0d389"} Dec 05 13:02:38.599510 master-0 kubenswrapper[29936]: I1205 13:02:38.596468 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff" event={"ID":"e371fb46-339c-440c-b53a-31daf18710ef","Type":"ContainerStarted","Data":"8efa6688dbf2260ac19b570005054f33889cea9fa5ef3d2aeca46c87e1219ad1"} Dec 05 13:02:38.603681 master-0 kubenswrapper[29936]: I1205 13:02:38.600325 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-sxfv7" event={"ID":"6662695e-b061-402f-935b-c08c7eea8265","Type":"ContainerStarted","Data":"4190398af7f3c2050fdf4dff95aebff8011930a94f0b0275ccef7140e8519290"} Dec 05 13:02:38.609534 master-0 kubenswrapper[29936]: I1205 13:02:38.609355 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-bdv2j" event={"ID":"e083a41a-8cd3-458d-a55d-cf3be20f0618","Type":"ContainerStarted","Data":"58636dd49849bd467c731b0ce4637673283bf6c5bc716ac9810d0ae62f3cde64"} Dec 05 13:02:38.612985 master-0 kubenswrapper[29936]: I1205 13:02:38.612932 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb" event={"ID":"7c7c4bb7-eb69-4370-8539-34ba8c63383e","Type":"ContainerStarted","Data":"3ba5e50228dd1f0f8f5281f3831e216a2681902512dfe741ccab044502b2706a"} Dec 05 13:02:38.615583 master-0 kubenswrapper[29936]: I1205 13:02:38.615536 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd" event={"ID":"42a27f5b-3b94-49e5-902d-abd26cb141d8","Type":"ContainerStarted","Data":"3faafea2bcd76f35d5e1bcfdb64bb75a762b1b6d938c1352f389d2404141e943"} Dec 05 13:02:38.618432 master-0 kubenswrapper[29936]: I1205 13:02:38.618393 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" event={"ID":"ee9e12a6-a899-4e44-b4e1-d975493b6b9c","Type":"ContainerStarted","Data":"a51ad88ea36fa8bc0801d04e826cfdd91158bdf5889c29e404cf05f74d6e1284"} Dec 05 13:02:38.619970 master-0 kubenswrapper[29936]: I1205 13:02:38.619916 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-vhbn6" event={"ID":"77fe28e2-671e-4a0d-b065-0746abf83306","Type":"ContainerStarted","Data":"1eaa307bc6171fc9a3b06fc2fea89e7f92f406bd0f62f35dd6cd9ef12a777093"} Dec 05 13:02:38.627468 master-0 kubenswrapper[29936]: I1205 13:02:38.627302 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-647f96877-mm5qc" event={"ID":"dae79856-19f7-42a9-9eb2-d4263c763a58","Type":"ContainerStarted","Data":"677835dd6d04d3076a720af2577fa52727fe55c15e616ca7b1956e1a3e2deeda"} Dec 05 13:02:38.634386 master-0 kubenswrapper[29936]: E1205 13:02:38.634298 29936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:50,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:30,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:10,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lfwj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000810000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-78955d896f-wnccs_openstack-operators(19f12de2-a4f5-42b7-bce4-2bc439b9f61a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 13:02:38.636661 master-0 kubenswrapper[29936]: E1205 13:02:38.636580 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-wnccs" podUID="19f12de2-a4f5-42b7-bce4-2bc439b9f61a" Dec 05 13:02:38.637334 master-0 kubenswrapper[29936]: I1205 13:02:38.637289 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2"] Dec 05 13:02:38.666483 master-0 kubenswrapper[29936]: W1205 13:02:38.666377 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e8afa75_0149_45e2_8015_1c519267961c.slice/crio-db9550a645713c0e34a574f2aae9dc2c11dc00fe332548e934eee77d7cbfee99 WatchSource:0}: Error finding container db9550a645713c0e34a574f2aae9dc2c11dc00fe332548e934eee77d7cbfee99: Status 404 returned error can't find the container with id db9550a645713c0e34a574f2aae9dc2c11dc00fe332548e934eee77d7cbfee99 Dec 05 13:02:38.674139 master-0 kubenswrapper[29936]: E1205 13:02:38.674082 29936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:50,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:30,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:10,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-85xws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-57dfcdd5b8-t6nq2_openstack-operators(6e8afa75-0149-45e2-8015-1c519267961c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 13:02:38.676563 master-0 kubenswrapper[29936]: E1205 13:02:38.676531 29936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-85xws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-57dfcdd5b8-t6nq2_openstack-operators(6e8afa75-0149-45e2-8015-1c519267961c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 13:02:38.678338 master-0 kubenswrapper[29936]: E1205 13:02:38.678278 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" podUID="6e8afa75-0149-45e2-8015-1c519267961c" Dec 05 13:02:39.490517 master-0 kubenswrapper[29936]: I1205 13:02:39.489931 29936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-rprkt\" (UID: \"e64b46f2-d1e1-462f-952d-a7a7f8c663e9\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" Dec 05 13:02:39.490517 master-0 kubenswrapper[29936]: E1205 13:02:39.490379 29936 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 13:02:39.490517 master-0 kubenswrapper[29936]: E1205 13:02:39.490449 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert podName:e64b46f2-d1e1-462f-952d-a7a7f8c663e9 nodeName:}" failed. No retries permitted until 2025-12-05 13:02:43.490424959 +0000 UTC m=+760.622504640 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-rprkt" (UID: "e64b46f2-d1e1-462f-952d-a7a7f8c663e9") : secret "infra-operator-webhook-server-cert" not found Dec 05 13:02:39.650859 master-0 kubenswrapper[29936]: I1205 13:02:39.650530 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-wnccs" event={"ID":"19f12de2-a4f5-42b7-bce4-2bc439b9f61a","Type":"ContainerStarted","Data":"9c4bd2aa1d3ae51c981242ffa40d317f08bf3d9bc92e0c65798123f1a49a6bbc"} Dec 05 13:02:39.654000 master-0 kubenswrapper[29936]: E1205 13:02:39.653260 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-wnccs" podUID="19f12de2-a4f5-42b7-bce4-2bc439b9f61a" Dec 05 13:02:39.658806 master-0 kubenswrapper[29936]: I1205 13:02:39.658632 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-696b999796-8wd8k" event={"ID":"97d30d8c-ba5e-42e3-b5ee-8bf38b7bce02","Type":"ContainerStarted","Data":"8e5b4404956c87e54ceb1116a23607277d808fce41de3fef48816c883823234a"} Dec 05 13:02:39.663475 master-0 kubenswrapper[29936]: I1205 13:02:39.663231 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-jmm5j" event={"ID":"d74e954d-e55b-4280-a96f-14f2775fa484","Type":"ContainerStarted","Data":"bb77b03b11738c771c7f7507f77822e2fd7c3db02eeb75bc77a7979d8d16dfba"} Dec 05 13:02:39.669067 master-0 kubenswrapper[29936]: I1205 13:02:39.668875 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" event={"ID":"6e8afa75-0149-45e2-8015-1c519267961c","Type":"ContainerStarted","Data":"db9550a645713c0e34a574f2aae9dc2c11dc00fe332548e934eee77d7cbfee99"} Dec 05 13:02:39.671945 master-0 kubenswrapper[29936]: I1205 13:02:39.671440 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" event={"ID":"49bd6523-c715-46b2-8112-070019badeed","Type":"ContainerStarted","Data":"ce2f519ef97e951e63d69780b35d73552b0089ca06859c35c611475a94fdc0af"} 
Dec 05 13:02:39.671945 master-0 kubenswrapper[29936]: E1205 13:02:39.671599 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" podUID="6e8afa75-0149-45e2-8015-1c519267961c" Dec 05 13:02:39.673838 master-0 kubenswrapper[29936]: I1205 13:02:39.673724 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-2d5g7" event={"ID":"960abe54-24ff-4b94-b5a6-7a4cb11a5466","Type":"ContainerStarted","Data":"a8f61328794db95cc02cb624e856842f5127f662af88b453c4f8ef928adf3e50"} Dec 05 13:02:40.106458 master-0 kubenswrapper[29936]: I1205 13:02:40.106106 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746gr6c9\" (UID: \"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:02:40.106458 master-0 kubenswrapper[29936]: E1205 13:02:40.106501 29936 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 13:02:40.106458 master-0 kubenswrapper[29936]: E1205 13:02:40.106566 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert podName:f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9 nodeName:}" failed. No retries permitted until 2025-12-05 13:02:44.10654359 +0000 UTC m=+761.238623271 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert") pod "openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" (UID: "f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 13:02:40.519455 master-0 kubenswrapper[29936]: I1205 13:02:40.519359 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:40.520225 master-0 kubenswrapper[29936]: E1205 13:02:40.519656 29936 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 13:02:40.520225 master-0 kubenswrapper[29936]: I1205 13:02:40.519762 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:40.520225 master-0 kubenswrapper[29936]: E1205 13:02:40.519816 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs podName:19e3b314-b614-4428-85c6-9e6b9f5e01bc nodeName:}" failed. No retries permitted until 2025-12-05 13:02:44.519783208 +0000 UTC m=+761.651863069 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs") pod "openstack-operator-controller-manager-599cfccd85-6klph" (UID: "19e3b314-b614-4428-85c6-9e6b9f5e01bc") : secret "metrics-server-cert" not found Dec 05 13:02:40.520225 master-0 kubenswrapper[29936]: E1205 13:02:40.519938 29936 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 13:02:40.520225 master-0 kubenswrapper[29936]: E1205 13:02:40.520115 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs podName:19e3b314-b614-4428-85c6-9e6b9f5e01bc nodeName:}" failed. No retries permitted until 2025-12-05 13:02:44.520103488 +0000 UTC m=+761.652183359 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs") pod "openstack-operator-controller-manager-599cfccd85-6klph" (UID: "19e3b314-b614-4428-85c6-9e6b9f5e01bc") : secret "webhook-server-cert" not found Dec 05 13:02:40.710856 master-0 kubenswrapper[29936]: E1205 13:02:40.706718 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-wnccs" podUID="19f12de2-a4f5-42b7-bce4-2bc439b9f61a" Dec 05 13:02:40.710856 master-0 kubenswrapper[29936]: E1205 13:02:40.708304 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" podUID="6e8afa75-0149-45e2-8015-1c519267961c" Dec 05 13:02:43.501741 master-0 kubenswrapper[29936]: I1205 13:02:43.501635 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-rprkt\" (UID: \"e64b46f2-d1e1-462f-952d-a7a7f8c663e9\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" Dec 05 13:02:43.502421 master-0 kubenswrapper[29936]: E1205 13:02:43.501895 29936 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 13:02:43.502421 master-0 kubenswrapper[29936]: E1205 13:02:43.502032 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert podName:e64b46f2-d1e1-462f-952d-a7a7f8c663e9 nodeName:}" failed. No retries permitted until 2025-12-05 13:02:51.50197905 +0000 UTC m=+768.634058731 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-rprkt" (UID: "e64b46f2-d1e1-462f-952d-a7a7f8c663e9") : secret "infra-operator-webhook-server-cert" not found Dec 05 13:02:44.119407 master-0 kubenswrapper[29936]: I1205 13:02:44.118327 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746gr6c9\" (UID: \"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:02:44.119407 master-0 kubenswrapper[29936]: E1205 13:02:44.118610 29936 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 13:02:44.119407 master-0 kubenswrapper[29936]: E1205 13:02:44.118744 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert podName:f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9 nodeName:}" failed. No retries permitted until 2025-12-05 13:02:52.118717388 +0000 UTC m=+769.250797069 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert") pod "openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" (UID: "f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 13:02:44.529164 master-0 kubenswrapper[29936]: I1205 13:02:44.529043 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:44.529856 master-0 kubenswrapper[29936]: I1205 13:02:44.529481 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:44.529856 master-0 kubenswrapper[29936]: E1205 13:02:44.529624 29936 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 13:02:44.529856 master-0 kubenswrapper[29936]: E1205 13:02:44.529738 29936 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 13:02:44.529856 master-0 kubenswrapper[29936]: E1205 13:02:44.529807 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs podName:19e3b314-b614-4428-85c6-9e6b9f5e01bc nodeName:}" failed. No retries permitted until 2025-12-05 13:02:52.529766696 +0000 UTC m=+769.661846377 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs") pod "openstack-operator-controller-manager-599cfccd85-6klph" (UID: "19e3b314-b614-4428-85c6-9e6b9f5e01bc") : secret "webhook-server-cert" not found Dec 05 13:02:44.529856 master-0 kubenswrapper[29936]: E1205 13:02:44.529846 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs podName:19e3b314-b614-4428-85c6-9e6b9f5e01bc nodeName:}" failed. No retries permitted until 2025-12-05 13:02:52.529821298 +0000 UTC m=+769.661901159 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs") pod "openstack-operator-controller-manager-599cfccd85-6klph" (UID: "19e3b314-b614-4428-85c6-9e6b9f5e01bc") : secret "metrics-server-cert" not found Dec 05 13:02:51.567197 master-0 kubenswrapper[29936]: I1205 13:02:51.567107 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-rprkt\" (UID: \"e64b46f2-d1e1-462f-952d-a7a7f8c663e9\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" Dec 05 13:02:51.568007 master-0 kubenswrapper[29936]: E1205 13:02:51.567616 29936 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 13:02:51.568007 master-0 kubenswrapper[29936]: E1205 13:02:51.567730 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert podName:e64b46f2-d1e1-462f-952d-a7a7f8c663e9 nodeName:}" failed. No retries permitted until 2025-12-05 13:03:07.567705209 +0000 UTC m=+784.699784890 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-rprkt" (UID: "e64b46f2-d1e1-462f-952d-a7a7f8c663e9") : secret "infra-operator-webhook-server-cert" not found Dec 05 13:02:52.184022 master-0 kubenswrapper[29936]: I1205 13:02:52.183604 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746gr6c9\" (UID: \"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:02:52.192905 master-0 kubenswrapper[29936]: I1205 13:02:52.192822 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9-cert\") pod \"openstack-baremetal-operator-controller-manager-6f998f5746gr6c9\" (UID: \"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:02:52.282380 master-0 kubenswrapper[29936]: I1205 13:02:52.282303 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:02:52.593911 master-0 kubenswrapper[29936]: I1205 13:02:52.593824 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:52.594782 master-0 kubenswrapper[29936]: I1205 13:02:52.593956 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:02:52.594782 master-0 kubenswrapper[29936]: E1205 13:02:52.594296 29936 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 13:02:52.594782 master-0 kubenswrapper[29936]: E1205 13:02:52.594371 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs podName:19e3b314-b614-4428-85c6-9e6b9f5e01bc nodeName:}" failed. No retries permitted until 2025-12-05 13:03:08.594346574 +0000 UTC m=+785.726426255 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs") pod "openstack-operator-controller-manager-599cfccd85-6klph" (UID: "19e3b314-b614-4428-85c6-9e6b9f5e01bc") : secret "webhook-server-cert" not found Dec 05 13:02:52.594894 master-0 kubenswrapper[29936]: E1205 13:02:52.594784 29936 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 13:02:52.594894 master-0 kubenswrapper[29936]: E1205 13:02:52.594809 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs podName:19e3b314-b614-4428-85c6-9e6b9f5e01bc nodeName:}" failed. No retries permitted until 2025-12-05 13:03:08.594801476 +0000 UTC m=+785.726881157 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs") pod "openstack-operator-controller-manager-599cfccd85-6klph" (UID: "19e3b314-b614-4428-85c6-9e6b9f5e01bc") : secret "metrics-server-cert" not found Dec 05 13:03:07.660712 master-0 kubenswrapper[29936]: I1205 13:03:07.660634 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-rprkt\" (UID: \"e64b46f2-d1e1-462f-952d-a7a7f8c663e9\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" Dec 05 13:03:07.668723 master-0 kubenswrapper[29936]: I1205 13:03:07.668656 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e64b46f2-d1e1-462f-952d-a7a7f8c663e9-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-rprkt\" (UID: \"e64b46f2-d1e1-462f-952d-a7a7f8c663e9\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" Dec 05 13:03:07.673252 master-0 kubenswrapper[29936]: I1205 13:03:07.673128 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9"] Dec 05 13:03:07.802809 master-0 kubenswrapper[29936]: I1205 13:03:07.802725 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" Dec 05 13:03:08.032782 master-0 kubenswrapper[29936]: I1205 13:03:08.032650 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-vhbn6" event={"ID":"77fe28e2-671e-4a0d-b065-0746abf83306","Type":"ContainerStarted","Data":"335d141b13bdba8eef2525252498ceedf5af0847ecc3653f4459ab1186678624"} Dec 05 13:03:08.037373 master-0 kubenswrapper[29936]: I1205 13:03:08.037301 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" event={"ID":"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9","Type":"ContainerStarted","Data":"57869abf8345885f5a76642a6420da7e3440ed1a3b57e850b2116c5a5c6d59f0"} Dec 05 13:03:08.687618 master-0 kubenswrapper[29936]: I1205 13:03:08.686567 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:03:08.687618 master-0 kubenswrapper[29936]: I1205 13:03:08.686689 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:03:08.693449 master-0 kubenswrapper[29936]: I1205 13:03:08.693392 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-webhook-certs\") pod 
\"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:03:08.694388 master-0 kubenswrapper[29936]: I1205 13:03:08.694318 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/19e3b314-b614-4428-85c6-9e6b9f5e01bc-metrics-certs\") pod \"openstack-operator-controller-manager-599cfccd85-6klph\" (UID: \"19e3b314-b614-4428-85c6-9e6b9f5e01bc\") " pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:03:08.791158 master-0 kubenswrapper[29936]: I1205 13:03:08.785540 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt"] Dec 05 13:03:08.930035 master-0 kubenswrapper[29936]: I1205 13:03:08.929889 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:03:08.998525 master-0 kubenswrapper[29936]: E1205 13:03:08.998357 29936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4bsfj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-84bc9f68f5-sgjwb_openstack-operators(7c7c4bb7-eb69-4370-8539-34ba8c63383e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 13:03:08.998695 master-0 kubenswrapper[29936]: E1205 13:03:08.998539 29936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tdnmd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000810000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-696b999796-8wd8k_openstack-operators(97d30d8c-ba5e-42e3-b5ee-8bf38b7bce02): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 13:03:09.000170 master-0 kubenswrapper[29936]: E1205 13:03:09.000127 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-696b999796-8wd8k" podUID="97d30d8c-ba5e-42e3-b5ee-8bf38b7bce02" Dec 05 13:03:09.000464 master-0 kubenswrapper[29936]: E1205 13:03:09.000417 29936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k64q4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-f8856dd79-52767_openstack-operators(72464fc8-9970-443f-98ac-1c40ada39742): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 13:03:09.000643 master-0 kubenswrapper[29936]: E1205 13:03:09.000530 29936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s8k72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-647d75769b-dbbff_openstack-operators(e371fb46-339c-440c-b53a-31daf18710ef): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 13:03:09.000986 master-0 kubenswrapper[29936]: E1205 13:03:09.000630 29936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k9426,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-7c9bfd6967-c6t95_openstack-operators(ee9e12a6-a899-4e44-b4e1-d975493b6b9c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 13:03:09.000986 master-0 kubenswrapper[29936]: E1205 13:03:09.000703 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb" podUID="7c7c4bb7-eb69-4370-8539-34ba8c63383e" Dec 05 13:03:09.000986 master-0 kubenswrapper[29936]: E1205 
13:03:09.000869 29936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f7mpf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-58b8dcc5fb-bwmdd_openstack-operators(42a27f5b-3b94-49e5-902d-abd26cb141d8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 13:03:09.001296 master-0 kubenswrapper[29936]: E1205 13:03:09.000969 29936 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nf9x4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-7cdd6b54fb-n8fr6_openstack-operators(ef3f1077-b60d-4960-95ae-418df7695587): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 13:03:09.001296 master-0 kubenswrapper[29936]: E1205 13:03:09.001069 29936 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xrq8r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-78cd4f7769-wdgwl_openstack-operators(cd1c763d-2aee-4791-a9ea-87299af8efc4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 13:03:09.002737 master-0 kubenswrapper[29936]: E1205 13:03:09.002417 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl" podUID="cd1c763d-2aee-4791-a9ea-87299af8efc4" Dec 05 13:03:09.002737 master-0 kubenswrapper[29936]: E1205 13:03:09.002472 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-52767" podUID="72464fc8-9970-443f-98ac-1c40ada39742" Dec 05 13:03:09.002737 master-0 kubenswrapper[29936]: E1205 13:03:09.002499 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff" podUID="e371fb46-339c-440c-b53a-31daf18710ef" Dec 05 13:03:09.002737 master-0 kubenswrapper[29936]: E1205 13:03:09.002521 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" podUID="ee9e12a6-a899-4e44-b4e1-d975493b6b9c" Dec 05 13:03:09.002737 master-0 kubenswrapper[29936]: E1205 13:03:09.002552 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd" podUID="42a27f5b-3b94-49e5-902d-abd26cb141d8" Dec 05 13:03:09.002737 master-0 kubenswrapper[29936]: E1205 13:03:09.002580 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6" podUID="ef3f1077-b60d-4960-95ae-418df7695587" Dec 05 13:03:09.147988 master-0 kubenswrapper[29936]: I1205 13:03:09.147909 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-z5xtp" event={"ID":"c19b3073-31ec-46d5-9534-61bedaf6de73","Type":"ContainerStarted","Data":"bf7be2c8470ad2288113e0aa3ca619186cf62b04edab938421f74935e2e1b7bd"} Dec 05 13:03:09.161540 master-0 kubenswrapper[29936]: I1205 13:03:09.161474 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-xgptw" event={"ID":"214a3e2a-d306-45aa-8b2c-966a9953028c","Type":"ContainerStarted","Data":"8c015b536eb47005ccb6b18b7c2cac1ede11d4fcd65b8b28c2a35fbae957dce8"} Dec 05 13:03:09.174691 master-0 kubenswrapper[29936]: I1205 13:03:09.170285 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-2d5g7" event={"ID":"960abe54-24ff-4b94-b5a6-7a4cb11a5466","Type":"ContainerStarted","Data":"ddd5f7949ffea96af40d8c3266a66b9b4ab19d671b8eba7d19eeed629b8f60bd"} Dec 05 13:03:09.194065 master-0 kubenswrapper[29936]: I1205 13:03:09.193988 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-696b999796-8wd8k" event={"ID":"97d30d8c-ba5e-42e3-b5ee-8bf38b7bce02","Type":"ContainerStarted","Data":"44a68b7ef7bda6486428b400f7b89a8a49f2f1df67f463c3f647c6bfa32cd15c"} Dec 05 13:03:09.194989 master-0 kubenswrapper[29936]: I1205 13:03:09.194956 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-696b999796-8wd8k" Dec 05 13:03:09.205816 master-0 kubenswrapper[29936]: E1205 13:03:09.201414 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/swift-operator-controller-manager-696b999796-8wd8k" podUID="97d30d8c-ba5e-42e3-b5ee-8bf38b7bce02" Dec 05 13:03:09.258702 master-0 kubenswrapper[29936]: I1205 13:03:09.258619 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" event={"ID":"6e8afa75-0149-45e2-8015-1c519267961c","Type":"ContainerStarted","Data":"15762cebb5368bb9e3aef28eeef55563c81d2c20b15cc1b9953470093d4e003d"} Dec 05 13:03:09.274814 master-0 kubenswrapper[29936]: I1205 13:03:09.273327 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6" event={"ID":"ef3f1077-b60d-4960-95ae-418df7695587","Type":"ContainerStarted","Data":"932eb123dbf506e3be78002e3ed87c37cfcefc72667869aa14a186123e8e0ae3"} Dec 05 13:03:09.274814 master-0 kubenswrapper[29936]: I1205 13:03:09.274622 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6" Dec 05 13:03:09.282531 master-0 kubenswrapper[29936]: E1205 13:03:09.278600 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" 
pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6" podUID="ef3f1077-b60d-4960-95ae-418df7695587" Dec 05 13:03:09.286367 master-0 kubenswrapper[29936]: I1205 13:03:09.286262 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl" event={"ID":"cd1c763d-2aee-4791-a9ea-87299af8efc4","Type":"ContainerStarted","Data":"ccfe6a1ecaf9858f49bf2dfa770be338a0a72b0148a81cbf8a57e6a3f17195ad"} Dec 05 13:03:09.290195 master-0 kubenswrapper[29936]: I1205 13:03:09.287536 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl" Dec 05 13:03:09.297204 master-0 kubenswrapper[29936]: E1205 13:03:09.294439 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl" podUID="cd1c763d-2aee-4791-a9ea-87299af8efc4" Dec 05 13:03:09.327562 master-0 kubenswrapper[29936]: I1205 13:03:09.327490 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" event={"ID":"49bd6523-c715-46b2-8112-070019badeed","Type":"ContainerStarted","Data":"45e5fcf585d5814ad5b0cbe7f317814e0dfe349afc792a92ca00c4f55bebf638"} Dec 05 13:03:09.367126 master-0 kubenswrapper[29936]: I1205 13:03:09.367033 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" event={"ID":"e64b46f2-d1e1-462f-952d-a7a7f8c663e9","Type":"ContainerStarted","Data":"77204b1d3a0cb15bc3bb886deead41ec3e5405c12dea4b8591d917490bc3bdc7"} Dec 05 13:03:09.399319 master-0 kubenswrapper[29936]: I1205 13:03:09.394439 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" event={"ID":"ee9e12a6-a899-4e44-b4e1-d975493b6b9c","Type":"ContainerStarted","Data":"96b6da81f888d72f7ddf48309b989a8dd4260e98e2589be7969922301c257b92"} Dec 05 13:03:09.399319 master-0 kubenswrapper[29936]: I1205 13:03:09.395573 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" Dec 05 13:03:09.399319 master-0 kubenswrapper[29936]: E1205 13:03:09.398507 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" podUID="ee9e12a6-a899-4e44-b4e1-d975493b6b9c" Dec 05 13:03:09.420627 master-0 kubenswrapper[29936]: I1205 13:03:09.420553 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bmxm9" event={"ID":"11364ea4-cb4b-4398-9692-aa5dd12f8a9f","Type":"ContainerStarted","Data":"92c5abe6bcf1c4a9e8e972f06531a3c260bbbe101dfd454a5cc882bd6ae3bebc"} Dec 05 13:03:09.452235 master-0 kubenswrapper[29936]: I1205 13:03:09.450568 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-jmm5j" 
event={"ID":"d74e954d-e55b-4280-a96f-14f2775fa484","Type":"ContainerStarted","Data":"0ead69260845c73f0a89f0088448f292d46617bc24624b712fe6dca5960b325a"} Dec 05 13:03:09.472959 master-0 kubenswrapper[29936]: I1205 13:03:09.472811 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd" event={"ID":"42a27f5b-3b94-49e5-902d-abd26cb141d8","Type":"ContainerStarted","Data":"92c43e1a05de7594b4dfc28c84e8fd4aa72eabd375cb7a2265127852b421e144"} Dec 05 13:03:09.476337 master-0 kubenswrapper[29936]: I1205 13:03:09.473685 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd" Dec 05 13:03:09.484827 master-0 kubenswrapper[29936]: E1205 13:03:09.481781 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd" podUID="42a27f5b-3b94-49e5-902d-abd26cb141d8" Dec 05 13:03:09.501223 master-0 kubenswrapper[29936]: I1205 13:03:09.500508 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-52767" event={"ID":"72464fc8-9970-443f-98ac-1c40ada39742","Type":"ContainerStarted","Data":"6cdfc49a5421ad8f54ad760aa6c0438ffb7cec03ccdccc549a942a319965b13d"} Dec 05 13:03:09.501677 master-0 kubenswrapper[29936]: I1205 13:03:09.501593 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-52767" Dec 05 13:03:09.510920 master-0 kubenswrapper[29936]: E1205 13:03:09.505759 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-52767" podUID="72464fc8-9970-443f-98ac-1c40ada39742" Dec 05 13:03:09.557233 master-0 kubenswrapper[29936]: I1205 13:03:09.551760 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-bdv2j" event={"ID":"e083a41a-8cd3-458d-a55d-cf3be20f0618","Type":"ContainerStarted","Data":"8d981d60325ead78b6d35cc322c715c96723ae4324ee57bbc0d8318f13d6a9ca"} Dec 05 13:03:09.557233 master-0 kubenswrapper[29936]: I1205 13:03:09.554076 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-wnccs" event={"ID":"19f12de2-a4f5-42b7-bce4-2bc439b9f61a","Type":"ContainerStarted","Data":"fd833382f0b73407ca12324b2a15a64bac12474c85a06199917b27ab30001039"} Dec 05 13:03:09.630604 master-0 kubenswrapper[29936]: I1205 13:03:09.630508 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb" event={"ID":"7c7c4bb7-eb69-4370-8539-34ba8c63383e","Type":"ContainerStarted","Data":"410a46577eb9be9045bc2c479c275caf37ba6c340b9c6121a51e02ae01fdec3f"} Dec 05 13:03:09.637423 master-0 kubenswrapper[29936]: I1205 13:03:09.634260 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb" Dec 05 13:03:09.649252 master-0 
kubenswrapper[29936]: E1205 13:03:09.643698 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb" podUID="7c7c4bb7-eb69-4370-8539-34ba8c63383e" Dec 05 13:03:09.734344 master-0 kubenswrapper[29936]: I1205 13:03:09.727796 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-647f96877-mm5qc" event={"ID":"dae79856-19f7-42a9-9eb2-d4263c763a58","Type":"ContainerStarted","Data":"03609120c4462ce0203047428cd1ecf0aff4ad58e3a9c3cb3049921649f1f844"} Dec 05 13:03:09.800232 master-0 kubenswrapper[29936]: I1205 13:03:09.800039 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff" event={"ID":"e371fb46-339c-440c-b53a-31daf18710ef","Type":"ContainerStarted","Data":"5a495d9f1303af477a3e7d2f5d24c3b5b61e466b86eea649741da7220b56a525"} Dec 05 13:03:09.816449 master-0 kubenswrapper[29936]: I1205 13:03:09.800674 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff" Dec 05 13:03:09.816449 master-0 kubenswrapper[29936]: E1205 13:03:09.805939 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff" podUID="e371fb46-339c-440c-b53a-31daf18710ef" Dec 05 13:03:09.886208 master-0 kubenswrapper[29936]: I1205 13:03:09.868518 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-sxfv7" event={"ID":"6662695e-b061-402f-935b-c08c7eea8265","Type":"ContainerStarted","Data":"5a1bd9ec6bd375dd9004b7462d21308dcbe02545997a8a46d2363fa4d2bcd476"} Dec 05 13:03:09.958053 master-0 kubenswrapper[29936]: I1205 13:03:09.943269 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-wnccs" podStartSLOduration=4.733887686 podStartE2EDuration="33.943242712s" podCreationTimestamp="2025-12-05 13:02:36 +0000 UTC" firstStartedPulling="2025-12-05 13:02:38.63403778 +0000 UTC m=+755.766117461" lastFinishedPulling="2025-12-05 13:03:07.843392816 +0000 UTC m=+784.975472487" observedRunningTime="2025-12-05 13:03:09.858779898 +0000 UTC m=+786.990859599" watchObservedRunningTime="2025-12-05 13:03:09.943242712 +0000 UTC m=+787.075322393" Dec 05 13:03:10.246558 master-0 kubenswrapper[29936]: W1205 13:03:10.239368 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19e3b314_b614_4428_85c6_9e6b9f5e01bc.slice/crio-8d7c5a75c91f597b5dc428c43b8fe0a1073d324336338b4563ab82e93159e615 WatchSource:0}: Error finding container 8d7c5a75c91f597b5dc428c43b8fe0a1073d324336338b4563ab82e93159e615: Status 404 returned error can't find the container with id 8d7c5a75c91f597b5dc428c43b8fe0a1073d324336338b4563ab82e93159e615 Dec 05 13:03:10.261401 master-0 kubenswrapper[29936]: I1205 13:03:10.261316 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph"] Dec 05 13:03:10.884682 master-0 kubenswrapper[29936]: I1205 13:03:10.884457 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" event={"ID":"19e3b314-b614-4428-85c6-9e6b9f5e01bc","Type":"ContainerStarted","Data":"8d7c5a75c91f597b5dc428c43b8fe0a1073d324336338b4563ab82e93159e615"} Dec 05 13:03:10.886063 master-0 kubenswrapper[29936]: E1205 13:03:10.885985 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" podUID="ee9e12a6-a899-4e44-b4e1-d975493b6b9c" Dec 05 13:03:10.886605 master-0 kubenswrapper[29936]: E1205 13:03:10.886354 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff" podUID="e371fb46-339c-440c-b53a-31daf18710ef" Dec 05 13:03:10.886605 master-0 kubenswrapper[29936]: E1205 13:03:10.886452 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl" podUID="cd1c763d-2aee-4791-a9ea-87299af8efc4" Dec 05 13:03:10.886605 master-0 kubenswrapper[29936]: E1205 13:03:10.886468 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd" podUID="42a27f5b-3b94-49e5-902d-abd26cb141d8" Dec 05 13:03:10.886605 master-0 kubenswrapper[29936]: E1205 13:03:10.886566 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb" podUID="7c7c4bb7-eb69-4370-8539-34ba8c63383e" Dec 05 13:03:10.886605 master-0 kubenswrapper[29936]: E1205 13:03:10.886573 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-52767" podUID="72464fc8-9970-443f-98ac-1c40ada39742" Dec 05 13:03:10.886857 master-0 kubenswrapper[29936]: E1205 13:03:10.886633 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/swift-operator-controller-manager-696b999796-8wd8k" podUID="97d30d8c-ba5e-42e3-b5ee-8bf38b7bce02" Dec 05 13:03:10.887305 master-0 kubenswrapper[29936]: E1205 13:03:10.887238 29936 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6" podUID="ef3f1077-b60d-4960-95ae-418df7695587" Dec 05 13:03:12.909546 master-0 kubenswrapper[29936]: I1205 13:03:12.909304 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" event={"ID":"19e3b314-b614-4428-85c6-9e6b9f5e01bc","Type":"ContainerStarted","Data":"ea5e554398c3b1cf165470ea964bd0f18543275e688904c1ea3227b2a4037724"} Dec 05 13:03:12.909546 master-0 kubenswrapper[29936]: I1205 13:03:12.909544 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:03:14.503570 master-0 kubenswrapper[29936]: I1205 13:03:14.503468 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" podStartSLOduration=39.5034309 podStartE2EDuration="39.5034309s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:03:14.496505722 +0000 UTC m=+791.628585403" watchObservedRunningTime="2025-12-05 13:03:14.5034309 +0000 UTC m=+791.635510591" Dec 05 13:03:15.583189 master-0 kubenswrapper[29936]: I1205 13:03:15.583103 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb" Dec 05 13:03:15.585720 master-0 kubenswrapper[29936]: I1205 13:03:15.585616 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-52767" Dec 05 13:03:15.587608 master-0 kubenswrapper[29936]: E1205 13:03:15.586865 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb" podUID="7c7c4bb7-eb69-4370-8539-34ba8c63383e" Dec 05 13:03:15.587608 master-0 kubenswrapper[29936]: E1205 13:03:15.587472 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-52767" podUID="72464fc8-9970-443f-98ac-1c40ada39742" Dec 05 13:03:15.627281 master-0 kubenswrapper[29936]: I1205 13:03:15.627192 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl" Dec 05 13:03:15.629670 master-0 kubenswrapper[29936]: E1205 13:03:15.629591 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl" podUID="cd1c763d-2aee-4791-a9ea-87299af8efc4" Dec 05 
13:03:16.105863 master-0 kubenswrapper[29936]: I1205 13:03:16.105712 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" Dec 05 13:03:16.109658 master-0 kubenswrapper[29936]: E1205 13:03:16.109567 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" podUID="ee9e12a6-a899-4e44-b4e1-d975493b6b9c" Dec 05 13:03:16.154700 master-0 kubenswrapper[29936]: I1205 13:03:16.154641 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd" Dec 05 13:03:16.157398 master-0 kubenswrapper[29936]: E1205 13:03:16.157371 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd" podUID="42a27f5b-3b94-49e5-902d-abd26cb141d8" Dec 05 13:03:16.228442 master-0 kubenswrapper[29936]: I1205 13:03:16.228368 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff" Dec 05 13:03:16.231313 master-0 kubenswrapper[29936]: E1205 13:03:16.231241 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff" podUID="e371fb46-339c-440c-b53a-31daf18710ef" Dec 05 13:03:16.360830 master-0 kubenswrapper[29936]: I1205 13:03:16.360705 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6" Dec 05 13:03:16.362575 master-0 kubenswrapper[29936]: E1205 13:03:16.362539 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6" podUID="ef3f1077-b60d-4960-95ae-418df7695587" Dec 05 13:03:16.773874 master-0 kubenswrapper[29936]: I1205 13:03:16.773758 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-696b999796-8wd8k" Dec 05 13:03:16.776305 master-0 kubenswrapper[29936]: E1205 13:03:16.776260 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/swift-operator-controller-manager-696b999796-8wd8k" podUID="97d30d8c-ba5e-42e3-b5ee-8bf38b7bce02" Dec 05 13:03:18.942234 master-0 kubenswrapper[29936]: I1205 13:03:18.941603 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-599cfccd85-6klph" Dec 05 13:03:23.088072 
master-0 kubenswrapper[29936]: I1205 13:03:23.087384 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-2d5g7" event={"ID":"960abe54-24ff-4b94-b5a6-7a4cb11a5466","Type":"ContainerStarted","Data":"0c5999e47979bb0fde81234503706d2bcd642d8d4c88e971d26d3a57c143a5db"} Dec 05 13:03:23.088755 master-0 kubenswrapper[29936]: I1205 13:03:23.088652 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-2d5g7" Dec 05 13:03:23.090121 master-0 kubenswrapper[29936]: I1205 13:03:23.090056 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-2d5g7" Dec 05 13:03:23.092704 master-0 kubenswrapper[29936]: I1205 13:03:23.092162 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-bdv2j" event={"ID":"e083a41a-8cd3-458d-a55d-cf3be20f0618","Type":"ContainerStarted","Data":"882a67434a588a6a0a34f5134c908a22f6113699c8ba018d860b889400d1fbcb"} Dec 05 13:03:23.092704 master-0 kubenswrapper[29936]: I1205 13:03:23.092566 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-bdv2j" Dec 05 13:03:23.094232 master-0 kubenswrapper[29936]: I1205 13:03:23.094166 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-jmm5j" event={"ID":"d74e954d-e55b-4280-a96f-14f2775fa484","Type":"ContainerStarted","Data":"90db6a5a8af02fd0b5a29b331485a3c1bc6458b7b1647a28b16835de51aef220"} Dec 05 13:03:23.094332 master-0 kubenswrapper[29936]: I1205 13:03:23.094263 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-bdv2j" Dec 05 13:03:23.094699 master-0 kubenswrapper[29936]: I1205 13:03:23.094664 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-jmm5j" Dec 05 13:03:23.098203 master-0 kubenswrapper[29936]: I1205 13:03:23.098149 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-jmm5j" Dec 05 13:03:23.100949 master-0 kubenswrapper[29936]: I1205 13:03:23.099588 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-vhbn6" event={"ID":"77fe28e2-671e-4a0d-b065-0746abf83306","Type":"ContainerStarted","Data":"21fed334ed36b75537b0f7c1fd29b906b5854f24bb45988450608241e5e5949a"} Dec 05 13:03:23.100949 master-0 kubenswrapper[29936]: I1205 13:03:23.099909 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-vhbn6" Dec 05 13:03:23.102614 master-0 kubenswrapper[29936]: I1205 13:03:23.102265 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-vhbn6" Dec 05 13:03:23.103552 master-0 kubenswrapper[29936]: I1205 13:03:23.103516 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" 
event={"ID":"e64b46f2-d1e1-462f-952d-a7a7f8c663e9","Type":"ContainerStarted","Data":"9ec1508c9df51d10e1541615455b879fb5bd264f25d7d8c59ede8e341bc96a1a"} Dec 05 13:03:23.103552 master-0 kubenswrapper[29936]: I1205 13:03:23.103555 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" event={"ID":"e64b46f2-d1e1-462f-952d-a7a7f8c663e9","Type":"ContainerStarted","Data":"de577f68714cc9a2660b28a47c900c17b621c2fb105d850971cea2a07feed55d"} Dec 05 13:03:23.104238 master-0 kubenswrapper[29936]: I1205 13:03:23.103684 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" Dec 05 13:03:23.114167 master-0 kubenswrapper[29936]: I1205 13:03:23.112172 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" event={"ID":"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9","Type":"ContainerStarted","Data":"02e2831d374e00a0bca52c19f2ce566481399c2fb24ecae1bf3ec65746d69e15"} Dec 05 13:03:23.114167 master-0 kubenswrapper[29936]: I1205 13:03:23.112252 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" event={"ID":"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9","Type":"ContainerStarted","Data":"1bf2e35c67c3218c13bc82b693bd3640510210e6eaed8961cfe29daa9ba8cb73"} Dec 05 13:03:23.114167 master-0 kubenswrapper[29936]: I1205 13:03:23.113284 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:03:23.120357 master-0 kubenswrapper[29936]: I1205 13:03:23.118885 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" event={"ID":"6e8afa75-0149-45e2-8015-1c519267961c","Type":"ContainerStarted","Data":"1540c2e3b2437892092ed3223c8f3e2ee77ee4596ac290364aa297f0ffe5a16c"} Dec 05 13:03:23.120357 master-0 kubenswrapper[29936]: I1205 13:03:23.120143 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" Dec 05 13:03:23.135908 master-0 kubenswrapper[29936]: I1205 13:03:23.135696 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" Dec 05 13:03:23.138925 master-0 kubenswrapper[29936]: I1205 13:03:23.138854 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-647f96877-mm5qc" event={"ID":"dae79856-19f7-42a9-9eb2-d4263c763a58","Type":"ContainerStarted","Data":"bf6a9047547fff68b0138af1e914086adbfee6c1d02dafd49ee47f2dec10599f"} Dec 05 13:03:23.140154 master-0 kubenswrapper[29936]: I1205 13:03:23.140093 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-647f96877-mm5qc" Dec 05 13:03:23.143535 master-0 kubenswrapper[29936]: I1205 13:03:23.143476 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-647f96877-mm5qc" Dec 05 13:03:23.146634 master-0 kubenswrapper[29936]: I1205 13:03:23.146584 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" 
event={"ID":"49bd6523-c715-46b2-8112-070019badeed","Type":"ContainerStarted","Data":"9553f823edfc72d8aa6c1d2419dc2d2d22433de1965719cfbeec1371129b5c3b"} Dec 05 13:03:23.157537 master-0 kubenswrapper[29936]: I1205 13:03:23.148341 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" Dec 05 13:03:23.157537 master-0 kubenswrapper[29936]: I1205 13:03:23.152253 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" Dec 05 13:03:23.172366 master-0 kubenswrapper[29936]: I1205 13:03:23.172231 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-2d5g7" podStartSLOduration=4.539030471 podStartE2EDuration="48.172203385s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:38.586818087 +0000 UTC m=+755.718897768" lastFinishedPulling="2025-12-05 13:03:22.219991001 +0000 UTC m=+799.352070682" observedRunningTime="2025-12-05 13:03:23.13964702 +0000 UTC m=+800.271726701" watchObservedRunningTime="2025-12-05 13:03:23.172203385 +0000 UTC m=+800.304283076" Dec 05 13:03:23.186854 master-0 kubenswrapper[29936]: I1205 13:03:23.186735 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-z5xtp" event={"ID":"c19b3073-31ec-46d5-9534-61bedaf6de73","Type":"ContainerStarted","Data":"49fb78a2b26be55d43090858420efb6736d52a1945c1bad57f1a2c0f530b648c"} Dec 05 13:03:23.189863 master-0 kubenswrapper[29936]: I1205 13:03:23.189797 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-z5xtp" Dec 05 13:03:23.212020 master-0 kubenswrapper[29936]: I1205 13:03:23.211967 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-z5xtp" Dec 05 13:03:23.212020 master-0 kubenswrapper[29936]: I1205 13:03:23.212016 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-sxfv7" Dec 05 13:03:23.212020 master-0 kubenswrapper[29936]: I1205 13:03:23.212041 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-sxfv7" Dec 05 13:03:23.212474 master-0 kubenswrapper[29936]: I1205 13:03:23.212063 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-sxfv7" event={"ID":"6662695e-b061-402f-935b-c08c7eea8265","Type":"ContainerStarted","Data":"149420c9fa09a01bb9689c5d47093fdaa0fc609c36fa5d0eb5b3e4b02e1f87de"} Dec 05 13:03:23.320211 master-0 kubenswrapper[29936]: I1205 13:03:23.317117 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" podStartSLOduration=35.173972572 podStartE2EDuration="48.317086252s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:03:07.73314963 +0000 UTC m=+784.865229311" lastFinishedPulling="2025-12-05 13:03:20.87626331 +0000 UTC m=+798.008342991" observedRunningTime="2025-12-05 13:03:23.274865795 +0000 UTC m=+800.406945486" watchObservedRunningTime="2025-12-05 
13:03:23.317086252 +0000 UTC m=+800.449165933" Dec 05 13:03:23.341205 master-0 kubenswrapper[29936]: I1205 13:03:23.335682 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" podStartSLOduration=4.618743136 podStartE2EDuration="48.335656706s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:38.673881172 +0000 UTC m=+755.805960853" lastFinishedPulling="2025-12-05 13:03:22.390794742 +0000 UTC m=+799.522874423" observedRunningTime="2025-12-05 13:03:23.305823666 +0000 UTC m=+800.437903347" watchObservedRunningTime="2025-12-05 13:03:23.335656706 +0000 UTC m=+800.467736387" Dec 05 13:03:23.377856 master-0 kubenswrapper[29936]: I1205 13:03:23.377669 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-647f96877-mm5qc" podStartSLOduration=4.515278994 podStartE2EDuration="48.377645497s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:38.322069623 +0000 UTC m=+755.454149294" lastFinishedPulling="2025-12-05 13:03:22.184436116 +0000 UTC m=+799.316515797" observedRunningTime="2025-12-05 13:03:23.343345294 +0000 UTC m=+800.475424985" watchObservedRunningTime="2025-12-05 13:03:23.377645497 +0000 UTC m=+800.509725168" Dec 05 13:03:23.402870 master-0 kubenswrapper[29936]: I1205 13:03:23.401455 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-vhbn6" podStartSLOduration=3.672854475 podStartE2EDuration="48.401424033s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:37.518394146 +0000 UTC m=+754.650473827" lastFinishedPulling="2025-12-05 13:03:22.246963704 +0000 UTC m=+799.379043385" observedRunningTime="2025-12-05 13:03:23.400908059 +0000 UTC m=+800.532987760" watchObservedRunningTime="2025-12-05 13:03:23.401424033 +0000 UTC m=+800.533503714" Dec 05 13:03:23.502210 master-0 kubenswrapper[29936]: I1205 13:03:23.498475 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" podStartSLOduration=36.305690163 podStartE2EDuration="48.498447139s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:03:08.784291992 +0000 UTC m=+785.916371673" lastFinishedPulling="2025-12-05 13:03:20.977048968 +0000 UTC m=+798.109128649" observedRunningTime="2025-12-05 13:03:23.456733455 +0000 UTC m=+800.588813146" watchObservedRunningTime="2025-12-05 13:03:23.498447139 +0000 UTC m=+800.630526810" Dec 05 13:03:23.523920 master-0 kubenswrapper[29936]: I1205 13:03:23.523808 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-bdv2j" podStartSLOduration=4.356867121 podStartE2EDuration="48.523777427s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:38.28401596 +0000 UTC m=+755.416095641" lastFinishedPulling="2025-12-05 13:03:22.450926276 +0000 UTC m=+799.583005947" observedRunningTime="2025-12-05 13:03:23.498297615 +0000 UTC m=+800.630377296" watchObservedRunningTime="2025-12-05 13:03:23.523777427 +0000 UTC m=+800.655857108" Dec 05 13:03:23.601213 master-0 kubenswrapper[29936]: I1205 13:03:23.597127 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-jmm5j" podStartSLOduration=4.969471806 podStartE2EDuration="48.59709647s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:38.586895369 +0000 UTC m=+755.718975050" lastFinishedPulling="2025-12-05 13:03:22.214520023 +0000 UTC m=+799.346599714" observedRunningTime="2025-12-05 13:03:23.543766321 +0000 UTC m=+800.675846002" watchObservedRunningTime="2025-12-05 13:03:23.59709647 +0000 UTC m=+800.729176151" Dec 05 13:03:23.618599 master-0 kubenswrapper[29936]: I1205 13:03:23.618457 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-z5xtp" podStartSLOduration=3.082215116 podStartE2EDuration="48.61843648s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:36.68348952 +0000 UTC m=+753.815569201" lastFinishedPulling="2025-12-05 13:03:22.219710874 +0000 UTC m=+799.351790565" observedRunningTime="2025-12-05 13:03:23.576050558 +0000 UTC m=+800.708130239" watchObservedRunningTime="2025-12-05 13:03:23.61843648 +0000 UTC m=+800.750516161" Dec 05 13:03:24.210096 master-0 kubenswrapper[29936]: I1205 13:03:24.210012 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bmxm9" event={"ID":"11364ea4-cb4b-4398-9692-aa5dd12f8a9f","Type":"ContainerStarted","Data":"6a66687250007a61ff5640e8293575a3399dc782b03d01cf9be3620a8274a842"} Dec 05 13:03:24.210880 master-0 kubenswrapper[29936]: I1205 13:03:24.210307 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bmxm9" Dec 05 13:03:24.214786 master-0 kubenswrapper[29936]: I1205 13:03:24.214728 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bmxm9" Dec 05 13:03:24.216408 master-0 kubenswrapper[29936]: I1205 13:03:24.216314 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-xgptw" event={"ID":"214a3e2a-d306-45aa-8b2c-966a9953028c","Type":"ContainerStarted","Data":"76d6752c6e4c076186dac27deb22a888312e7f98c1959087681f0e3e16af162d"} Dec 05 13:03:24.217699 master-0 kubenswrapper[29936]: I1205 13:03:24.217663 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-xgptw" Dec 05 13:03:24.221030 master-0 kubenswrapper[29936]: I1205 13:03:24.220983 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-xgptw" Dec 05 13:03:25.537978 master-0 kubenswrapper[29936]: I1205 13:03:25.534445 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-sxfv7" podStartSLOduration=6.071974311 podStartE2EDuration="50.534418559s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:37.721989418 +0000 UTC m=+754.854069099" lastFinishedPulling="2025-12-05 13:03:22.184433666 +0000 UTC m=+799.316513347" observedRunningTime="2025-12-05 13:03:25.493753245 +0000 UTC m=+802.625832946" watchObservedRunningTime="2025-12-05 13:03:25.534418559 +0000 UTC m=+802.666498240" Dec 05 13:03:25.537978 master-0 kubenswrapper[29936]: I1205 13:03:25.537131 
29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" podStartSLOduration=6.913131178 podStartE2EDuration="50.537114503s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:38.596063908 +0000 UTC m=+755.728143589" lastFinishedPulling="2025-12-05 13:03:22.220047223 +0000 UTC m=+799.352126914" observedRunningTime="2025-12-05 13:03:25.528133299 +0000 UTC m=+802.660212980" watchObservedRunningTime="2025-12-05 13:03:25.537114503 +0000 UTC m=+802.669194184" Dec 05 13:03:25.588057 master-0 kubenswrapper[29936]: I1205 13:03:25.587935 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-xgptw" podStartSLOduration=5.488010596 podStartE2EDuration="50.587905694s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:37.607091216 +0000 UTC m=+754.739170897" lastFinishedPulling="2025-12-05 13:03:22.706986314 +0000 UTC m=+799.839065995" observedRunningTime="2025-12-05 13:03:25.579267928 +0000 UTC m=+802.711347629" watchObservedRunningTime="2025-12-05 13:03:25.587905694 +0000 UTC m=+802.719985405" Dec 05 13:03:25.621055 master-0 kubenswrapper[29936]: I1205 13:03:25.620961 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-bmxm9" podStartSLOduration=5.469852712 podStartE2EDuration="50.620936951s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:37.538867512 +0000 UTC m=+754.670947193" lastFinishedPulling="2025-12-05 13:03:22.689951751 +0000 UTC m=+799.822031432" observedRunningTime="2025-12-05 13:03:25.618036752 +0000 UTC m=+802.750116433" watchObservedRunningTime="2025-12-05 13:03:25.620936951 +0000 UTC m=+802.753016632" Dec 05 13:03:27.810493 master-0 kubenswrapper[29936]: I1205 13:03:27.810323 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-rprkt" Dec 05 13:03:28.278397 master-0 kubenswrapper[29936]: I1205 13:03:28.277863 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-52767" event={"ID":"72464fc8-9970-443f-98ac-1c40ada39742","Type":"ContainerStarted","Data":"ecbf6c4378c151d59bf5cd98c0b6aafea51db29715797ca1c873a15e8ccff0d2"} Dec 05 13:03:28.306211 master-0 kubenswrapper[29936]: I1205 13:03:28.305172 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-52767" podStartSLOduration=27.3017595 podStartE2EDuration="53.305123224s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:36.784083154 +0000 UTC m=+753.916162835" lastFinishedPulling="2025-12-05 13:03:02.787446878 +0000 UTC m=+779.919526559" observedRunningTime="2025-12-05 13:03:28.301042633 +0000 UTC m=+805.433122324" watchObservedRunningTime="2025-12-05 13:03:28.305123224 +0000 UTC m=+805.437202905" Dec 05 13:03:29.294081 master-0 kubenswrapper[29936]: I1205 13:03:29.293999 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6" event={"ID":"ef3f1077-b60d-4960-95ae-418df7695587","Type":"ContainerStarted","Data":"5c935f0820594063a487af1df29ada86316b7501ef4ba552c7e2331b98fd148a"} Dec 
05 13:03:29.299327 master-0 kubenswrapper[29936]: I1205 13:03:29.299262 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd" event={"ID":"42a27f5b-3b94-49e5-902d-abd26cb141d8","Type":"ContainerStarted","Data":"1d4933c790f5d0f87b2da6520117f7ff836fd182580999e94c0214c6abb72d04"} Dec 05 13:03:29.329768 master-0 kubenswrapper[29936]: I1205 13:03:29.329621 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-n8fr6" podStartSLOduration=28.006018257 podStartE2EDuration="54.329586081s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:38.365566535 +0000 UTC m=+755.497646216" lastFinishedPulling="2025-12-05 13:03:04.689134349 +0000 UTC m=+781.821214040" observedRunningTime="2025-12-05 13:03:29.317831982 +0000 UTC m=+806.449911683" watchObservedRunningTime="2025-12-05 13:03:29.329586081 +0000 UTC m=+806.461665762" Dec 05 13:03:29.366702 master-0 kubenswrapper[29936]: I1205 13:03:29.366583 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-bwmdd" podStartSLOduration=24.439578519 podStartE2EDuration="54.366553265s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:37.734935939 +0000 UTC m=+754.867015620" lastFinishedPulling="2025-12-05 13:03:07.661910685 +0000 UTC m=+784.793990366" observedRunningTime="2025-12-05 13:03:29.357792247 +0000 UTC m=+806.489871938" watchObservedRunningTime="2025-12-05 13:03:29.366553265 +0000 UTC m=+806.498632946" Dec 05 13:03:30.316789 master-0 kubenswrapper[29936]: I1205 13:03:30.316691 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb" event={"ID":"7c7c4bb7-eb69-4370-8539-34ba8c63383e","Type":"ContainerStarted","Data":"f707dbbdc90298bfd3caaa49ccb60b112870120c1356bd0ada60a58a11feb00a"} Dec 05 13:03:31.216437 master-0 kubenswrapper[29936]: I1205 13:03:31.216332 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-sgjwb" podStartSLOduration=30.975155532 podStartE2EDuration="56.216295995s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:37.548272568 +0000 UTC m=+754.680352249" lastFinishedPulling="2025-12-05 13:03:02.789413031 +0000 UTC m=+779.921492712" observedRunningTime="2025-12-05 13:03:31.191245965 +0000 UTC m=+808.323325656" watchObservedRunningTime="2025-12-05 13:03:31.216295995 +0000 UTC m=+808.348375676" Dec 05 13:03:32.290683 master-0 kubenswrapper[29936]: I1205 13:03:32.290588 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:03:32.350664 master-0 kubenswrapper[29936]: I1205 13:03:32.350515 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl" event={"ID":"cd1c763d-2aee-4791-a9ea-87299af8efc4","Type":"ContainerStarted","Data":"63a8bb75abef3e4c005610f07710bd702049b2689c022ff4fe76bf3c464e2cce"} Dec 05 13:03:32.353088 master-0 kubenswrapper[29936]: I1205 13:03:32.352980 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff" event={"ID":"e371fb46-339c-440c-b53a-31daf18710ef","Type":"ContainerStarted","Data":"9adf87dfa5845e121758c411781a38468ee781dc1b608739de5365124c052815"} Dec 05 13:03:32.356397 master-0 kubenswrapper[29936]: I1205 13:03:32.356312 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" event={"ID":"ee9e12a6-a899-4e44-b4e1-d975493b6b9c","Type":"ContainerStarted","Data":"830d3702094749b9ed6f4d72fdd55caf29d8c8dd58854d908f85945818bc6350"} Dec 05 13:03:33.370435 master-0 kubenswrapper[29936]: I1205 13:03:33.370347 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-696b999796-8wd8k" event={"ID":"97d30d8c-ba5e-42e3-b5ee-8bf38b7bce02","Type":"ContainerStarted","Data":"55e21231e8737dbd0b2f48cde61c17c0b62b5bb1a0551e4084097df3013adadf"} Dec 05 13:03:36.727814 master-0 kubenswrapper[29936]: I1205 13:03:36.727679 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-696b999796-8wd8k" podStartSLOduration=32.661784729 podStartE2EDuration="1m1.727629967s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:38.59616508 +0000 UTC m=+755.728244761" lastFinishedPulling="2025-12-05 13:03:07.662010318 +0000 UTC m=+784.794089999" observedRunningTime="2025-12-05 13:03:36.724950044 +0000 UTC m=+813.857029735" watchObservedRunningTime="2025-12-05 13:03:36.727629967 +0000 UTC m=+813.859709648" Dec 05 13:03:36.786891 master-0 kubenswrapper[29936]: I1205 13:03:36.786772 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-wdgwl" podStartSLOduration=34.461618024 podStartE2EDuration="1m1.786745963s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:37.15789695 +0000 UTC m=+754.289976631" lastFinishedPulling="2025-12-05 13:03:04.483024899 +0000 UTC m=+781.615104570" observedRunningTime="2025-12-05 13:03:36.786703082 +0000 UTC m=+813.918782783" watchObservedRunningTime="2025-12-05 13:03:36.786745963 +0000 UTC m=+813.918825664" Dec 05 13:03:36.894139 master-0 kubenswrapper[29936]: I1205 13:03:36.894015 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-dbbff" podStartSLOduration=36.812605563 podStartE2EDuration="1m1.893984776s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:37.849761489 +0000 UTC m=+754.981841170" lastFinishedPulling="2025-12-05 13:03:02.931140702 +0000 UTC m=+780.063220383" observedRunningTime="2025-12-05 13:03:36.818709442 +0000 UTC m=+813.950789143" watchObservedRunningTime="2025-12-05 13:03:36.893984776 +0000 UTC m=+814.026064457" Dec 05 13:03:36.934089 master-0 kubenswrapper[29936]: I1205 13:03:36.933972 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" podStartSLOduration=36.594125348 podStartE2EDuration="1m1.933937522s" podCreationTimestamp="2025-12-05 13:02:35 +0000 UTC" firstStartedPulling="2025-12-05 13:02:37.585077518 +0000 UTC m=+754.717157199" lastFinishedPulling="2025-12-05 13:03:02.924889692 +0000 UTC m=+780.056969373" observedRunningTime="2025-12-05 13:03:36.86651166 +0000 UTC m=+813.998591341" 
watchObservedRunningTime="2025-12-05 13:03:36.933937522 +0000 UTC m=+814.066017193" Dec 05 13:04:18.366712 master-0 kubenswrapper[29936]: I1205 13:04:18.365756 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd"] Dec 05 13:04:18.371238 master-0 kubenswrapper[29936]: I1205 13:04:18.370793 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd" Dec 05 13:04:18.382513 master-0 kubenswrapper[29936]: I1205 13:04:18.382446 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 05 13:04:18.383208 master-0 kubenswrapper[29936]: I1205 13:04:18.383171 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 05 13:04:18.383509 master-0 kubenswrapper[29936]: I1205 13:04:18.383492 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 05 13:04:18.396009 master-0 kubenswrapper[29936]: I1205 13:04:18.395866 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd"] Dec 05 13:04:18.470379 master-0 kubenswrapper[29936]: I1205 13:04:18.470275 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-jfj4q"] Dec 05 13:04:18.472778 master-0 kubenswrapper[29936]: I1205 13:04:18.472717 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75d7c5dbd7-jfj4q" Dec 05 13:04:18.478291 master-0 kubenswrapper[29936]: I1205 13:04:18.478137 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 05 13:04:18.492523 master-0 kubenswrapper[29936]: I1205 13:04:18.491840 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-jfj4q"] Dec 05 13:04:18.561218 master-0 kubenswrapper[29936]: I1205 13:04:18.551535 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c44tm\" (UniqueName: \"kubernetes.io/projected/6bec31da-6702-489f-9b71-abf49a5607c9-kube-api-access-c44tm\") pod \"dnsmasq-dns-5dbfd7c4bf-n2wtd\" (UID: \"6bec31da-6702-489f-9b71-abf49a5607c9\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd" Dec 05 13:04:18.561218 master-0 kubenswrapper[29936]: I1205 13:04:18.551690 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bec31da-6702-489f-9b71-abf49a5607c9-config\") pod \"dnsmasq-dns-5dbfd7c4bf-n2wtd\" (UID: \"6bec31da-6702-489f-9b71-abf49a5607c9\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd" Dec 05 13:04:18.653811 master-0 kubenswrapper[29936]: I1205 13:04:18.653618 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bec31da-6702-489f-9b71-abf49a5607c9-config\") pod \"dnsmasq-dns-5dbfd7c4bf-n2wtd\" (UID: \"6bec31da-6702-489f-9b71-abf49a5607c9\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd" Dec 05 13:04:18.653811 master-0 kubenswrapper[29936]: I1205 13:04:18.653754 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb2ln\" (UniqueName: \"kubernetes.io/projected/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-kube-api-access-gb2ln\") pod \"dnsmasq-dns-75d7c5dbd7-jfj4q\" (UID: \"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-jfj4q" Dec 
05 13:04:18.653811 master-0 kubenswrapper[29936]: I1205 13:04:18.653803 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c44tm\" (UniqueName: \"kubernetes.io/projected/6bec31da-6702-489f-9b71-abf49a5607c9-kube-api-access-c44tm\") pod \"dnsmasq-dns-5dbfd7c4bf-n2wtd\" (UID: \"6bec31da-6702-489f-9b71-abf49a5607c9\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd" Dec 05 13:04:18.654199 master-0 kubenswrapper[29936]: I1205 13:04:18.653859 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-config\") pod \"dnsmasq-dns-75d7c5dbd7-jfj4q\" (UID: \"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-jfj4q" Dec 05 13:04:18.654199 master-0 kubenswrapper[29936]: I1205 13:04:18.653931 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-dns-svc\") pod \"dnsmasq-dns-75d7c5dbd7-jfj4q\" (UID: \"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-jfj4q" Dec 05 13:04:18.655311 master-0 kubenswrapper[29936]: I1205 13:04:18.655272 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bec31da-6702-489f-9b71-abf49a5607c9-config\") pod \"dnsmasq-dns-5dbfd7c4bf-n2wtd\" (UID: \"6bec31da-6702-489f-9b71-abf49a5607c9\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd" Dec 05 13:04:18.675391 master-0 kubenswrapper[29936]: I1205 13:04:18.675303 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c44tm\" (UniqueName: \"kubernetes.io/projected/6bec31da-6702-489f-9b71-abf49a5607c9-kube-api-access-c44tm\") pod \"dnsmasq-dns-5dbfd7c4bf-n2wtd\" (UID: \"6bec31da-6702-489f-9b71-abf49a5607c9\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd" Dec 05 13:04:18.700668 master-0 kubenswrapper[29936]: I1205 13:04:18.700567 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd" Dec 05 13:04:18.755839 master-0 kubenswrapper[29936]: I1205 13:04:18.755746 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-config\") pod \"dnsmasq-dns-75d7c5dbd7-jfj4q\" (UID: \"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-jfj4q" Dec 05 13:04:18.756147 master-0 kubenswrapper[29936]: I1205 13:04:18.755904 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-dns-svc\") pod \"dnsmasq-dns-75d7c5dbd7-jfj4q\" (UID: \"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-jfj4q" Dec 05 13:04:18.756147 master-0 kubenswrapper[29936]: I1205 13:04:18.756018 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb2ln\" (UniqueName: \"kubernetes.io/projected/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-kube-api-access-gb2ln\") pod \"dnsmasq-dns-75d7c5dbd7-jfj4q\" (UID: \"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-jfj4q" Dec 05 13:04:18.758567 master-0 kubenswrapper[29936]: I1205 13:04:18.757098 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-dns-svc\") pod \"dnsmasq-dns-75d7c5dbd7-jfj4q\" (UID: \"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-jfj4q" Dec 05 13:04:18.758567 master-0 kubenswrapper[29936]: I1205 13:04:18.757147 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-config\") pod \"dnsmasq-dns-75d7c5dbd7-jfj4q\" (UID: \"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-jfj4q" Dec 05 13:04:18.778778 master-0 kubenswrapper[29936]: I1205 13:04:18.778688 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb2ln\" (UniqueName: \"kubernetes.io/projected/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-kube-api-access-gb2ln\") pod \"dnsmasq-dns-75d7c5dbd7-jfj4q\" (UID: \"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-jfj4q" Dec 05 13:04:18.817011 master-0 kubenswrapper[29936]: I1205 13:04:18.816930 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75d7c5dbd7-jfj4q" Dec 05 13:04:19.223123 master-0 kubenswrapper[29936]: I1205 13:04:19.223064 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd"] Dec 05 13:04:19.339629 master-0 kubenswrapper[29936]: I1205 13:04:19.339407 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-jfj4q"] Dec 05 13:04:19.625293 master-0 kubenswrapper[29936]: I1205 13:04:19.625129 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd"] Dec 05 13:04:19.679743 master-0 kubenswrapper[29936]: I1205 13:04:19.679669 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9bff68687-qw5gj"] Dec 05 13:04:19.684456 master-0 kubenswrapper[29936]: I1205 13:04:19.682373 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9bff68687-qw5gj" Dec 05 13:04:19.735407 master-0 kubenswrapper[29936]: I1205 13:04:19.714155 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9bff68687-qw5gj"] Dec 05 13:04:19.783686 master-0 kubenswrapper[29936]: I1205 13:04:19.783602 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59597\" (UniqueName: \"kubernetes.io/projected/dd6f05c5-1783-4c1f-83b8-130470139655-kube-api-access-59597\") pod \"dnsmasq-dns-9bff68687-qw5gj\" (UID: \"dd6f05c5-1783-4c1f-83b8-130470139655\") " pod="openstack/dnsmasq-dns-9bff68687-qw5gj" Dec 05 13:04:19.783983 master-0 kubenswrapper[29936]: I1205 13:04:19.783721 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd6f05c5-1783-4c1f-83b8-130470139655-config\") pod \"dnsmasq-dns-9bff68687-qw5gj\" (UID: \"dd6f05c5-1783-4c1f-83b8-130470139655\") " pod="openstack/dnsmasq-dns-9bff68687-qw5gj" Dec 05 13:04:19.783983 master-0 kubenswrapper[29936]: I1205 13:04:19.783824 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd6f05c5-1783-4c1f-83b8-130470139655-dns-svc\") pod \"dnsmasq-dns-9bff68687-qw5gj\" (UID: \"dd6f05c5-1783-4c1f-83b8-130470139655\") " pod="openstack/dnsmasq-dns-9bff68687-qw5gj" Dec 05 13:04:19.886610 master-0 kubenswrapper[29936]: I1205 13:04:19.886278 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd6f05c5-1783-4c1f-83b8-130470139655-dns-svc\") pod \"dnsmasq-dns-9bff68687-qw5gj\" (UID: \"dd6f05c5-1783-4c1f-83b8-130470139655\") " pod="openstack/dnsmasq-dns-9bff68687-qw5gj" Dec 05 13:04:19.886610 master-0 kubenswrapper[29936]: I1205 13:04:19.886412 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59597\" (UniqueName: \"kubernetes.io/projected/dd6f05c5-1783-4c1f-83b8-130470139655-kube-api-access-59597\") pod \"dnsmasq-dns-9bff68687-qw5gj\" (UID: \"dd6f05c5-1783-4c1f-83b8-130470139655\") " pod="openstack/dnsmasq-dns-9bff68687-qw5gj" Dec 05 13:04:19.886610 master-0 kubenswrapper[29936]: I1205 13:04:19.886487 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd6f05c5-1783-4c1f-83b8-130470139655-config\") pod \"dnsmasq-dns-9bff68687-qw5gj\" (UID: \"dd6f05c5-1783-4c1f-83b8-130470139655\") " pod="openstack/dnsmasq-dns-9bff68687-qw5gj" Dec 05 13:04:19.888967 master-0 kubenswrapper[29936]: I1205 13:04:19.888274 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd6f05c5-1783-4c1f-83b8-130470139655-config\") pod \"dnsmasq-dns-9bff68687-qw5gj\" (UID: \"dd6f05c5-1783-4c1f-83b8-130470139655\") " pod="openstack/dnsmasq-dns-9bff68687-qw5gj" Dec 05 13:04:19.888967 master-0 kubenswrapper[29936]: I1205 13:04:19.888275 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd6f05c5-1783-4c1f-83b8-130470139655-dns-svc\") pod \"dnsmasq-dns-9bff68687-qw5gj\" (UID: \"dd6f05c5-1783-4c1f-83b8-130470139655\") " pod="openstack/dnsmasq-dns-9bff68687-qw5gj" Dec 05 13:04:19.914325 master-0 kubenswrapper[29936]: I1205 13:04:19.914244 29936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-59597\" (UniqueName: \"kubernetes.io/projected/dd6f05c5-1783-4c1f-83b8-130470139655-kube-api-access-59597\") pod \"dnsmasq-dns-9bff68687-qw5gj\" (UID: \"dd6f05c5-1783-4c1f-83b8-130470139655\") " pod="openstack/dnsmasq-dns-9bff68687-qw5gj" Dec 05 13:04:19.919318 master-0 kubenswrapper[29936]: I1205 13:04:19.919273 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d7c5dbd7-jfj4q" event={"ID":"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18","Type":"ContainerStarted","Data":"0abeaedaf44fb02842cfc2f9b3844218c08c4ef65a140d4884cc817c68378705"} Dec 05 13:04:19.921441 master-0 kubenswrapper[29936]: I1205 13:04:19.921420 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd" event={"ID":"6bec31da-6702-489f-9b71-abf49a5607c9","Type":"ContainerStarted","Data":"37480124f66d030ae536fdc7a790f51e72de9587206a9acfb72d3c565f86947b"} Dec 05 13:04:20.076881 master-0 kubenswrapper[29936]: I1205 13:04:20.076786 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9bff68687-qw5gj" Dec 05 13:04:20.564290 master-0 kubenswrapper[29936]: I1205 13:04:20.564210 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-jfj4q"] Dec 05 13:04:20.614633 master-0 kubenswrapper[29936]: I1205 13:04:20.611641 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-f5crt"] Dec 05 13:04:20.619820 master-0 kubenswrapper[29936]: I1205 13:04:20.616669 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658bb5765c-f5crt" Dec 05 13:04:20.642368 master-0 kubenswrapper[29936]: I1205 13:04:20.627979 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-f5crt"] Dec 05 13:04:20.665294 master-0 kubenswrapper[29936]: I1205 13:04:20.663462 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9bff68687-qw5gj"] Dec 05 13:04:20.687935 master-0 kubenswrapper[29936]: W1205 13:04:20.687805 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd6f05c5_1783_4c1f_83b8_130470139655.slice/crio-eb59d1e9d1713d7debb8653fad051f97d3b5718e956cafbca8f3bbedf4803720 WatchSource:0}: Error finding container eb59d1e9d1713d7debb8653fad051f97d3b5718e956cafbca8f3bbedf4803720: Status 404 returned error can't find the container with id eb59d1e9d1713d7debb8653fad051f97d3b5718e956cafbca8f3bbedf4803720 Dec 05 13:04:20.756655 master-0 kubenswrapper[29936]: I1205 13:04:20.756578 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwmj2\" (UniqueName: \"kubernetes.io/projected/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-kube-api-access-vwmj2\") pod \"dnsmasq-dns-658bb5765c-f5crt\" (UID: \"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9\") " pod="openstack/dnsmasq-dns-658bb5765c-f5crt" Dec 05 13:04:20.756754 master-0 kubenswrapper[29936]: I1205 13:04:20.756726 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-config\") pod \"dnsmasq-dns-658bb5765c-f5crt\" (UID: \"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9\") " pod="openstack/dnsmasq-dns-658bb5765c-f5crt" Dec 05 13:04:20.756820 master-0 kubenswrapper[29936]: I1205 13:04:20.756784 29936 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-dns-svc\") pod \"dnsmasq-dns-658bb5765c-f5crt\" (UID: \"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9\") " pod="openstack/dnsmasq-dns-658bb5765c-f5crt" Dec 05 13:04:20.858675 master-0 kubenswrapper[29936]: I1205 13:04:20.858594 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-config\") pod \"dnsmasq-dns-658bb5765c-f5crt\" (UID: \"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9\") " pod="openstack/dnsmasq-dns-658bb5765c-f5crt" Dec 05 13:04:20.858675 master-0 kubenswrapper[29936]: I1205 13:04:20.858675 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-dns-svc\") pod \"dnsmasq-dns-658bb5765c-f5crt\" (UID: \"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9\") " pod="openstack/dnsmasq-dns-658bb5765c-f5crt" Dec 05 13:04:20.859093 master-0 kubenswrapper[29936]: I1205 13:04:20.858785 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwmj2\" (UniqueName: \"kubernetes.io/projected/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-kube-api-access-vwmj2\") pod \"dnsmasq-dns-658bb5765c-f5crt\" (UID: \"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9\") " pod="openstack/dnsmasq-dns-658bb5765c-f5crt" Dec 05 13:04:20.860267 master-0 kubenswrapper[29936]: I1205 13:04:20.860233 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-config\") pod \"dnsmasq-dns-658bb5765c-f5crt\" (UID: \"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9\") " pod="openstack/dnsmasq-dns-658bb5765c-f5crt" Dec 05 13:04:20.865434 master-0 kubenswrapper[29936]: I1205 13:04:20.865377 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-dns-svc\") pod \"dnsmasq-dns-658bb5765c-f5crt\" (UID: \"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9\") " pod="openstack/dnsmasq-dns-658bb5765c-f5crt" Dec 05 13:04:20.896300 master-0 kubenswrapper[29936]: I1205 13:04:20.896239 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwmj2\" (UniqueName: \"kubernetes.io/projected/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-kube-api-access-vwmj2\") pod \"dnsmasq-dns-658bb5765c-f5crt\" (UID: \"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9\") " pod="openstack/dnsmasq-dns-658bb5765c-f5crt" Dec 05 13:04:20.952579 master-0 kubenswrapper[29936]: I1205 13:04:20.952017 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9bff68687-qw5gj" event={"ID":"dd6f05c5-1783-4c1f-83b8-130470139655","Type":"ContainerStarted","Data":"eb59d1e9d1713d7debb8653fad051f97d3b5718e956cafbca8f3bbedf4803720"} Dec 05 13:04:20.967325 master-0 kubenswrapper[29936]: I1205 13:04:20.967090 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-658bb5765c-f5crt" Dec 05 13:04:21.534166 master-0 kubenswrapper[29936]: I1205 13:04:21.527145 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-f5crt"] Dec 05 13:04:21.989234 master-0 kubenswrapper[29936]: I1205 13:04:21.989083 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658bb5765c-f5crt" event={"ID":"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9","Type":"ContainerStarted","Data":"333f7afe247c70d1cd8e079a3c4d4b7d0323c4df655b38b3810109a6584b584e"} Dec 05 13:04:23.911780 master-0 kubenswrapper[29936]: I1205 13:04:23.911444 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 13:04:23.915845 master-0 kubenswrapper[29936]: I1205 13:04:23.915799 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:23.919222 master-0 kubenswrapper[29936]: I1205 13:04:23.919119 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 13:04:23.919413 master-0 kubenswrapper[29936]: I1205 13:04:23.919388 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 13:04:23.919669 master-0 kubenswrapper[29936]: I1205 13:04:23.919648 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 13:04:23.924366 master-0 kubenswrapper[29936]: I1205 13:04:23.919841 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 13:04:23.924366 master-0 kubenswrapper[29936]: I1205 13:04:23.919972 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 13:04:23.924366 master-0 kubenswrapper[29936]: I1205 13:04:23.920592 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 13:04:23.953154 master-0 kubenswrapper[29936]: I1205 13:04:23.952863 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 13:04:24.010488 master-0 kubenswrapper[29936]: I1205 13:04:24.010398 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a34de370-e400-45de-adbb-09995d9c1953-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.010816 master-0 kubenswrapper[29936]: I1205 13:04:24.010731 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jqlw\" (UniqueName: \"kubernetes.io/projected/a34de370-e400-45de-adbb-09995d9c1953-kube-api-access-6jqlw\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.010868 master-0 kubenswrapper[29936]: I1205 13:04:24.010815 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a34de370-e400-45de-adbb-09995d9c1953-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.010912 master-0 kubenswrapper[29936]: I1205 13:04:24.010869 
29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a34de370-e400-45de-adbb-09995d9c1953-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.010965 master-0 kubenswrapper[29936]: I1205 13:04:24.010934 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a34de370-e400-45de-adbb-09995d9c1953-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.011081 master-0 kubenswrapper[29936]: I1205 13:04:24.011000 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a34de370-e400-45de-adbb-09995d9c1953-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.011081 master-0 kubenswrapper[29936]: I1205 13:04:24.011041 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a34de370-e400-45de-adbb-09995d9c1953-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.011167 master-0 kubenswrapper[29936]: I1205 13:04:24.011146 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a34de370-e400-45de-adbb-09995d9c1953-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.011430 master-0 kubenswrapper[29936]: I1205 13:04:24.011240 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a34de370-e400-45de-adbb-09995d9c1953-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.011430 master-0 kubenswrapper[29936]: I1205 13:04:24.011316 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a34de370-e400-45de-adbb-09995d9c1953-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.011430 master-0 kubenswrapper[29936]: I1205 13:04:24.011384 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d0ab9488-01db-480e-ab9c-14671d2f09bd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^18e58751-aa66-4088-9eaf-b4ebdb8d4632\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.115859 master-0 kubenswrapper[29936]: I1205 13:04:24.115735 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a34de370-e400-45de-adbb-09995d9c1953-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.115859 master-0 kubenswrapper[29936]: I1205 13:04:24.115820 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a34de370-e400-45de-adbb-09995d9c1953-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.116381 master-0 kubenswrapper[29936]: I1205 13:04:24.115896 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a34de370-e400-45de-adbb-09995d9c1953-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.116381 master-0 kubenswrapper[29936]: I1205 13:04:24.115942 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a34de370-e400-45de-adbb-09995d9c1953-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.116381 master-0 kubenswrapper[29936]: I1205 13:04:24.115969 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a34de370-e400-45de-adbb-09995d9c1953-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.116381 master-0 kubenswrapper[29936]: I1205 13:04:24.115993 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a34de370-e400-45de-adbb-09995d9c1953-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.116381 master-0 kubenswrapper[29936]: I1205 13:04:24.116015 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a34de370-e400-45de-adbb-09995d9c1953-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.116381 master-0 kubenswrapper[29936]: I1205 13:04:24.116051 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a34de370-e400-45de-adbb-09995d9c1953-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.116381 master-0 kubenswrapper[29936]: I1205 13:04:24.116110 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a34de370-e400-45de-adbb-09995d9c1953-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.116381 master-0 kubenswrapper[29936]: I1205 13:04:24.116133 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jqlw\" (UniqueName: \"kubernetes.io/projected/a34de370-e400-45de-adbb-09995d9c1953-kube-api-access-6jqlw\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.120291 master-0 kubenswrapper[29936]: I1205 13:04:24.119921 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a34de370-e400-45de-adbb-09995d9c1953-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.120291 master-0 kubenswrapper[29936]: I1205 13:04:24.120247 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a34de370-e400-45de-adbb-09995d9c1953-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.120974 master-0 kubenswrapper[29936]: I1205 13:04:24.120922 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a34de370-e400-45de-adbb-09995d9c1953-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.121909 master-0 kubenswrapper[29936]: I1205 13:04:24.121827 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a34de370-e400-45de-adbb-09995d9c1953-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.123780 master-0 kubenswrapper[29936]: I1205 13:04:24.123575 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a34de370-e400-45de-adbb-09995d9c1953-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.129981 master-0 kubenswrapper[29936]: I1205 13:04:24.129920 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a34de370-e400-45de-adbb-09995d9c1953-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.132946 master-0 kubenswrapper[29936]: I1205 13:04:24.132902 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a34de370-e400-45de-adbb-09995d9c1953-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.147612 master-0 kubenswrapper[29936]: I1205 13:04:24.147522 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a34de370-e400-45de-adbb-09995d9c1953-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.162020 master-0 kubenswrapper[29936]: I1205 13:04:24.161879 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a34de370-e400-45de-adbb-09995d9c1953-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 
05 13:04:24.223429 master-0 kubenswrapper[29936]: I1205 13:04:24.223143 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d0ab9488-01db-480e-ab9c-14671d2f09bd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^18e58751-aa66-4088-9eaf-b4ebdb8d4632\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.234334 master-0 kubenswrapper[29936]: I1205 13:04:24.234272 29936 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 13:04:24.234334 master-0 kubenswrapper[29936]: I1205 13:04:24.234334 29936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d0ab9488-01db-480e-ab9c-14671d2f09bd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^18e58751-aa66-4088-9eaf-b4ebdb8d4632\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/5e112da5e5251c8f8d87270ea134722a8623059a297db46a3a6c31cc030bd586/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.253482 master-0 kubenswrapper[29936]: I1205 13:04:24.243417 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jqlw\" (UniqueName: \"kubernetes.io/projected/a34de370-e400-45de-adbb-09995d9c1953-kube-api-access-6jqlw\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:24.747356 master-0 kubenswrapper[29936]: I1205 13:04:24.740040 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 05 13:04:24.747356 master-0 kubenswrapper[29936]: I1205 13:04:24.741902 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 05 13:04:24.757018 master-0 kubenswrapper[29936]: I1205 13:04:24.756956 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 05 13:04:24.780312 master-0 kubenswrapper[29936]: I1205 13:04:24.780198 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 05 13:04:24.817865 master-0 kubenswrapper[29936]: I1205 13:04:24.817715 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 05 13:04:24.858807 master-0 kubenswrapper[29936]: I1205 13:04:24.858707 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b8fe631-5d8c-457a-bfb2-279b0526552f-config-data\") pod \"memcached-0\" (UID: \"8b8fe631-5d8c-457a-bfb2-279b0526552f\") " pod="openstack/memcached-0" Dec 05 13:04:24.858807 master-0 kubenswrapper[29936]: I1205 13:04:24.858788 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8fe631-5d8c-457a-bfb2-279b0526552f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8b8fe631-5d8c-457a-bfb2-279b0526552f\") " pod="openstack/memcached-0" Dec 05 13:04:24.859221 master-0 kubenswrapper[29936]: I1205 13:04:24.858925 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzm8m\" (UniqueName: \"kubernetes.io/projected/8b8fe631-5d8c-457a-bfb2-279b0526552f-kube-api-access-xzm8m\") pod \"memcached-0\" (UID: \"8b8fe631-5d8c-457a-bfb2-279b0526552f\") " pod="openstack/memcached-0" Dec 05 13:04:24.859221 master-0 kubenswrapper[29936]: I1205 13:04:24.858944 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b8fe631-5d8c-457a-bfb2-279b0526552f-kolla-config\") pod \"memcached-0\" (UID: \"8b8fe631-5d8c-457a-bfb2-279b0526552f\") " pod="openstack/memcached-0" Dec 05 13:04:24.859221 master-0 kubenswrapper[29936]: I1205 13:04:24.858965 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8fe631-5d8c-457a-bfb2-279b0526552f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8b8fe631-5d8c-457a-bfb2-279b0526552f\") " pod="openstack/memcached-0" Dec 05 13:04:24.890211 master-0 kubenswrapper[29936]: I1205 13:04:24.889988 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 13:04:24.975867 master-0 kubenswrapper[29936]: I1205 13:04:24.975744 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b8fe631-5d8c-457a-bfb2-279b0526552f-kolla-config\") pod \"memcached-0\" (UID: \"8b8fe631-5d8c-457a-bfb2-279b0526552f\") " pod="openstack/memcached-0" Dec 05 13:04:24.975867 master-0 kubenswrapper[29936]: I1205 13:04:24.975808 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8fe631-5d8c-457a-bfb2-279b0526552f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8b8fe631-5d8c-457a-bfb2-279b0526552f\") " pod="openstack/memcached-0" Dec 05 13:04:24.975867 master-0 kubenswrapper[29936]: I1205 13:04:24.975858 29936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b8fe631-5d8c-457a-bfb2-279b0526552f-config-data\") pod \"memcached-0\" (UID: \"8b8fe631-5d8c-457a-bfb2-279b0526552f\") " pod="openstack/memcached-0" Dec 05 13:04:24.976669 master-0 kubenswrapper[29936]: I1205 13:04:24.975890 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8fe631-5d8c-457a-bfb2-279b0526552f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8b8fe631-5d8c-457a-bfb2-279b0526552f\") " pod="openstack/memcached-0" Dec 05 13:04:24.976669 master-0 kubenswrapper[29936]: I1205 13:04:24.976009 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzm8m\" (UniqueName: \"kubernetes.io/projected/8b8fe631-5d8c-457a-bfb2-279b0526552f-kube-api-access-xzm8m\") pod \"memcached-0\" (UID: \"8b8fe631-5d8c-457a-bfb2-279b0526552f\") " pod="openstack/memcached-0" Dec 05 13:04:24.977485 master-0 kubenswrapper[29936]: I1205 13:04:24.977433 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b8fe631-5d8c-457a-bfb2-279b0526552f-kolla-config\") pod \"memcached-0\" (UID: \"8b8fe631-5d8c-457a-bfb2-279b0526552f\") " pod="openstack/memcached-0" Dec 05 13:04:24.978328 master-0 kubenswrapper[29936]: I1205 13:04:24.978061 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b8fe631-5d8c-457a-bfb2-279b0526552f-config-data\") pod \"memcached-0\" (UID: \"8b8fe631-5d8c-457a-bfb2-279b0526552f\") " pod="openstack/memcached-0" Dec 05 13:04:25.000949 master-0 kubenswrapper[29936]: I1205 13:04:25.000798 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8fe631-5d8c-457a-bfb2-279b0526552f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8b8fe631-5d8c-457a-bfb2-279b0526552f\") " pod="openstack/memcached-0" Dec 05 13:04:25.002080 master-0 kubenswrapper[29936]: I1205 13:04:25.002004 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8fe631-5d8c-457a-bfb2-279b0526552f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8b8fe631-5d8c-457a-bfb2-279b0526552f\") " pod="openstack/memcached-0" Dec 05 13:04:25.013014 master-0 kubenswrapper[29936]: I1205 13:04:25.012826 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzm8m\" (UniqueName: \"kubernetes.io/projected/8b8fe631-5d8c-457a-bfb2-279b0526552f-kube-api-access-xzm8m\") pod \"memcached-0\" (UID: \"8b8fe631-5d8c-457a-bfb2-279b0526552f\") " pod="openstack/memcached-0" Dec 05 13:04:25.205373 master-0 kubenswrapper[29936]: I1205 13:04:25.204983 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 05 13:04:26.069211 master-0 kubenswrapper[29936]: I1205 13:04:26.069075 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d0ab9488-01db-480e-ab9c-14671d2f09bd\" (UniqueName: \"kubernetes.io/csi/topolvm.io^18e58751-aa66-4088-9eaf-b4ebdb8d4632\") pod \"rabbitmq-cell1-server-0\" (UID: \"a34de370-e400-45de-adbb-09995d9c1953\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:26.134291 master-0 kubenswrapper[29936]: I1205 13:04:26.132318 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 13:04:26.363372 master-0 kubenswrapper[29936]: I1205 13:04:26.363257 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:04:26.791522 master-0 kubenswrapper[29936]: I1205 13:04:26.791430 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 13:04:26.794379 master-0 kubenswrapper[29936]: I1205 13:04:26.794274 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 13:04:26.798709 master-0 kubenswrapper[29936]: I1205 13:04:26.798525 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 13:04:26.800053 master-0 kubenswrapper[29936]: I1205 13:04:26.798922 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 13:04:26.800053 master-0 kubenswrapper[29936]: I1205 13:04:26.799138 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 13:04:26.800646 master-0 kubenswrapper[29936]: I1205 13:04:26.800364 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 13:04:26.801684 master-0 kubenswrapper[29936]: I1205 13:04:26.801295 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 13:04:26.801684 master-0 kubenswrapper[29936]: I1205 13:04:26.801514 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 13:04:26.831846 master-0 kubenswrapper[29936]: I1205 13:04:26.829308 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 13:04:26.978781 master-0 kubenswrapper[29936]: I1205 13:04:26.978710 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ac9589fd-c5fd-41de-8018-16c6e272d05e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:26.979404 master-0 kubenswrapper[29936]: I1205 13:04:26.979372 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ac9589fd-c5fd-41de-8018-16c6e272d05e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:26.979631 master-0 kubenswrapper[29936]: I1205 13:04:26.979608 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg9kq\" (UniqueName: \"kubernetes.io/projected/ac9589fd-c5fd-41de-8018-16c6e272d05e-kube-api-access-wg9kq\") pod \"rabbitmq-server-0\" (UID: 
\"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:26.983015 master-0 kubenswrapper[29936]: I1205 13:04:26.979801 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ac9589fd-c5fd-41de-8018-16c6e272d05e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:26.984290 master-0 kubenswrapper[29936]: I1205 13:04:26.983420 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ac9589fd-c5fd-41de-8018-16c6e272d05e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:26.984486 master-0 kubenswrapper[29936]: I1205 13:04:26.984456 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac9589fd-c5fd-41de-8018-16c6e272d05e-config-data\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:26.984636 master-0 kubenswrapper[29936]: I1205 13:04:26.984618 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ac9589fd-c5fd-41de-8018-16c6e272d05e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:26.984743 master-0 kubenswrapper[29936]: I1205 13:04:26.984729 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ac9589fd-c5fd-41de-8018-16c6e272d05e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:26.984855 master-0 kubenswrapper[29936]: I1205 13:04:26.984831 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ac9589fd-c5fd-41de-8018-16c6e272d05e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:26.985079 master-0 kubenswrapper[29936]: I1205 13:04:26.985065 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ac9589fd-c5fd-41de-8018-16c6e272d05e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:26.989253 master-0 kubenswrapper[29936]: I1205 13:04:26.988953 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6b8e1316-38c5-4242-ad7f-1e42a03669e5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^da10cece-679d-415a-ad59-c9d9c9192399\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.093340 master-0 kubenswrapper[29936]: I1205 13:04:27.092123 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac9589fd-c5fd-41de-8018-16c6e272d05e-config-data\") 
pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.093340 master-0 kubenswrapper[29936]: I1205 13:04:27.092427 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ac9589fd-c5fd-41de-8018-16c6e272d05e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.093340 master-0 kubenswrapper[29936]: I1205 13:04:27.092522 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ac9589fd-c5fd-41de-8018-16c6e272d05e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.093340 master-0 kubenswrapper[29936]: I1205 13:04:27.092613 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ac9589fd-c5fd-41de-8018-16c6e272d05e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.093340 master-0 kubenswrapper[29936]: I1205 13:04:27.092731 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ac9589fd-c5fd-41de-8018-16c6e272d05e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.093340 master-0 kubenswrapper[29936]: I1205 13:04:27.092994 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6b8e1316-38c5-4242-ad7f-1e42a03669e5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^da10cece-679d-415a-ad59-c9d9c9192399\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.093340 master-0 kubenswrapper[29936]: I1205 13:04:27.093059 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ac9589fd-c5fd-41de-8018-16c6e272d05e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.093340 master-0 kubenswrapper[29936]: I1205 13:04:27.093166 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ac9589fd-c5fd-41de-8018-16c6e272d05e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.093340 master-0 kubenswrapper[29936]: I1205 13:04:27.093201 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ac9589fd-c5fd-41de-8018-16c6e272d05e-config-data\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.096161 master-0 kubenswrapper[29936]: I1205 13:04:27.093244 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg9kq\" (UniqueName: \"kubernetes.io/projected/ac9589fd-c5fd-41de-8018-16c6e272d05e-kube-api-access-wg9kq\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " 
pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.096161 master-0 kubenswrapper[29936]: I1205 13:04:27.094738 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ac9589fd-c5fd-41de-8018-16c6e272d05e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.096161 master-0 kubenswrapper[29936]: I1205 13:04:27.094861 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ac9589fd-c5fd-41de-8018-16c6e272d05e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.096161 master-0 kubenswrapper[29936]: I1205 13:04:27.094586 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ac9589fd-c5fd-41de-8018-16c6e272d05e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.096161 master-0 kubenswrapper[29936]: I1205 13:04:27.093370 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ac9589fd-c5fd-41de-8018-16c6e272d05e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.096161 master-0 kubenswrapper[29936]: I1205 13:04:27.093691 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ac9589fd-c5fd-41de-8018-16c6e272d05e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.096161 master-0 kubenswrapper[29936]: I1205 13:04:27.095978 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ac9589fd-c5fd-41de-8018-16c6e272d05e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.098106 master-0 kubenswrapper[29936]: I1205 13:04:27.097249 29936 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 13:04:27.098106 master-0 kubenswrapper[29936]: I1205 13:04:27.097282 29936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6b8e1316-38c5-4242-ad7f-1e42a03669e5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^da10cece-679d-415a-ad59-c9d9c9192399\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/2666e5aac4069254d2e51cc009d678f2b83be4f17b799d44936f7c68067f4669/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.098286 master-0 kubenswrapper[29936]: I1205 13:04:27.098173 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ac9589fd-c5fd-41de-8018-16c6e272d05e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.104019 master-0 kubenswrapper[29936]: I1205 13:04:27.100701 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ac9589fd-c5fd-41de-8018-16c6e272d05e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.104019 master-0 kubenswrapper[29936]: I1205 13:04:27.101441 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ac9589fd-c5fd-41de-8018-16c6e272d05e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.104019 master-0 kubenswrapper[29936]: I1205 13:04:27.101824 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ac9589fd-c5fd-41de-8018-16c6e272d05e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:27.129966 master-0 kubenswrapper[29936]: I1205 13:04:27.129919 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg9kq\" (UniqueName: \"kubernetes.io/projected/ac9589fd-c5fd-41de-8018-16c6e272d05e-kube-api-access-wg9kq\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:28.229068 master-0 kubenswrapper[29936]: I1205 13:04:28.228870 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 05 13:04:28.231669 master-0 kubenswrapper[29936]: I1205 13:04:28.231620 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 13:04:28.240069 master-0 kubenswrapper[29936]: I1205 13:04:28.239616 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 05 13:04:28.240069 master-0 kubenswrapper[29936]: I1205 13:04:28.240054 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 05 13:04:28.241717 master-0 kubenswrapper[29936]: I1205 13:04:28.240959 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 05 13:04:28.260472 master-0 kubenswrapper[29936]: I1205 13:04:28.259954 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 13:04:28.269700 master-0 kubenswrapper[29936]: I1205 13:04:28.268930 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-741478e1-cf27-4a67-94ef-d4c4340c5081\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ebbcbd0c-5572-4263-8e27-73d748c52fed\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.270430 master-0 kubenswrapper[29936]: I1205 13:04:28.270392 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.270508 master-0 kubenswrapper[29936]: I1205 13:04:28.270459 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-kolla-config\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.270561 master-0 kubenswrapper[29936]: I1205 13:04:28.270528 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.270613 master-0 kubenswrapper[29936]: I1205 13:04:28.270560 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.270753 master-0 kubenswrapper[29936]: I1205 13:04:28.270720 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.273352 master-0 kubenswrapper[29936]: I1205 13:04:28.271239 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jvsq\" (UniqueName: \"kubernetes.io/projected/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-kube-api-access-5jvsq\") pod \"openstack-galera-0\" 
(UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.273352 master-0 kubenswrapper[29936]: I1205 13:04:28.272112 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-config-data-default\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.376013 master-0 kubenswrapper[29936]: I1205 13:04:28.375933 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-config-data-default\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.376425 master-0 kubenswrapper[29936]: I1205 13:04:28.376056 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-741478e1-cf27-4a67-94ef-d4c4340c5081\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ebbcbd0c-5572-4263-8e27-73d748c52fed\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.376425 master-0 kubenswrapper[29936]: I1205 13:04:28.376092 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.376425 master-0 kubenswrapper[29936]: I1205 13:04:28.376117 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-kolla-config\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.376425 master-0 kubenswrapper[29936]: I1205 13:04:28.376147 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.376425 master-0 kubenswrapper[29936]: I1205 13:04:28.376173 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.376425 master-0 kubenswrapper[29936]: I1205 13:04:28.376213 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.376425 master-0 kubenswrapper[29936]: I1205 13:04:28.376251 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jvsq\" (UniqueName: \"kubernetes.io/projected/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-kube-api-access-5jvsq\") pod \"openstack-galera-0\" (UID: 
\"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.378761 master-0 kubenswrapper[29936]: I1205 13:04:28.377635 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-config-data-default\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.378761 master-0 kubenswrapper[29936]: I1205 13:04:28.378383 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-kolla-config\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.380491 master-0 kubenswrapper[29936]: I1205 13:04:28.380225 29936 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 13:04:28.380491 master-0 kubenswrapper[29936]: I1205 13:04:28.380257 29936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-741478e1-cf27-4a67-94ef-d4c4340c5081\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ebbcbd0c-5572-4263-8e27-73d748c52fed\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/398449ab7137a8490d44406af4b803012c77301835c335129c3c8017130371f5/globalmount\"" pod="openstack/openstack-galera-0" Dec 05 13:04:28.380491 master-0 kubenswrapper[29936]: I1205 13:04:28.380423 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.384055 master-0 kubenswrapper[29936]: I1205 13:04:28.384006 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.384727 master-0 kubenswrapper[29936]: I1205 13:04:28.384683 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.388820 master-0 kubenswrapper[29936]: I1205 13:04:28.388774 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.402700 master-0 kubenswrapper[29936]: I1205 13:04:28.402416 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jvsq\" (UniqueName: \"kubernetes.io/projected/e05f116e-3a9f-481a-8ca7-e9f4715f5d7f-kube-api-access-5jvsq\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:28.604120 master-0 
kubenswrapper[29936]: I1205 13:04:28.604049 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 13:04:28.615492 master-0 kubenswrapper[29936]: I1205 13:04:28.614278 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.622847 master-0 kubenswrapper[29936]: I1205 13:04:28.622703 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 05 13:04:28.623661 master-0 kubenswrapper[29936]: I1205 13:04:28.623547 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 05 13:04:28.624775 master-0 kubenswrapper[29936]: I1205 13:04:28.624668 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 05 13:04:28.631026 master-0 kubenswrapper[29936]: I1205 13:04:28.630925 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 13:04:28.665744 master-0 kubenswrapper[29936]: I1205 13:04:28.665645 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6b8e1316-38c5-4242-ad7f-1e42a03669e5\" (UniqueName: \"kubernetes.io/csi/topolvm.io^da10cece-679d-415a-ad59-c9d9c9192399\") pod \"rabbitmq-server-0\" (UID: \"ac9589fd-c5fd-41de-8018-16c6e272d05e\") " pod="openstack/rabbitmq-server-0" Dec 05 13:04:28.697944 master-0 kubenswrapper[29936]: I1205 13:04:28.697878 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fb7bb661-3aba-4919-a9ee-32b999cb4f05-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.700821 master-0 kubenswrapper[29936]: I1205 13:04:28.700775 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7bb661-3aba-4919-a9ee-32b999cb4f05-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.701370 master-0 kubenswrapper[29936]: I1205 13:04:28.701257 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-68288a88-7228-4e1b-b6b1-1ca23bff38cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^636a1581-4e45-4003-a935-a6d404cfee53\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.701686 master-0 kubenswrapper[29936]: I1205 13:04:28.701638 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvc9j\" (UniqueName: \"kubernetes.io/projected/fb7bb661-3aba-4919-a9ee-32b999cb4f05-kube-api-access-xvc9j\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.701882 master-0 kubenswrapper[29936]: I1205 13:04:28.701811 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb7bb661-3aba-4919-a9ee-32b999cb4f05-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " 
pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.702136 master-0 kubenswrapper[29936]: I1205 13:04:28.702121 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb7bb661-3aba-4919-a9ee-32b999cb4f05-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.702398 master-0 kubenswrapper[29936]: I1205 13:04:28.702363 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fb7bb661-3aba-4919-a9ee-32b999cb4f05-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.702546 master-0 kubenswrapper[29936]: I1205 13:04:28.702530 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fb7bb661-3aba-4919-a9ee-32b999cb4f05-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.804964 master-0 kubenswrapper[29936]: I1205 13:04:28.804709 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb7bb661-3aba-4919-a9ee-32b999cb4f05-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.804964 master-0 kubenswrapper[29936]: I1205 13:04:28.804845 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb7bb661-3aba-4919-a9ee-32b999cb4f05-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.804964 master-0 kubenswrapper[29936]: I1205 13:04:28.804911 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fb7bb661-3aba-4919-a9ee-32b999cb4f05-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.804964 master-0 kubenswrapper[29936]: I1205 13:04:28.804948 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fb7bb661-3aba-4919-a9ee-32b999cb4f05-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.805454 master-0 kubenswrapper[29936]: I1205 13:04:28.805016 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7bb661-3aba-4919-a9ee-32b999cb4f05-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.805454 master-0 kubenswrapper[29936]: I1205 13:04:28.805045 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/fb7bb661-3aba-4919-a9ee-32b999cb4f05-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.805454 master-0 kubenswrapper[29936]: I1205 13:04:28.805092 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-68288a88-7228-4e1b-b6b1-1ca23bff38cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^636a1581-4e45-4003-a935-a6d404cfee53\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.805454 master-0 kubenswrapper[29936]: I1205 13:04:28.805160 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvc9j\" (UniqueName: \"kubernetes.io/projected/fb7bb661-3aba-4919-a9ee-32b999cb4f05-kube-api-access-xvc9j\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.811607 master-0 kubenswrapper[29936]: I1205 13:04:28.807971 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb7bb661-3aba-4919-a9ee-32b999cb4f05-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.811607 master-0 kubenswrapper[29936]: I1205 13:04:28.808648 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fb7bb661-3aba-4919-a9ee-32b999cb4f05-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.811607 master-0 kubenswrapper[29936]: I1205 13:04:28.809745 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fb7bb661-3aba-4919-a9ee-32b999cb4f05-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.811607 master-0 kubenswrapper[29936]: I1205 13:04:28.810583 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fb7bb661-3aba-4919-a9ee-32b999cb4f05-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.811980 master-0 kubenswrapper[29936]: I1205 13:04:28.811663 29936 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 13:04:28.811980 master-0 kubenswrapper[29936]: I1205 13:04:28.811690 29936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-68288a88-7228-4e1b-b6b1-1ca23bff38cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^636a1581-4e45-4003-a935-a6d404cfee53\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/4e0f8348039ad0dbe1bf19b3c45006e556fce329c89949a9f80fdde160a2a115/globalmount\"" pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.813230 master-0 kubenswrapper[29936]: I1205 13:04:28.813147 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7bb661-3aba-4919-a9ee-32b999cb4f05-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.817343 master-0 kubenswrapper[29936]: I1205 13:04:28.817297 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb7bb661-3aba-4919-a9ee-32b999cb4f05-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.833563 master-0 kubenswrapper[29936]: I1205 13:04:28.833493 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvc9j\" (UniqueName: \"kubernetes.io/projected/fb7bb661-3aba-4919-a9ee-32b999cb4f05-kube-api-access-xvc9j\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:28.975569 master-0 kubenswrapper[29936]: I1205 13:04:28.973997 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 13:04:29.715333 master-0 kubenswrapper[29936]: I1205 13:04:29.714643 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-741478e1-cf27-4a67-94ef-d4c4340c5081\" (UniqueName: \"kubernetes.io/csi/topolvm.io^ebbcbd0c-5572-4263-8e27-73d748c52fed\") pod \"openstack-galera-0\" (UID: \"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f\") " pod="openstack/openstack-galera-0" Dec 05 13:04:30.779761 master-0 kubenswrapper[29936]: I1205 13:04:30.779679 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-68288a88-7228-4e1b-b6b1-1ca23bff38cb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^636a1581-4e45-4003-a935-a6d404cfee53\") pod \"openstack-cell1-galera-0\" (UID: \"fb7bb661-3aba-4919-a9ee-32b999cb4f05\") " pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:32.478458 master-0 kubenswrapper[29936]: I1205 13:04:32.478347 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 13:04:36.786608 master-0 kubenswrapper[29936]: I1205 13:04:36.786498 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 13:04:37.522503 master-0 kubenswrapper[29936]: W1205 13:04:37.522411 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b8fe631_5d8c_457a_bfb2_279b0526552f.slice/crio-f31ee47516a83ca8775a2b7f819096e4dd243b48d6f639c05062d835cdcae64b WatchSource:0}: Error finding container f31ee47516a83ca8775a2b7f819096e4dd243b48d6f639c05062d835cdcae64b: Status 404 returned error can't find the container with id f31ee47516a83ca8775a2b7f819096e4dd243b48d6f639c05062d835cdcae64b Dec 05 13:04:38.113906 master-0 kubenswrapper[29936]: I1205 13:04:38.113807 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-f8fxj"] Dec 05 13:04:38.118204 master-0 kubenswrapper[29936]: I1205 13:04:38.115632 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.127210 master-0 kubenswrapper[29936]: I1205 13:04:38.123308 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 05 13:04:38.127210 master-0 kubenswrapper[29936]: I1205 13:04:38.123574 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 05 13:04:38.139141 master-0 kubenswrapper[29936]: I1205 13:04:38.137979 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-jj4cf"] Dec 05 13:04:38.141585 master-0 kubenswrapper[29936]: I1205 13:04:38.141523 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.153323 master-0 kubenswrapper[29936]: I1205 13:04:38.153243 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-f8fxj"] Dec 05 13:04:38.169642 master-0 kubenswrapper[29936]: I1205 13:04:38.169562 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jj4cf"] Dec 05 13:04:38.174640 master-0 kubenswrapper[29936]: I1205 13:04:38.174170 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1aaf3eff-076d-42cd-a86c-9e5af7a38664-var-log-ovn\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.174640 master-0 kubenswrapper[29936]: I1205 13:04:38.174290 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh7hx\" (UniqueName: \"kubernetes.io/projected/1aaf3eff-076d-42cd-a86c-9e5af7a38664-kube-api-access-dh7hx\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.174640 master-0 kubenswrapper[29936]: I1205 13:04:38.174339 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1aaf3eff-076d-42cd-a86c-9e5af7a38664-var-run-ovn\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.174640 master-0 kubenswrapper[29936]: I1205 13:04:38.174371 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1aaf3eff-076d-42cd-a86c-9e5af7a38664-ovn-controller-tls-certs\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.174640 master-0 kubenswrapper[29936]: I1205 13:04:38.174403 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1aaf3eff-076d-42cd-a86c-9e5af7a38664-var-run\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.174640 master-0 kubenswrapper[29936]: I1205 13:04:38.174460 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aaf3eff-076d-42cd-a86c-9e5af7a38664-combined-ca-bundle\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.174640 master-0 kubenswrapper[29936]: I1205 13:04:38.174485 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1aaf3eff-076d-42cd-a86c-9e5af7a38664-scripts\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.276804 master-0 kubenswrapper[29936]: I1205 13:04:38.276743 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/fdf8b04c-25f3-49a7-be24-553190212e0d-var-log\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.277170 master-0 kubenswrapper[29936]: I1205 13:04:38.276823 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1aaf3eff-076d-42cd-a86c-9e5af7a38664-var-log-ovn\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.277170 master-0 kubenswrapper[29936]: I1205 13:04:38.277059 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdf8b04c-25f3-49a7-be24-553190212e0d-var-run\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.277266 master-0 kubenswrapper[29936]: I1205 13:04:38.277238 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh7hx\" (UniqueName: \"kubernetes.io/projected/1aaf3eff-076d-42cd-a86c-9e5af7a38664-kube-api-access-dh7hx\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.277441 master-0 kubenswrapper[29936]: I1205 13:04:38.277418 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1aaf3eff-076d-42cd-a86c-9e5af7a38664-var-run-ovn\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.278603 master-0 kubenswrapper[29936]: I1205 13:04:38.277490 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1aaf3eff-076d-42cd-a86c-9e5af7a38664-ovn-controller-tls-certs\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.278603 master-0 kubenswrapper[29936]: I1205 13:04:38.277553 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdf8b04c-25f3-49a7-be24-553190212e0d-scripts\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.278603 master-0 kubenswrapper[29936]: I1205 13:04:38.277575 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1aaf3eff-076d-42cd-a86c-9e5af7a38664-var-run\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.278603 master-0 kubenswrapper[29936]: I1205 13:04:38.277614 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/fdf8b04c-25f3-49a7-be24-553190212e0d-var-lib\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.278603 master-0 kubenswrapper[29936]: I1205 13:04:38.277643 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4skzs\" (UniqueName: \"kubernetes.io/projected/fdf8b04c-25f3-49a7-be24-553190212e0d-kube-api-access-4skzs\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.278603 master-0 kubenswrapper[29936]: I1205 13:04:38.277876 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aaf3eff-076d-42cd-a86c-9e5af7a38664-combined-ca-bundle\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.278603 master-0 kubenswrapper[29936]: I1205 13:04:38.277912 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1aaf3eff-076d-42cd-a86c-9e5af7a38664-scripts\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.278603 master-0 kubenswrapper[29936]: I1205 13:04:38.277976 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1aaf3eff-076d-42cd-a86c-9e5af7a38664-var-run-ovn\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.278603 master-0 kubenswrapper[29936]: I1205 13:04:38.278314 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1aaf3eff-076d-42cd-a86c-9e5af7a38664-var-run\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.278603 master-0 kubenswrapper[29936]: I1205 13:04:38.278350 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/1aaf3eff-076d-42cd-a86c-9e5af7a38664-var-log-ovn\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.278603 master-0 kubenswrapper[29936]: I1205 13:04:38.278371 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/fdf8b04c-25f3-49a7-be24-553190212e0d-etc-ovs\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.280473 master-0 kubenswrapper[29936]: I1205 13:04:38.280421 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1aaf3eff-076d-42cd-a86c-9e5af7a38664-scripts\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.281914 master-0 kubenswrapper[29936]: I1205 13:04:38.281876 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1aaf3eff-076d-42cd-a86c-9e5af7a38664-ovn-controller-tls-certs\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.283209 master-0 kubenswrapper[29936]: I1205 13:04:38.283130 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aaf3eff-076d-42cd-a86c-9e5af7a38664-combined-ca-bundle\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:38.428558 master-0 kubenswrapper[29936]: I1205 13:04:38.425747 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdf8b04c-25f3-49a7-be24-553190212e0d-scripts\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.428558 master-0 kubenswrapper[29936]: I1205 13:04:38.425836 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/fdf8b04c-25f3-49a7-be24-553190212e0d-var-lib\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.428558 master-0 kubenswrapper[29936]: I1205 13:04:38.425868 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4skzs\" (UniqueName: \"kubernetes.io/projected/fdf8b04c-25f3-49a7-be24-553190212e0d-kube-api-access-4skzs\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.428558 master-0 kubenswrapper[29936]: I1205 13:04:38.425959 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/fdf8b04c-25f3-49a7-be24-553190212e0d-etc-ovs\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.428558 master-0 kubenswrapper[29936]: I1205 13:04:38.425982 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/fdf8b04c-25f3-49a7-be24-553190212e0d-var-log\") pod 
\"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.428558 master-0 kubenswrapper[29936]: I1205 13:04:38.426039 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdf8b04c-25f3-49a7-be24-553190212e0d-var-run\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.428558 master-0 kubenswrapper[29936]: I1205 13:04:38.426190 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fdf8b04c-25f3-49a7-be24-553190212e0d-var-run\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.428558 master-0 kubenswrapper[29936]: I1205 13:04:38.426400 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/fdf8b04c-25f3-49a7-be24-553190212e0d-etc-ovs\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.428558 master-0 kubenswrapper[29936]: I1205 13:04:38.427081 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/fdf8b04c-25f3-49a7-be24-553190212e0d-var-lib\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.428558 master-0 kubenswrapper[29936]: I1205 13:04:38.427253 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/fdf8b04c-25f3-49a7-be24-553190212e0d-var-log\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.429395 master-0 kubenswrapper[29936]: I1205 13:04:38.429351 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fdf8b04c-25f3-49a7-be24-553190212e0d-scripts\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:38.582082 master-0 kubenswrapper[29936]: I1205 13:04:38.581997 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8b8fe631-5d8c-457a-bfb2-279b0526552f","Type":"ContainerStarted","Data":"f31ee47516a83ca8775a2b7f819096e4dd243b48d6f639c05062d835cdcae64b"} Dec 05 13:04:39.297252 master-0 kubenswrapper[29936]: I1205 13:04:39.296069 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4skzs\" (UniqueName: \"kubernetes.io/projected/fdf8b04c-25f3-49a7-be24-553190212e0d-kube-api-access-4skzs\") pod \"ovn-controller-ovs-jj4cf\" (UID: \"fdf8b04c-25f3-49a7-be24-553190212e0d\") " pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:39.299527 master-0 kubenswrapper[29936]: I1205 13:04:39.298355 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh7hx\" (UniqueName: \"kubernetes.io/projected/1aaf3eff-076d-42cd-a86c-9e5af7a38664-kube-api-access-dh7hx\") pod \"ovn-controller-f8fxj\" (UID: \"1aaf3eff-076d-42cd-a86c-9e5af7a38664\") " pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:39.463731 master-0 kubenswrapper[29936]: I1205 13:04:39.462989 29936 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-f8fxj" Dec 05 13:04:39.464138 master-0 kubenswrapper[29936]: I1205 13:04:39.463825 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:04:44.171979 master-0 kubenswrapper[29936]: I1205 13:04:44.171864 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 13:04:44.179262 master-0 kubenswrapper[29936]: I1205 13:04:44.179198 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.182776 master-0 kubenswrapper[29936]: I1205 13:04:44.182705 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 05 13:04:44.283623 master-0 kubenswrapper[29936]: I1205 13:04:44.283351 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 13:04:44.384963 master-0 kubenswrapper[29936]: I1205 13:04:44.384770 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 05 13:04:44.384963 master-0 kubenswrapper[29936]: I1205 13:04:44.384978 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 05 13:04:44.386965 master-0 kubenswrapper[29936]: I1205 13:04:44.386884 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 05 13:04:44.518546 master-0 kubenswrapper[29936]: I1205 13:04:44.518461 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/819b38a4-91fa-4657-92a7-f00475bdb566-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.518903 master-0 kubenswrapper[29936]: I1205 13:04:44.518668 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/819b38a4-91fa-4657-92a7-f00475bdb566-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.519002 master-0 kubenswrapper[29936]: I1205 13:04:44.518942 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/819b38a4-91fa-4657-92a7-f00475bdb566-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.519054 master-0 kubenswrapper[29936]: I1205 13:04:44.519041 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819b38a4-91fa-4657-92a7-f00475bdb566-config\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.519095 master-0 kubenswrapper[29936]: I1205 13:04:44.519073 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4b0fc27c-14db-4e8d-84ad-0661fd7b8896\" (UniqueName: \"kubernetes.io/csi/topolvm.io^e1ea35db-f4a0-4409-9bf2-748377753db6\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 
13:04:44.519143 master-0 kubenswrapper[29936]: I1205 13:04:44.519096 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbbmr\" (UniqueName: \"kubernetes.io/projected/819b38a4-91fa-4657-92a7-f00475bdb566-kube-api-access-zbbmr\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.519143 master-0 kubenswrapper[29936]: I1205 13:04:44.519124 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/819b38a4-91fa-4657-92a7-f00475bdb566-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.519259 master-0 kubenswrapper[29936]: I1205 13:04:44.519213 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/819b38a4-91fa-4657-92a7-f00475bdb566-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.620950 master-0 kubenswrapper[29936]: I1205 13:04:44.620882 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/819b38a4-91fa-4657-92a7-f00475bdb566-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.620950 master-0 kubenswrapper[29936]: I1205 13:04:44.620953 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/819b38a4-91fa-4657-92a7-f00475bdb566-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.621401 master-0 kubenswrapper[29936]: I1205 13:04:44.621336 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/819b38a4-91fa-4657-92a7-f00475bdb566-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.621805 master-0 kubenswrapper[29936]: I1205 13:04:44.621749 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/819b38a4-91fa-4657-92a7-f00475bdb566-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.621886 master-0 kubenswrapper[29936]: I1205 13:04:44.621816 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819b38a4-91fa-4657-92a7-f00475bdb566-config\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.621886 master-0 kubenswrapper[29936]: I1205 13:04:44.621865 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4b0fc27c-14db-4e8d-84ad-0661fd7b8896\" (UniqueName: \"kubernetes.io/csi/topolvm.io^e1ea35db-f4a0-4409-9bf2-748377753db6\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.621886 master-0 kubenswrapper[29936]: 
I1205 13:04:44.621887 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbbmr\" (UniqueName: \"kubernetes.io/projected/819b38a4-91fa-4657-92a7-f00475bdb566-kube-api-access-zbbmr\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.622367 master-0 kubenswrapper[29936]: I1205 13:04:44.621937 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/819b38a4-91fa-4657-92a7-f00475bdb566-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.622562 master-0 kubenswrapper[29936]: I1205 13:04:44.622507 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/819b38a4-91fa-4657-92a7-f00475bdb566-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.623452 master-0 kubenswrapper[29936]: I1205 13:04:44.623410 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/819b38a4-91fa-4657-92a7-f00475bdb566-config\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.624069 master-0 kubenswrapper[29936]: I1205 13:04:44.623962 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 13:04:44.624344 master-0 kubenswrapper[29936]: I1205 13:04:44.624309 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/819b38a4-91fa-4657-92a7-f00475bdb566-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.638325 master-0 kubenswrapper[29936]: I1205 13:04:44.638265 29936 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 13:04:44.638600 master-0 kubenswrapper[29936]: I1205 13:04:44.638337 29936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4b0fc27c-14db-4e8d-84ad-0661fd7b8896\" (UniqueName: \"kubernetes.io/csi/topolvm.io^e1ea35db-f4a0-4409-9bf2-748377753db6\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/d854dac1560c9d02fea55da8cff2a52391699830315ae331ed480cfe1e8954ec/globalmount\"" pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.638802 master-0 kubenswrapper[29936]: I1205 13:04:44.638698 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/819b38a4-91fa-4657-92a7-f00475bdb566-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.638962 master-0 kubenswrapper[29936]: I1205 13:04:44.638656 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/819b38a4-91fa-4657-92a7-f00475bdb566-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.639266 master-0 kubenswrapper[29936]: I1205 13:04:44.639230 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/819b38a4-91fa-4657-92a7-f00475bdb566-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:44.642916 master-0 kubenswrapper[29936]: I1205 13:04:44.642883 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbbmr\" (UniqueName: \"kubernetes.io/projected/819b38a4-91fa-4657-92a7-f00475bdb566-kube-api-access-zbbmr\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:46.331384 master-0 kubenswrapper[29936]: I1205 13:04:46.329776 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4b0fc27c-14db-4e8d-84ad-0661fd7b8896\" (UniqueName: \"kubernetes.io/csi/topolvm.io^e1ea35db-f4a0-4409-9bf2-748377753db6\") pod \"ovsdbserver-nb-0\" (UID: \"819b38a4-91fa-4657-92a7-f00475bdb566\") " pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:46.547488 master-0 kubenswrapper[29936]: I1205 13:04:46.547298 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 13:04:47.524793 master-0 kubenswrapper[29936]: I1205 13:04:47.524699 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 13:04:47.528032 master-0 kubenswrapper[29936]: I1205 13:04:47.527958 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.530952 master-0 kubenswrapper[29936]: I1205 13:04:47.530893 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 05 13:04:47.531077 master-0 kubenswrapper[29936]: I1205 13:04:47.531015 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 05 13:04:47.531612 master-0 kubenswrapper[29936]: I1205 13:04:47.531569 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 05 13:04:47.655997 master-0 kubenswrapper[29936]: I1205 13:04:47.655902 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 13:04:47.724271 master-0 kubenswrapper[29936]: I1205 13:04:47.724172 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8454790-c77d-4164-99ff-edbfcdc4c426-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.724665 master-0 kubenswrapper[29936]: I1205 13:04:47.724493 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3dc32be9-bb10-4d3c-be83-72752bf08be0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^cceffdec-f26c-454c-ac6b-d5d825ea0ba9\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.724665 master-0 kubenswrapper[29936]: I1205 13:04:47.724661 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8454790-c77d-4164-99ff-edbfcdc4c426-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.724961 master-0 kubenswrapper[29936]: I1205 13:04:47.724882 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8454790-c77d-4164-99ff-edbfcdc4c426-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.725364 master-0 kubenswrapper[29936]: I1205 13:04:47.725332 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8454790-c77d-4164-99ff-edbfcdc4c426-config\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.725446 master-0 kubenswrapper[29936]: I1205 13:04:47.725421 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7fz9\" (UniqueName: \"kubernetes.io/projected/e8454790-c77d-4164-99ff-edbfcdc4c426-kube-api-access-v7fz9\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.725595 master-0 kubenswrapper[29936]: I1205 13:04:47.725564 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8454790-c77d-4164-99ff-edbfcdc4c426-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " 
pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.725673 master-0 kubenswrapper[29936]: I1205 13:04:47.725655 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8454790-c77d-4164-99ff-edbfcdc4c426-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.827517 master-0 kubenswrapper[29936]: I1205 13:04:47.827321 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3dc32be9-bb10-4d3c-be83-72752bf08be0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^cceffdec-f26c-454c-ac6b-d5d825ea0ba9\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.827517 master-0 kubenswrapper[29936]: I1205 13:04:47.827409 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8454790-c77d-4164-99ff-edbfcdc4c426-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.827517 master-0 kubenswrapper[29936]: I1205 13:04:47.827443 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8454790-c77d-4164-99ff-edbfcdc4c426-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.827517 master-0 kubenswrapper[29936]: I1205 13:04:47.827500 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8454790-c77d-4164-99ff-edbfcdc4c426-config\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.827946 master-0 kubenswrapper[29936]: I1205 13:04:47.827738 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7fz9\" (UniqueName: \"kubernetes.io/projected/e8454790-c77d-4164-99ff-edbfcdc4c426-kube-api-access-v7fz9\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.827946 master-0 kubenswrapper[29936]: I1205 13:04:47.827933 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8454790-c77d-4164-99ff-edbfcdc4c426-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.828575 master-0 kubenswrapper[29936]: I1205 13:04:47.828273 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8454790-c77d-4164-99ff-edbfcdc4c426-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.828575 master-0 kubenswrapper[29936]: I1205 13:04:47.828359 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8454790-c77d-4164-99ff-edbfcdc4c426-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.829018 master-0 
kubenswrapper[29936]: I1205 13:04:47.828684 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8454790-c77d-4164-99ff-edbfcdc4c426-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.829361 master-0 kubenswrapper[29936]: I1205 13:04:47.829330 29936 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 13:04:47.829416 master-0 kubenswrapper[29936]: I1205 13:04:47.829367 29936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3dc32be9-bb10-4d3c-be83-72752bf08be0\" (UniqueName: \"kubernetes.io/csi/topolvm.io^cceffdec-f26c-454c-ac6b-d5d825ea0ba9\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/588e54891bc82caa3a790315d3590e92fb1d12ea1476a9fe7903c25b788c8bbc/globalmount\"" pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.829770 master-0 kubenswrapper[29936]: I1205 13:04:47.829719 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8454790-c77d-4164-99ff-edbfcdc4c426-config\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.829819 master-0 kubenswrapper[29936]: I1205 13:04:47.829799 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8454790-c77d-4164-99ff-edbfcdc4c426-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.832209 master-0 kubenswrapper[29936]: I1205 13:04:47.832039 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8454790-c77d-4164-99ff-edbfcdc4c426-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.832209 master-0 kubenswrapper[29936]: I1205 13:04:47.832126 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8454790-c77d-4164-99ff-edbfcdc4c426-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:47.838212 master-0 kubenswrapper[29936]: I1205 13:04:47.838136 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8454790-c77d-4164-99ff-edbfcdc4c426-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:48.447621 master-0 kubenswrapper[29936]: I1205 13:04:48.444922 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7fz9\" (UniqueName: \"kubernetes.io/projected/e8454790-c77d-4164-99ff-edbfcdc4c426-kube-api-access-v7fz9\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:49.510290 master-0 kubenswrapper[29936]: I1205 13:04:49.510219 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3dc32be9-bb10-4d3c-be83-72752bf08be0\" (UniqueName: 
\"kubernetes.io/csi/topolvm.io^cceffdec-f26c-454c-ac6b-d5d825ea0ba9\") pod \"ovsdbserver-sb-0\" (UID: \"e8454790-c77d-4164-99ff-edbfcdc4c426\") " pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:49.662833 master-0 kubenswrapper[29936]: I1205 13:04:49.662775 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 05 13:04:50.846615 master-0 kubenswrapper[29936]: I1205 13:04:50.846389 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a34de370-e400-45de-adbb-09995d9c1953","Type":"ContainerStarted","Data":"adccd425e003860d3fb50f4bca2808d4d40bef050fbc623c8fb643d6dc5f22d4"} Dec 05 13:04:52.151981 master-0 kubenswrapper[29936]: I1205 13:04:52.147678 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 13:04:52.226967 master-0 kubenswrapper[29936]: I1205 13:04:52.226890 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 13:04:52.586399 master-0 kubenswrapper[29936]: W1205 13:04:52.586131 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac9589fd_c5fd_41de_8018_16c6e272d05e.slice/crio-b10697ce72f3ad44941916b2ef1f3d65a56e63ae831f18fb329f929b8bf945a4 WatchSource:0}: Error finding container b10697ce72f3ad44941916b2ef1f3d65a56e63ae831f18fb329f929b8bf945a4: Status 404 returned error can't find the container with id b10697ce72f3ad44941916b2ef1f3d65a56e63ae831f18fb329f929b8bf945a4 Dec 05 13:04:52.871656 master-0 kubenswrapper[29936]: I1205 13:04:52.871068 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f","Type":"ContainerStarted","Data":"df6d974104ba09564d82c7ee5412737aca9712992e065c137cae9f8f658ed1c6"} Dec 05 13:04:52.874694 master-0 kubenswrapper[29936]: I1205 13:04:52.874609 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ac9589fd-c5fd-41de-8018-16c6e272d05e","Type":"ContainerStarted","Data":"b10697ce72f3ad44941916b2ef1f3d65a56e63ae831f18fb329f929b8bf945a4"} Dec 05 13:04:53.078858 master-0 kubenswrapper[29936]: I1205 13:04:53.078789 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 13:04:53.245563 master-0 kubenswrapper[29936]: I1205 13:04:53.238930 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-f8fxj"] Dec 05 13:04:53.511798 master-0 kubenswrapper[29936]: I1205 13:04:53.511739 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jj4cf"] Dec 05 13:04:53.605638 master-0 kubenswrapper[29936]: I1205 13:04:53.605371 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 13:04:53.737583 master-0 kubenswrapper[29936]: I1205 13:04:53.737058 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-4q9lh"] Dec 05 13:04:53.740452 master-0 kubenswrapper[29936]: I1205 13:04:53.740407 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:53.749446 master-0 kubenswrapper[29936]: I1205 13:04:53.746172 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 05 13:04:53.757264 master-0 kubenswrapper[29936]: I1205 13:04:53.756688 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4q9lh"] Dec 05 13:04:53.909366 master-0 kubenswrapper[29936]: I1205 13:04:53.905882 29936 generic.go:334] "Generic (PLEG): container finished" podID="d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9" containerID="4af8ece6fa070cf8bebc5833a18566235a7b94c9ea805900c5e2c109dd20e6a4" exitCode=0 Dec 05 13:04:53.910262 master-0 kubenswrapper[29936]: I1205 13:04:53.908360 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658bb5765c-f5crt" event={"ID":"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9","Type":"ContainerDied","Data":"4af8ece6fa070cf8bebc5833a18566235a7b94c9ea805900c5e2c109dd20e6a4"} Dec 05 13:04:53.914920 master-0 kubenswrapper[29936]: I1205 13:04:53.914590 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9bff68687-qw5gj"] Dec 05 13:04:53.921714 master-0 kubenswrapper[29936]: I1205 13:04:53.921642 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8b8fe631-5d8c-457a-bfb2-279b0526552f","Type":"ContainerStarted","Data":"928d23dba40f64847e3c88ef1bd6a20bf46acd839854a2d1604cd5073caae5b4"} Dec 05 13:04:53.941353 master-0 kubenswrapper[29936]: I1205 13:04:53.934729 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 05 13:04:53.941353 master-0 kubenswrapper[29936]: I1205 13:04:53.937644 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d901429c-c54b-4d4f-95fd-28edc0f91d91-ovn-rundir\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:53.941353 master-0 kubenswrapper[29936]: I1205 13:04:53.937641 29936 generic.go:334] "Generic (PLEG): container finished" podID="6bec31da-6702-489f-9b71-abf49a5607c9" containerID="772b3a69fc9183378eb67b102489f5152139f796687fc45451bec91515f34021" exitCode=0 Dec 05 13:04:53.941353 master-0 kubenswrapper[29936]: I1205 13:04:53.937858 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d901429c-c54b-4d4f-95fd-28edc0f91d91-ovs-rundir\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:53.941353 master-0 kubenswrapper[29936]: I1205 13:04:53.937680 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd" event={"ID":"6bec31da-6702-489f-9b71-abf49a5607c9","Type":"ContainerDied","Data":"772b3a69fc9183378eb67b102489f5152139f796687fc45451bec91515f34021"} Dec 05 13:04:53.941353 master-0 kubenswrapper[29936]: I1205 13:04:53.938290 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d901429c-c54b-4d4f-95fd-28edc0f91d91-combined-ca-bundle\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 
13:04:53.941353 master-0 kubenswrapper[29936]: I1205 13:04:53.940892 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tdrt\" (UniqueName: \"kubernetes.io/projected/d901429c-c54b-4d4f-95fd-28edc0f91d91-kube-api-access-2tdrt\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:53.941353 master-0 kubenswrapper[29936]: I1205 13:04:53.940985 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d901429c-c54b-4d4f-95fd-28edc0f91d91-config\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:53.941353 master-0 kubenswrapper[29936]: I1205 13:04:53.941160 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d901429c-c54b-4d4f-95fd-28edc0f91d91-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:53.942248 master-0 kubenswrapper[29936]: I1205 13:04:53.942199 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b4bcf6ff-kbdm6"] Dec 05 13:04:53.947744 master-0 kubenswrapper[29936]: I1205 13:04:53.944621 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:04:53.950373 master-0 kubenswrapper[29936]: I1205 13:04:53.949922 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 05 13:04:53.991867 master-0 kubenswrapper[29936]: I1205 13:04:53.991710 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b4bcf6ff-kbdm6"] Dec 05 13:04:54.047543 master-0 kubenswrapper[29936]: I1205 13:04:54.045651 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d901429c-c54b-4d4f-95fd-28edc0f91d91-ovs-rundir\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:54.047543 master-0 kubenswrapper[29936]: I1205 13:04:54.045790 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d901429c-c54b-4d4f-95fd-28edc0f91d91-combined-ca-bundle\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:54.047543 master-0 kubenswrapper[29936]: I1205 13:04:54.045873 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-ovsdbserver-nb\") pod \"dnsmasq-dns-5b4bcf6ff-kbdm6\" (UID: \"38517b6c-32f7-4853-adb2-e40488c6bb56\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:04:54.047543 master-0 kubenswrapper[29936]: I1205 13:04:54.045912 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-dns-svc\") pod \"dnsmasq-dns-5b4bcf6ff-kbdm6\" (UID: 
\"38517b6c-32f7-4853-adb2-e40488c6bb56\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:04:54.047543 master-0 kubenswrapper[29936]: I1205 13:04:54.045911 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/d901429c-c54b-4d4f-95fd-28edc0f91d91-ovs-rundir\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:54.047543 master-0 kubenswrapper[29936]: I1205 13:04:54.046119 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-config\") pod \"dnsmasq-dns-5b4bcf6ff-kbdm6\" (UID: \"38517b6c-32f7-4853-adb2-e40488c6bb56\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:04:54.047543 master-0 kubenswrapper[29936]: I1205 13:04:54.046156 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tdrt\" (UniqueName: \"kubernetes.io/projected/d901429c-c54b-4d4f-95fd-28edc0f91d91-kube-api-access-2tdrt\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:54.047543 master-0 kubenswrapper[29936]: I1205 13:04:54.046206 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d901429c-c54b-4d4f-95fd-28edc0f91d91-config\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:54.047543 master-0 kubenswrapper[29936]: I1205 13:04:54.046264 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d901429c-c54b-4d4f-95fd-28edc0f91d91-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:54.047543 master-0 kubenswrapper[29936]: I1205 13:04:54.046344 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcshv\" (UniqueName: \"kubernetes.io/projected/38517b6c-32f7-4853-adb2-e40488c6bb56-kube-api-access-fcshv\") pod \"dnsmasq-dns-5b4bcf6ff-kbdm6\" (UID: \"38517b6c-32f7-4853-adb2-e40488c6bb56\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:04:54.047543 master-0 kubenswrapper[29936]: I1205 13:04:54.046382 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d901429c-c54b-4d4f-95fd-28edc0f91d91-ovn-rundir\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:54.048485 master-0 kubenswrapper[29936]: I1205 13:04:54.047589 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/d901429c-c54b-4d4f-95fd-28edc0f91d91-ovn-rundir\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:54.048485 master-0 kubenswrapper[29936]: I1205 13:04:54.048454 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d901429c-c54b-4d4f-95fd-28edc0f91d91-config\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:54.050829 master-0 kubenswrapper[29936]: I1205 13:04:54.050792 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d901429c-c54b-4d4f-95fd-28edc0f91d91-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:54.052775 master-0 kubenswrapper[29936]: I1205 13:04:54.052655 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d901429c-c54b-4d4f-95fd-28edc0f91d91-combined-ca-bundle\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:54.141616 master-0 kubenswrapper[29936]: I1205 13:04:54.141541 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tdrt\" (UniqueName: \"kubernetes.io/projected/d901429c-c54b-4d4f-95fd-28edc0f91d91-kube-api-access-2tdrt\") pod \"ovn-controller-metrics-4q9lh\" (UID: \"d901429c-c54b-4d4f-95fd-28edc0f91d91\") " pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:54.144485 master-0 kubenswrapper[29936]: I1205 13:04:54.144381 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.873077491 podStartE2EDuration="30.144352825s" podCreationTimestamp="2025-12-05 13:04:24 +0000 UTC" firstStartedPulling="2025-12-05 13:04:37.53712207 +0000 UTC m=+874.669201751" lastFinishedPulling="2025-12-05 13:04:52.808397394 +0000 UTC m=+889.940477085" observedRunningTime="2025-12-05 13:04:54.129649675 +0000 UTC m=+891.261729366" watchObservedRunningTime="2025-12-05 13:04:54.144352825 +0000 UTC m=+891.276432506" Dec 05 13:04:54.148339 master-0 kubenswrapper[29936]: I1205 13:04:54.148284 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-ovsdbserver-nb\") pod \"dnsmasq-dns-5b4bcf6ff-kbdm6\" (UID: \"38517b6c-32f7-4853-adb2-e40488c6bb56\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:04:54.148410 master-0 kubenswrapper[29936]: I1205 13:04:54.148355 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-dns-svc\") pod \"dnsmasq-dns-5b4bcf6ff-kbdm6\" (UID: \"38517b6c-32f7-4853-adb2-e40488c6bb56\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:04:54.148455 master-0 kubenswrapper[29936]: I1205 13:04:54.148413 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-config\") pod \"dnsmasq-dns-5b4bcf6ff-kbdm6\" (UID: \"38517b6c-32f7-4853-adb2-e40488c6bb56\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:04:54.148536 master-0 kubenswrapper[29936]: I1205 13:04:54.148504 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcshv\" (UniqueName: \"kubernetes.io/projected/38517b6c-32f7-4853-adb2-e40488c6bb56-kube-api-access-fcshv\") pod \"dnsmasq-dns-5b4bcf6ff-kbdm6\" (UID: 
\"38517b6c-32f7-4853-adb2-e40488c6bb56\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:04:54.149480 master-0 kubenswrapper[29936]: I1205 13:04:54.149440 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-dns-svc\") pod \"dnsmasq-dns-5b4bcf6ff-kbdm6\" (UID: \"38517b6c-32f7-4853-adb2-e40488c6bb56\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:04:54.149800 master-0 kubenswrapper[29936]: I1205 13:04:54.149746 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-config\") pod \"dnsmasq-dns-5b4bcf6ff-kbdm6\" (UID: \"38517b6c-32f7-4853-adb2-e40488c6bb56\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:04:54.150693 master-0 kubenswrapper[29936]: I1205 13:04:54.150650 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-ovsdbserver-nb\") pod \"dnsmasq-dns-5b4bcf6ff-kbdm6\" (UID: \"38517b6c-32f7-4853-adb2-e40488c6bb56\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:04:54.217600 master-0 kubenswrapper[29936]: I1205 13:04:54.217497 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcshv\" (UniqueName: \"kubernetes.io/projected/38517b6c-32f7-4853-adb2-e40488c6bb56-kube-api-access-fcshv\") pod \"dnsmasq-dns-5b4bcf6ff-kbdm6\" (UID: \"38517b6c-32f7-4853-adb2-e40488c6bb56\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:04:54.299449 master-0 kubenswrapper[29936]: I1205 13:04:54.296232 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:04:54.316452 master-0 kubenswrapper[29936]: I1205 13:04:54.316347 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 13:04:54.377967 master-0 kubenswrapper[29936]: I1205 13:04:54.377882 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-4q9lh" Dec 05 13:04:55.048313 master-0 kubenswrapper[29936]: W1205 13:04:55.048004 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdf8b04c_25f3_49a7_be24_553190212e0d.slice/crio-d6efe9a06921014878dadf5f1f591a55606ad4a94d4a8b7dfa7eed7bb953838c WatchSource:0}: Error finding container d6efe9a06921014878dadf5f1f591a55606ad4a94d4a8b7dfa7eed7bb953838c: Status 404 returned error can't find the container with id d6efe9a06921014878dadf5f1f591a55606ad4a94d4a8b7dfa7eed7bb953838c Dec 05 13:04:55.382357 master-0 kubenswrapper[29936]: I1205 13:04:55.382277 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd" Dec 05 13:04:55.585206 master-0 kubenswrapper[29936]: I1205 13:04:55.585095 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c44tm\" (UniqueName: \"kubernetes.io/projected/6bec31da-6702-489f-9b71-abf49a5607c9-kube-api-access-c44tm\") pod \"6bec31da-6702-489f-9b71-abf49a5607c9\" (UID: \"6bec31da-6702-489f-9b71-abf49a5607c9\") " Dec 05 13:04:55.585561 master-0 kubenswrapper[29936]: I1205 13:04:55.585457 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bec31da-6702-489f-9b71-abf49a5607c9-config\") pod \"6bec31da-6702-489f-9b71-abf49a5607c9\" (UID: \"6bec31da-6702-489f-9b71-abf49a5607c9\") " Dec 05 13:04:55.632083 master-0 kubenswrapper[29936]: I1205 13:04:55.630108 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bec31da-6702-489f-9b71-abf49a5607c9-kube-api-access-c44tm" (OuterVolumeSpecName: "kube-api-access-c44tm") pod "6bec31da-6702-489f-9b71-abf49a5607c9" (UID: "6bec31da-6702-489f-9b71-abf49a5607c9"). InnerVolumeSpecName "kube-api-access-c44tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:04:55.653821 master-0 kubenswrapper[29936]: I1205 13:04:55.653727 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-4q9lh"] Dec 05 13:04:55.689210 master-0 kubenswrapper[29936]: I1205 13:04:55.689088 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c44tm\" (UniqueName: \"kubernetes.io/projected/6bec31da-6702-489f-9b71-abf49a5607c9-kube-api-access-c44tm\") on node \"master-0\" DevicePath \"\"" Dec 05 13:04:55.807121 master-0 kubenswrapper[29936]: I1205 13:04:55.807003 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b4bcf6ff-kbdm6"] Dec 05 13:04:55.815994 master-0 kubenswrapper[29936]: I1205 13:04:55.815861 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bec31da-6702-489f-9b71-abf49a5607c9-config" (OuterVolumeSpecName: "config") pod "6bec31da-6702-489f-9b71-abf49a5607c9" (UID: "6bec31da-6702-489f-9b71-abf49a5607c9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:04:55.896329 master-0 kubenswrapper[29936]: I1205 13:04:55.896265 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bec31da-6702-489f-9b71-abf49a5607c9-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:04:55.987823 master-0 kubenswrapper[29936]: I1205 13:04:55.987749 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4q9lh" event={"ID":"d901429c-c54b-4d4f-95fd-28edc0f91d91","Type":"ContainerStarted","Data":"cc9decc4c4c8b44eaf5ac3d1857136320ce8d4c3b610b5eb3d70c97391fd8c28"} Dec 05 13:04:55.990521 master-0 kubenswrapper[29936]: I1205 13:04:55.990474 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fb7bb661-3aba-4919-a9ee-32b999cb4f05","Type":"ContainerStarted","Data":"6694bdd9f1db96b597a640c2f14fa96a5ebea4ca5d4bc141a762d9f29071b148"} Dec 05 13:04:55.997757 master-0 kubenswrapper[29936]: I1205 13:04:55.997673 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" event={"ID":"38517b6c-32f7-4853-adb2-e40488c6bb56","Type":"ContainerStarted","Data":"6047516470de64baa1c046efb19f13e8a15c78dbff8d038439b270e80aec5514"} Dec 05 13:04:56.003261 master-0 kubenswrapper[29936]: I1205 13:04:56.003137 29936 generic.go:334] "Generic (PLEG): container finished" podID="6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18" containerID="07882c4735a944de0586b3e0485dc90da42ce2acdd916dc11174c3331a40ea12" exitCode=0 Dec 05 13:04:56.003554 master-0 kubenswrapper[29936]: I1205 13:04:56.003358 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d7c5dbd7-jfj4q" event={"ID":"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18","Type":"ContainerDied","Data":"07882c4735a944de0586b3e0485dc90da42ce2acdd916dc11174c3331a40ea12"} Dec 05 13:04:56.015294 master-0 kubenswrapper[29936]: I1205 13:04:56.008222 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e8454790-c77d-4164-99ff-edbfcdc4c426","Type":"ContainerStarted","Data":"6b835f855acfac3925bf45bb64378a17ebc3fffc7f3b684c8103f8b7a2896dd5"} Dec 05 13:04:56.015294 master-0 kubenswrapper[29936]: I1205 13:04:56.011765 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jj4cf" event={"ID":"fdf8b04c-25f3-49a7-be24-553190212e0d","Type":"ContainerStarted","Data":"d6efe9a06921014878dadf5f1f591a55606ad4a94d4a8b7dfa7eed7bb953838c"} Dec 05 13:04:56.015996 master-0 kubenswrapper[29936]: I1205 13:04:56.015692 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd" Dec 05 13:04:56.015996 master-0 kubenswrapper[29936]: I1205 13:04:56.015736 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd" event={"ID":"6bec31da-6702-489f-9b71-abf49a5607c9","Type":"ContainerDied","Data":"37480124f66d030ae536fdc7a790f51e72de9587206a9acfb72d3c565f86947b"} Dec 05 13:04:56.015996 master-0 kubenswrapper[29936]: I1205 13:04:56.015810 29936 scope.go:117] "RemoveContainer" containerID="772b3a69fc9183378eb67b102489f5152139f796687fc45451bec91515f34021" Dec 05 13:04:56.022211 master-0 kubenswrapper[29936]: I1205 13:04:56.022130 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658bb5765c-f5crt" event={"ID":"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9","Type":"ContainerStarted","Data":"e9154adb68c11112d543434edba715d3927482d5c7d621fc80c6843c062510d5"} Dec 05 13:04:56.022631 master-0 kubenswrapper[29936]: I1205 13:04:56.022543 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-658bb5765c-f5crt" Dec 05 13:04:56.024986 master-0 kubenswrapper[29936]: I1205 13:04:56.024946 29936 generic.go:334] "Generic (PLEG): container finished" podID="dd6f05c5-1783-4c1f-83b8-130470139655" containerID="e1ae587701d4b5952619039253f971101ca6b1deb048457317b2a9f8843686aa" exitCode=0 Dec 05 13:04:56.025272 master-0 kubenswrapper[29936]: I1205 13:04:56.025020 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9bff68687-qw5gj" event={"ID":"dd6f05c5-1783-4c1f-83b8-130470139655","Type":"ContainerDied","Data":"e1ae587701d4b5952619039253f971101ca6b1deb048457317b2a9f8843686aa"} Dec 05 13:04:56.028139 master-0 kubenswrapper[29936]: I1205 13:04:56.028096 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"819b38a4-91fa-4657-92a7-f00475bdb566","Type":"ContainerStarted","Data":"403bcfb449614689091c81042d4d0e5def3774526ac91b57396ffcb2e7384d8d"} Dec 05 13:04:56.053589 master-0 kubenswrapper[29936]: I1205 13:04:56.035402 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f8fxj" event={"ID":"1aaf3eff-076d-42cd-a86c-9e5af7a38664","Type":"ContainerStarted","Data":"45f31cc076e2f963f71ea843d2c755ac6a5dbe838ac9038ef6a4131035e604e8"} Dec 05 13:04:56.100593 master-0 kubenswrapper[29936]: I1205 13:04:56.100394 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-658bb5765c-f5crt" podStartSLOduration=4.823766277 podStartE2EDuration="36.100364863s" podCreationTimestamp="2025-12-05 13:04:20 +0000 UTC" firstStartedPulling="2025-12-05 13:04:21.529814845 +0000 UTC m=+858.661894536" lastFinishedPulling="2025-12-05 13:04:52.806413451 +0000 UTC m=+889.938493122" observedRunningTime="2025-12-05 13:04:56.092373126 +0000 UTC m=+893.224452817" watchObservedRunningTime="2025-12-05 13:04:56.100364863 +0000 UTC m=+893.232444554" Dec 05 13:04:56.324544 master-0 kubenswrapper[29936]: I1205 13:04:56.324279 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd"] Dec 05 13:04:56.374375 master-0 kubenswrapper[29936]: I1205 13:04:56.374300 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-n2wtd"] Dec 05 13:04:56.708418 master-0 kubenswrapper[29936]: I1205 13:04:56.708367 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75d7c5dbd7-jfj4q" Dec 05 13:04:56.825221 master-0 kubenswrapper[29936]: I1205 13:04:56.824681 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-config\") pod \"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18\" (UID: \"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18\") " Dec 05 13:04:56.825221 master-0 kubenswrapper[29936]: I1205 13:04:56.824812 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-dns-svc\") pod \"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18\" (UID: \"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18\") " Dec 05 13:04:56.825221 master-0 kubenswrapper[29936]: I1205 13:04:56.825093 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb2ln\" (UniqueName: \"kubernetes.io/projected/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-kube-api-access-gb2ln\") pod \"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18\" (UID: \"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18\") " Dec 05 13:04:56.836526 master-0 kubenswrapper[29936]: I1205 13:04:56.836334 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9bff68687-qw5gj" Dec 05 13:04:56.927544 master-0 kubenswrapper[29936]: I1205 13:04:56.927446 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59597\" (UniqueName: \"kubernetes.io/projected/dd6f05c5-1783-4c1f-83b8-130470139655-kube-api-access-59597\") pod \"dd6f05c5-1783-4c1f-83b8-130470139655\" (UID: \"dd6f05c5-1783-4c1f-83b8-130470139655\") " Dec 05 13:04:56.927945 master-0 kubenswrapper[29936]: I1205 13:04:56.927908 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd6f05c5-1783-4c1f-83b8-130470139655-dns-svc\") pod \"dd6f05c5-1783-4c1f-83b8-130470139655\" (UID: \"dd6f05c5-1783-4c1f-83b8-130470139655\") " Dec 05 13:04:56.928273 master-0 kubenswrapper[29936]: I1205 13:04:56.928249 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd6f05c5-1783-4c1f-83b8-130470139655-config\") pod \"dd6f05c5-1783-4c1f-83b8-130470139655\" (UID: \"dd6f05c5-1783-4c1f-83b8-130470139655\") " Dec 05 13:04:57.028058 master-0 kubenswrapper[29936]: I1205 13:04:57.027717 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-kube-api-access-gb2ln" (OuterVolumeSpecName: "kube-api-access-gb2ln") pod "6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18" (UID: "6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18"). InnerVolumeSpecName "kube-api-access-gb2ln". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:04:57.029116 master-0 kubenswrapper[29936]: I1205 13:04:57.029013 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6f05c5-1783-4c1f-83b8-130470139655-kube-api-access-59597" (OuterVolumeSpecName: "kube-api-access-59597") pod "dd6f05c5-1783-4c1f-83b8-130470139655" (UID: "dd6f05c5-1783-4c1f-83b8-130470139655"). InnerVolumeSpecName "kube-api-access-59597". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:04:57.030919 master-0 kubenswrapper[29936]: I1205 13:04:57.030873 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb2ln\" (UniqueName: \"kubernetes.io/projected/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-kube-api-access-gb2ln\") on node \"master-0\" DevicePath \"\"" Dec 05 13:04:57.030919 master-0 kubenswrapper[29936]: I1205 13:04:57.030907 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59597\" (UniqueName: \"kubernetes.io/projected/dd6f05c5-1783-4c1f-83b8-130470139655-kube-api-access-59597\") on node \"master-0\" DevicePath \"\"" Dec 05 13:04:57.053324 master-0 kubenswrapper[29936]: I1205 13:04:57.053156 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd6f05c5-1783-4c1f-83b8-130470139655-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd6f05c5-1783-4c1f-83b8-130470139655" (UID: "dd6f05c5-1783-4c1f-83b8-130470139655"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:04:57.057637 master-0 kubenswrapper[29936]: I1205 13:04:57.056205 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-config" (OuterVolumeSpecName: "config") pod "6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18" (UID: "6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:04:57.075120 master-0 kubenswrapper[29936]: I1205 13:04:57.074994 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9bff68687-qw5gj" event={"ID":"dd6f05c5-1783-4c1f-83b8-130470139655","Type":"ContainerDied","Data":"eb59d1e9d1713d7debb8653fad051f97d3b5718e956cafbca8f3bbedf4803720"} Dec 05 13:04:57.075120 master-0 kubenswrapper[29936]: I1205 13:04:57.075002 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9bff68687-qw5gj" Dec 05 13:04:57.075714 master-0 kubenswrapper[29936]: I1205 13:04:57.075079 29936 scope.go:117] "RemoveContainer" containerID="e1ae587701d4b5952619039253f971101ca6b1deb048457317b2a9f8843686aa" Dec 05 13:04:57.076060 master-0 kubenswrapper[29936]: I1205 13:04:57.075065 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18" (UID: "6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:04:57.078401 master-0 kubenswrapper[29936]: I1205 13:04:57.076835 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd6f05c5-1783-4c1f-83b8-130470139655-config" (OuterVolumeSpecName: "config") pod "dd6f05c5-1783-4c1f-83b8-130470139655" (UID: "dd6f05c5-1783-4c1f-83b8-130470139655"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:04:57.083731 master-0 kubenswrapper[29936]: I1205 13:04:57.083569 29936 generic.go:334] "Generic (PLEG): container finished" podID="38517b6c-32f7-4853-adb2-e40488c6bb56" containerID="7150a5ed22d252b19d646ed30056deb182aadec565f86fcb83b20ced5792a7a9" exitCode=0 Dec 05 13:04:57.083830 master-0 kubenswrapper[29936]: I1205 13:04:57.083696 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" event={"ID":"38517b6c-32f7-4853-adb2-e40488c6bb56","Type":"ContainerDied","Data":"7150a5ed22d252b19d646ed30056deb182aadec565f86fcb83b20ced5792a7a9"} Dec 05 13:04:57.089351 master-0 kubenswrapper[29936]: I1205 13:04:57.089209 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75d7c5dbd7-jfj4q" Dec 05 13:04:57.089351 master-0 kubenswrapper[29936]: I1205 13:04:57.089215 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d7c5dbd7-jfj4q" event={"ID":"6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18","Type":"ContainerDied","Data":"0abeaedaf44fb02842cfc2f9b3844218c08c4ef65a140d4884cc817c68378705"} Dec 05 13:04:57.134038 master-0 kubenswrapper[29936]: I1205 13:04:57.133958 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:04:57.134469 master-0 kubenswrapper[29936]: I1205 13:04:57.134453 29936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd6f05c5-1783-4c1f-83b8-130470139655-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:04:57.134606 master-0 kubenswrapper[29936]: I1205 13:04:57.134592 29936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:04:57.134693 master-0 kubenswrapper[29936]: I1205 13:04:57.134680 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd6f05c5-1783-4c1f-83b8-130470139655-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:04:57.216217 master-0 kubenswrapper[29936]: I1205 13:04:57.215870 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bec31da-6702-489f-9b71-abf49a5607c9" path="/var/lib/kubelet/pods/6bec31da-6702-489f-9b71-abf49a5607c9/volumes" Dec 05 13:04:57.286025 master-0 kubenswrapper[29936]: I1205 13:04:57.285394 29936 scope.go:117] "RemoveContainer" containerID="07882c4735a944de0586b3e0485dc90da42ce2acdd916dc11174c3331a40ea12" Dec 05 13:04:57.364282 master-0 kubenswrapper[29936]: I1205 13:04:57.364202 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-jfj4q"] Dec 05 13:04:57.387249 master-0 kubenswrapper[29936]: I1205 13:04:57.387170 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-jfj4q"] Dec 05 13:04:57.610722 master-0 kubenswrapper[29936]: I1205 13:04:57.610484 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9bff68687-qw5gj"] Dec 05 13:04:57.626520 master-0 kubenswrapper[29936]: I1205 13:04:57.626440 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9bff68687-qw5gj"] Dec 05 13:04:58.107502 master-0 kubenswrapper[29936]: I1205 13:04:58.107416 29936 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ac9589fd-c5fd-41de-8018-16c6e272d05e","Type":"ContainerStarted","Data":"de7220b30bf57ca1dde3f7faa8205588136c8638d8e6486d140db7ee4b6fdd36"} Dec 05 13:04:58.118300 master-0 kubenswrapper[29936]: I1205 13:04:58.118202 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a34de370-e400-45de-adbb-09995d9c1953","Type":"ContainerStarted","Data":"16fecadc541306e1fe67fb00b33b73ad354dfeaef29fb1b7118c9cc281d7c540"} Dec 05 13:04:58.123954 master-0 kubenswrapper[29936]: I1205 13:04:58.123893 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" event={"ID":"38517b6c-32f7-4853-adb2-e40488c6bb56","Type":"ContainerStarted","Data":"f5fec10059bed8c9bdc5f2978a0abe16f49285cde3d46eeb107322bdb5684193"} Dec 05 13:04:58.124163 master-0 kubenswrapper[29936]: I1205 13:04:58.124133 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:04:58.253029 master-0 kubenswrapper[29936]: I1205 13:04:58.252859 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" podStartSLOduration=5.252838769 podStartE2EDuration="5.252838769s" podCreationTimestamp="2025-12-05 13:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:04:58.249820277 +0000 UTC m=+895.381899968" watchObservedRunningTime="2025-12-05 13:04:58.252838769 +0000 UTC m=+895.384918450" Dec 05 13:04:59.205090 master-0 kubenswrapper[29936]: I1205 13:04:59.205025 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18" path="/var/lib/kubelet/pods/6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18/volumes" Dec 05 13:04:59.205819 master-0 kubenswrapper[29936]: I1205 13:04:59.205716 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd6f05c5-1783-4c1f-83b8-130470139655" path="/var/lib/kubelet/pods/dd6f05c5-1783-4c1f-83b8-130470139655/volumes" Dec 05 13:05:00.206646 master-0 kubenswrapper[29936]: I1205 13:05:00.206581 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 05 13:05:00.969553 master-0 kubenswrapper[29936]: I1205 13:05:00.969465 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-658bb5765c-f5crt" Dec 05 13:05:04.299338 master-0 kubenswrapper[29936]: I1205 13:05:04.299253 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:05:04.320796 master-0 kubenswrapper[29936]: I1205 13:05:04.320721 29936 trace.go:236] Trace[364109958]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-0" (05-Dec-2025 13:05:03.171) (total time: 1148ms): Dec 05 13:05:04.320796 master-0 kubenswrapper[29936]: Trace[364109958]: [1.148858197s] [1.148858197s] END Dec 05 13:05:04.426886 master-0 kubenswrapper[29936]: I1205 13:05:04.426440 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-f5crt"] Dec 05 13:05:04.426886 master-0 kubenswrapper[29936]: I1205 13:05:04.426813 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-658bb5765c-f5crt" podUID="d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9" containerName="dnsmasq-dns" 
containerID="cri-o://e9154adb68c11112d543434edba715d3927482d5c7d621fc80c6843c062510d5" gracePeriod=10 Dec 05 13:05:05.249124 master-0 kubenswrapper[29936]: I1205 13:05:05.249017 29936 generic.go:334] "Generic (PLEG): container finished" podID="d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9" containerID="e9154adb68c11112d543434edba715d3927482d5c7d621fc80c6843c062510d5" exitCode=0 Dec 05 13:05:05.249124 master-0 kubenswrapper[29936]: I1205 13:05:05.249109 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658bb5765c-f5crt" event={"ID":"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9","Type":"ContainerDied","Data":"e9154adb68c11112d543434edba715d3927482d5c7d621fc80c6843c062510d5"} Dec 05 13:05:05.968293 master-0 kubenswrapper[29936]: I1205 13:05:05.968155 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-658bb5765c-f5crt" podUID="d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.173:5353: connect: connection refused" Dec 05 13:05:06.338432 master-0 kubenswrapper[29936]: I1205 13:05:06.337495 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658bb5765c-f5crt" Dec 05 13:05:06.423062 master-0 kubenswrapper[29936]: I1205 13:05:06.421452 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwmj2\" (UniqueName: \"kubernetes.io/projected/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-kube-api-access-vwmj2\") pod \"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9\" (UID: \"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9\") " Dec 05 13:05:06.423062 master-0 kubenswrapper[29936]: I1205 13:05:06.421820 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-config\") pod \"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9\" (UID: \"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9\") " Dec 05 13:05:06.423062 master-0 kubenswrapper[29936]: I1205 13:05:06.421842 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-dns-svc\") pod \"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9\" (UID: \"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9\") " Dec 05 13:05:06.432448 master-0 kubenswrapper[29936]: I1205 13:05:06.432355 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-kube-api-access-vwmj2" (OuterVolumeSpecName: "kube-api-access-vwmj2") pod "d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9" (UID: "d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9"). InnerVolumeSpecName "kube-api-access-vwmj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:05:06.527710 master-0 kubenswrapper[29936]: I1205 13:05:06.527655 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwmj2\" (UniqueName: \"kubernetes.io/projected/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-kube-api-access-vwmj2\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:06.542943 master-0 kubenswrapper[29936]: I1205 13:05:06.542858 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9" (UID: "d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:06.631439 master-0 kubenswrapper[29936]: I1205 13:05:06.630459 29936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:06.636247 master-0 kubenswrapper[29936]: I1205 13:05:06.635073 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-config" (OuterVolumeSpecName: "config") pod "d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9" (UID: "d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:06.734281 master-0 kubenswrapper[29936]: I1205 13:05:06.732954 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:07.302148 master-0 kubenswrapper[29936]: I1205 13:05:07.302079 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658bb5765c-f5crt" Dec 05 13:05:08.009007 master-0 kubenswrapper[29936]: I1205 13:05:08.008927 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 05 13:05:08.009007 master-0 kubenswrapper[29936]: I1205 13:05:08.008989 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jj4cf" event={"ID":"fdf8b04c-25f3-49a7-be24-553190212e0d","Type":"ContainerStarted","Data":"eac71f412055279fbfc6510bea7af94569c440e7d7c0dd5a3c8fa25764f77c9c"} Dec 05 13:05:08.009007 master-0 kubenswrapper[29936]: I1205 13:05:08.009014 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658bb5765c-f5crt" event={"ID":"d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9","Type":"ContainerDied","Data":"333f7afe247c70d1cd8e079a3c4d4b7d0323c4df655b38b3810109a6584b584e"} Dec 05 13:05:08.009514 master-0 kubenswrapper[29936]: I1205 13:05:08.009033 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-4q9lh" event={"ID":"d901429c-c54b-4d4f-95fd-28edc0f91d91","Type":"ContainerStarted","Data":"3d6275bd82c14c08e90adbfae7a21532e4639c032ba41ef9944cf4ed4e9f08ab"} Dec 05 13:05:08.009514 master-0 kubenswrapper[29936]: I1205 13:05:08.009047 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"819b38a4-91fa-4657-92a7-f00475bdb566","Type":"ContainerStarted","Data":"cbaf462301620ec52e88bf88982db4bad9de7426337db26ebcfd889846a06f29"} Dec 05 13:05:08.009514 master-0 kubenswrapper[29936]: I1205 13:05:08.009060 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-f8fxj" Dec 05 13:05:08.009514 master-0 kubenswrapper[29936]: I1205 13:05:08.009073 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e8454790-c77d-4164-99ff-edbfcdc4c426","Type":"ContainerStarted","Data":"c6b6ad69f0c8f5baaf58bb10ba156899e4f12759f1e395b91ab8d64230de6dbb"} Dec 05 13:05:08.009514 master-0 kubenswrapper[29936]: I1205 13:05:08.009084 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e8454790-c77d-4164-99ff-edbfcdc4c426","Type":"ContainerStarted","Data":"0fb80266432295f937cd332dddabef25155b5fe317824ea3b53172322b9213e7"} Dec 05 
13:05:08.009514 master-0 kubenswrapper[29936]: I1205 13:05:08.009097 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f","Type":"ContainerStarted","Data":"75fbeb5117c8b5642489a0b17dc96cc2044af7fbf03cd7f13f15156fe8c97673"} Dec 05 13:05:08.009514 master-0 kubenswrapper[29936]: I1205 13:05:08.009108 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f8fxj" event={"ID":"1aaf3eff-076d-42cd-a86c-9e5af7a38664","Type":"ContainerStarted","Data":"fec39bba4bc33fcd2e10cf6f8b6994124bb0f6772c34e0625086e24049bd167b"} Dec 05 13:05:08.009514 master-0 kubenswrapper[29936]: I1205 13:05:08.009117 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fb7bb661-3aba-4919-a9ee-32b999cb4f05","Type":"ContainerStarted","Data":"00eabf937a5ccd8e8e53f21e57f115d320ba84227f3bf1ec583da86cc7ec9a41"} Dec 05 13:05:08.009514 master-0 kubenswrapper[29936]: I1205 13:05:08.009426 29936 scope.go:117] "RemoveContainer" containerID="e9154adb68c11112d543434edba715d3927482d5c7d621fc80c6843c062510d5" Dec 05 13:05:08.033835 master-0 kubenswrapper[29936]: I1205 13:05:08.033763 29936 scope.go:117] "RemoveContainer" containerID="4af8ece6fa070cf8bebc5833a18566235a7b94c9ea805900c5e2c109dd20e6a4" Dec 05 13:05:08.346357 master-0 kubenswrapper[29936]: I1205 13:05:08.346218 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"819b38a4-91fa-4657-92a7-f00475bdb566","Type":"ContainerStarted","Data":"d7e277614971bcb2d791905a7b243e8c18852fc325f2a5b054584f7df4acd2c9"} Dec 05 13:05:08.984380 master-0 kubenswrapper[29936]: I1205 13:05:08.983531 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-574b77955f-m4xb5"] Dec 05 13:05:08.984924 master-0 kubenswrapper[29936]: E1205 13:05:08.984844 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9" containerName="init" Dec 05 13:05:08.984924 master-0 kubenswrapper[29936]: I1205 13:05:08.984866 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9" containerName="init" Dec 05 13:05:08.984924 master-0 kubenswrapper[29936]: E1205 13:05:08.984914 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6f05c5-1783-4c1f-83b8-130470139655" containerName="init" Dec 05 13:05:08.984924 master-0 kubenswrapper[29936]: I1205 13:05:08.984921 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6f05c5-1783-4c1f-83b8-130470139655" containerName="init" Dec 05 13:05:08.985087 master-0 kubenswrapper[29936]: E1205 13:05:08.984956 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9" containerName="dnsmasq-dns" Dec 05 13:05:08.985087 master-0 kubenswrapper[29936]: I1205 13:05:08.984962 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9" containerName="dnsmasq-dns" Dec 05 13:05:08.985087 master-0 kubenswrapper[29936]: E1205 13:05:08.985009 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bec31da-6702-489f-9b71-abf49a5607c9" containerName="init" Dec 05 13:05:08.985087 master-0 kubenswrapper[29936]: I1205 13:05:08.985015 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bec31da-6702-489f-9b71-abf49a5607c9" containerName="init" Dec 05 13:05:08.985087 master-0 kubenswrapper[29936]: E1205 13:05:08.985043 29936 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18" containerName="init" Dec 05 13:05:08.985087 master-0 kubenswrapper[29936]: I1205 13:05:08.985052 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18" containerName="init" Dec 05 13:05:08.985514 master-0 kubenswrapper[29936]: I1205 13:05:08.985492 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bb40fb6-e3b5-49a8-aa7f-f7040d8e7a18" containerName="init" Dec 05 13:05:08.985567 master-0 kubenswrapper[29936]: I1205 13:05:08.985535 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd6f05c5-1783-4c1f-83b8-130470139655" containerName="init" Dec 05 13:05:08.985600 master-0 kubenswrapper[29936]: I1205 13:05:08.985585 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bec31da-6702-489f-9b71-abf49a5607c9" containerName="init" Dec 05 13:05:08.985631 master-0 kubenswrapper[29936]: I1205 13:05:08.985617 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9" containerName="dnsmasq-dns" Dec 05 13:05:08.988200 master-0 kubenswrapper[29936]: I1205 13:05:08.987967 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-574b77955f-m4xb5" Dec 05 13:05:09.057628 master-0 kubenswrapper[29936]: I1205 13:05:09.051788 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-574b77955f-m4xb5"] Dec 05 13:05:09.082430 master-0 kubenswrapper[29936]: I1205 13:05:09.082318 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-4q9lh" podStartSLOduration=5.606803816 podStartE2EDuration="16.082289633s" podCreationTimestamp="2025-12-05 13:04:53 +0000 UTC" firstStartedPulling="2025-12-05 13:04:55.754058373 +0000 UTC m=+892.886138054" lastFinishedPulling="2025-12-05 13:05:06.22954418 +0000 UTC m=+903.361623871" observedRunningTime="2025-12-05 13:05:09.046860741 +0000 UTC m=+906.178940432" watchObservedRunningTime="2025-12-05 13:05:09.082289633 +0000 UTC m=+906.214369314" Dec 05 13:05:09.147022 master-0 kubenswrapper[29936]: I1205 13:05:09.146929 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-ovsdbserver-nb\") pod \"dnsmasq-dns-574b77955f-m4xb5\" (UID: \"9509cac3-43a8-4592-bd58-a61a7e48001e\") " pod="openstack/dnsmasq-dns-574b77955f-m4xb5" Dec 05 13:05:09.147371 master-0 kubenswrapper[29936]: I1205 13:05:09.147158 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-config\") pod \"dnsmasq-dns-574b77955f-m4xb5\" (UID: \"9509cac3-43a8-4592-bd58-a61a7e48001e\") " pod="openstack/dnsmasq-dns-574b77955f-m4xb5" Dec 05 13:05:09.147371 master-0 kubenswrapper[29936]: I1205 13:05:09.147327 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-dns-svc\") pod \"dnsmasq-dns-574b77955f-m4xb5\" (UID: \"9509cac3-43a8-4592-bd58-a61a7e48001e\") " pod="openstack/dnsmasq-dns-574b77955f-m4xb5" Dec 05 13:05:09.147371 master-0 kubenswrapper[29936]: I1205 13:05:09.147357 29936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lv58\" (UniqueName: \"kubernetes.io/projected/9509cac3-43a8-4592-bd58-a61a7e48001e-kube-api-access-9lv58\") pod \"dnsmasq-dns-574b77955f-m4xb5\" (UID: \"9509cac3-43a8-4592-bd58-a61a7e48001e\") " pod="openstack/dnsmasq-dns-574b77955f-m4xb5" Dec 05 13:05:09.224319 master-0 kubenswrapper[29936]: I1205 13:05:09.221947 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=19.066122966000002 podStartE2EDuration="30.221914317s" podCreationTimestamp="2025-12-05 13:04:39 +0000 UTC" firstStartedPulling="2025-12-05 13:04:55.024032567 +0000 UTC m=+892.156112248" lastFinishedPulling="2025-12-05 13:05:06.179823918 +0000 UTC m=+903.311903599" observedRunningTime="2025-12-05 13:05:09.174861549 +0000 UTC m=+906.306941240" watchObservedRunningTime="2025-12-05 13:05:09.221914317 +0000 UTC m=+906.353993998" Dec 05 13:05:09.246890 master-0 kubenswrapper[29936]: I1205 13:05:09.246730 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-f5crt"] Dec 05 13:05:09.249212 master-0 kubenswrapper[29936]: I1205 13:05:09.249130 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-config\") pod \"dnsmasq-dns-574b77955f-m4xb5\" (UID: \"9509cac3-43a8-4592-bd58-a61a7e48001e\") " pod="openstack/dnsmasq-dns-574b77955f-m4xb5" Dec 05 13:05:09.249606 master-0 kubenswrapper[29936]: I1205 13:05:09.249588 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-dns-svc\") pod \"dnsmasq-dns-574b77955f-m4xb5\" (UID: \"9509cac3-43a8-4592-bd58-a61a7e48001e\") " pod="openstack/dnsmasq-dns-574b77955f-m4xb5" Dec 05 13:05:09.249702 master-0 kubenswrapper[29936]: I1205 13:05:09.249684 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lv58\" (UniqueName: \"kubernetes.io/projected/9509cac3-43a8-4592-bd58-a61a7e48001e-kube-api-access-9lv58\") pod \"dnsmasq-dns-574b77955f-m4xb5\" (UID: \"9509cac3-43a8-4592-bd58-a61a7e48001e\") " pod="openstack/dnsmasq-dns-574b77955f-m4xb5" Dec 05 13:05:09.249995 master-0 kubenswrapper[29936]: I1205 13:05:09.249980 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-ovsdbserver-nb\") pod \"dnsmasq-dns-574b77955f-m4xb5\" (UID: \"9509cac3-43a8-4592-bd58-a61a7e48001e\") " pod="openstack/dnsmasq-dns-574b77955f-m4xb5" Dec 05 13:05:09.251363 master-0 kubenswrapper[29936]: I1205 13:05:09.251343 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-config\") pod \"dnsmasq-dns-574b77955f-m4xb5\" (UID: \"9509cac3-43a8-4592-bd58-a61a7e48001e\") " pod="openstack/dnsmasq-dns-574b77955f-m4xb5" Dec 05 13:05:09.253821 master-0 kubenswrapper[29936]: I1205 13:05:09.253794 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-dns-svc\") pod \"dnsmasq-dns-574b77955f-m4xb5\" (UID: \"9509cac3-43a8-4592-bd58-a61a7e48001e\") " pod="openstack/dnsmasq-dns-574b77955f-m4xb5" Dec 05 13:05:09.256088 master-0 kubenswrapper[29936]: I1205 
13:05:09.256065 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-ovsdbserver-nb\") pod \"dnsmasq-dns-574b77955f-m4xb5\" (UID: \"9509cac3-43a8-4592-bd58-a61a7e48001e\") " pod="openstack/dnsmasq-dns-574b77955f-m4xb5" Dec 05 13:05:09.266114 master-0 kubenswrapper[29936]: I1205 13:05:09.266042 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-f5crt"] Dec 05 13:05:09.274789 master-0 kubenswrapper[29936]: I1205 13:05:09.273328 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-f8fxj" podStartSLOduration=20.185566072 podStartE2EDuration="31.27315078s" podCreationTimestamp="2025-12-05 13:04:38 +0000 UTC" firstStartedPulling="2025-12-05 13:04:55.025832186 +0000 UTC m=+892.157911867" lastFinishedPulling="2025-12-05 13:05:06.113416894 +0000 UTC m=+903.245496575" observedRunningTime="2025-12-05 13:05:09.234972303 +0000 UTC m=+906.367051974" watchObservedRunningTime="2025-12-05 13:05:09.27315078 +0000 UTC m=+906.405230461" Dec 05 13:05:09.285000 master-0 kubenswrapper[29936]: I1205 13:05:09.284942 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lv58\" (UniqueName: \"kubernetes.io/projected/9509cac3-43a8-4592-bd58-a61a7e48001e-kube-api-access-9lv58\") pod \"dnsmasq-dns-574b77955f-m4xb5\" (UID: \"9509cac3-43a8-4592-bd58-a61a7e48001e\") " pod="openstack/dnsmasq-dns-574b77955f-m4xb5" Dec 05 13:05:09.370271 master-0 kubenswrapper[29936]: I1205 13:05:09.370166 29936 generic.go:334] "Generic (PLEG): container finished" podID="fdf8b04c-25f3-49a7-be24-553190212e0d" containerID="eac71f412055279fbfc6510bea7af94569c440e7d7c0dd5a3c8fa25764f77c9c" exitCode=0 Dec 05 13:05:09.370959 master-0 kubenswrapper[29936]: I1205 13:05:09.370545 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jj4cf" event={"ID":"fdf8b04c-25f3-49a7-be24-553190212e0d","Type":"ContainerDied","Data":"eac71f412055279fbfc6510bea7af94569c440e7d7c0dd5a3c8fa25764f77c9c"} Dec 05 13:05:09.374060 master-0 kubenswrapper[29936]: I1205 13:05:09.373411 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-574b77955f-m4xb5" Dec 05 13:05:09.402806 master-0 kubenswrapper[29936]: I1205 13:05:09.402584 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=21.256261132 podStartE2EDuration="32.402552405s" podCreationTimestamp="2025-12-05 13:04:37 +0000 UTC" firstStartedPulling="2025-12-05 13:04:55.032848977 +0000 UTC m=+892.164928658" lastFinishedPulling="2025-12-05 13:05:06.17914026 +0000 UTC m=+903.311219931" observedRunningTime="2025-12-05 13:05:09.396609204 +0000 UTC m=+906.528688895" watchObservedRunningTime="2025-12-05 13:05:09.402552405 +0000 UTC m=+906.534632086" Dec 05 13:05:09.483657 master-0 kubenswrapper[29936]: I1205 13:05:09.472887 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-574b77955f-m4xb5"] Dec 05 13:05:09.554378 master-0 kubenswrapper[29936]: I1205 13:05:09.548572 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d4d74cb79-bp8lq"] Dec 05 13:05:09.554378 master-0 kubenswrapper[29936]: I1205 13:05:09.550622 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:09.562606 master-0 kubenswrapper[29936]: I1205 13:05:09.558430 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 05 13:05:09.599224 master-0 kubenswrapper[29936]: I1205 13:05:09.592736 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d4d74cb79-bp8lq"] Dec 05 13:05:09.670269 master-0 kubenswrapper[29936]: I1205 13:05:09.662769 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-ovsdbserver-nb\") pod \"dnsmasq-dns-7d4d74cb79-bp8lq\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:09.670269 master-0 kubenswrapper[29936]: I1205 13:05:09.662908 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-dns-svc\") pod \"dnsmasq-dns-7d4d74cb79-bp8lq\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:09.670269 master-0 kubenswrapper[29936]: I1205 13:05:09.662940 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-ovsdbserver-sb\") pod \"dnsmasq-dns-7d4d74cb79-bp8lq\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:09.670269 master-0 kubenswrapper[29936]: I1205 13:05:09.662983 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-config\") pod \"dnsmasq-dns-7d4d74cb79-bp8lq\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:09.670269 master-0 kubenswrapper[29936]: I1205 13:05:09.663015 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbn7q\" (UniqueName: \"kubernetes.io/projected/259e2263-d738-441d-adc4-e0aab272b64a-kube-api-access-pbn7q\") pod \"dnsmasq-dns-7d4d74cb79-bp8lq\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:09.670269 master-0 kubenswrapper[29936]: I1205 13:05:09.663198 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 05 13:05:09.766840 master-0 kubenswrapper[29936]: I1205 13:05:09.765328 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-dns-svc\") pod \"dnsmasq-dns-7d4d74cb79-bp8lq\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:09.766840 master-0 kubenswrapper[29936]: I1205 13:05:09.765420 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-ovsdbserver-sb\") pod \"dnsmasq-dns-7d4d74cb79-bp8lq\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:09.766840 master-0 kubenswrapper[29936]: I1205 
13:05:09.765467 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-config\") pod \"dnsmasq-dns-7d4d74cb79-bp8lq\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:09.766840 master-0 kubenswrapper[29936]: I1205 13:05:09.765502 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbn7q\" (UniqueName: \"kubernetes.io/projected/259e2263-d738-441d-adc4-e0aab272b64a-kube-api-access-pbn7q\") pod \"dnsmasq-dns-7d4d74cb79-bp8lq\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:09.766840 master-0 kubenswrapper[29936]: I1205 13:05:09.765574 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-ovsdbserver-nb\") pod \"dnsmasq-dns-7d4d74cb79-bp8lq\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:09.766840 master-0 kubenswrapper[29936]: I1205 13:05:09.766665 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-ovsdbserver-nb\") pod \"dnsmasq-dns-7d4d74cb79-bp8lq\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:09.769831 master-0 kubenswrapper[29936]: I1205 13:05:09.769210 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-dns-svc\") pod \"dnsmasq-dns-7d4d74cb79-bp8lq\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:09.769831 master-0 kubenswrapper[29936]: I1205 13:05:09.769624 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-ovsdbserver-sb\") pod \"dnsmasq-dns-7d4d74cb79-bp8lq\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:09.769831 master-0 kubenswrapper[29936]: I1205 13:05:09.769682 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-config\") pod \"dnsmasq-dns-7d4d74cb79-bp8lq\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:09.807437 master-0 kubenswrapper[29936]: I1205 13:05:09.805113 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbn7q\" (UniqueName: \"kubernetes.io/projected/259e2263-d738-441d-adc4-e0aab272b64a-kube-api-access-pbn7q\") pod \"dnsmasq-dns-7d4d74cb79-bp8lq\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:10.001269 master-0 kubenswrapper[29936]: I1205 13:05:10.000871 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:10.202442 master-0 kubenswrapper[29936]: I1205 13:05:10.202285 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-574b77955f-m4xb5"] Dec 05 13:05:10.391461 master-0 kubenswrapper[29936]: I1205 13:05:10.391287 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-574b77955f-m4xb5" event={"ID":"9509cac3-43a8-4592-bd58-a61a7e48001e","Type":"ContainerStarted","Data":"06fd4079ba8fd586ee22efcdbd4ac09f812b6becc20bb7807697db1954f8b9c4"} Dec 05 13:05:10.398170 master-0 kubenswrapper[29936]: I1205 13:05:10.398114 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jj4cf" event={"ID":"fdf8b04c-25f3-49a7-be24-553190212e0d","Type":"ContainerStarted","Data":"4e91fdadd6987ae15d4da54e0fc186a03476c27ff39790cc3f5271c07097109b"} Dec 05 13:05:10.543851 master-0 kubenswrapper[29936]: I1205 13:05:10.540136 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d4d74cb79-bp8lq"] Dec 05 13:05:10.551876 master-0 kubenswrapper[29936]: I1205 13:05:10.551797 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 05 13:05:10.571882 master-0 kubenswrapper[29936]: W1205 13:05:10.571804 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod259e2263_d738_441d_adc4_e0aab272b64a.slice/crio-f6490271d9f5a822ef63775fc6a84b405fea934aa9587bcc2623bf8b75549502 WatchSource:0}: Error finding container f6490271d9f5a822ef63775fc6a84b405fea934aa9587bcc2623bf8b75549502: Status 404 returned error can't find the container with id f6490271d9f5a822ef63775fc6a84b405fea934aa9587bcc2623bf8b75549502 Dec 05 13:05:10.626057 master-0 kubenswrapper[29936]: I1205 13:05:10.625999 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 05 13:05:10.718856 master-0 kubenswrapper[29936]: I1205 13:05:10.718155 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 05 13:05:10.998027 master-0 kubenswrapper[29936]: I1205 13:05:10.992321 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 05 13:05:11.010017 master-0 kubenswrapper[29936]: I1205 13:05:11.009302 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 05 13:05:11.018790 master-0 kubenswrapper[29936]: I1205 13:05:11.016019 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 05 13:05:11.021765 master-0 kubenswrapper[29936]: I1205 13:05:11.021694 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 05 13:05:11.029279 master-0 kubenswrapper[29936]: I1205 13:05:11.027907 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 05 13:05:11.029279 master-0 kubenswrapper[29936]: I1205 13:05:11.028313 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 05 13:05:11.115976 master-0 kubenswrapper[29936]: I1205 13:05:11.115265 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-lock\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:11.115976 master-0 kubenswrapper[29936]: I1205 13:05:11.115419 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:11.115976 master-0 kubenswrapper[29936]: I1205 13:05:11.115491 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-cache\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:11.115976 master-0 kubenswrapper[29936]: I1205 13:05:11.115590 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-63fc0da1-e6b9-464b-bbbe-4b317a31556b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^93739d1f-9dc1-45a8-a69c-de980ae819fc\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:11.115976 master-0 kubenswrapper[29936]: I1205 13:05:11.115651 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2hk6\" (UniqueName: \"kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-kube-api-access-g2hk6\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:11.203673 master-0 kubenswrapper[29936]: I1205 13:05:11.203593 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9" path="/var/lib/kubelet/pods/d8c30e46-aaad-4e4b-b6f7-2ec7661ee0b9/volumes" Dec 05 13:05:11.220139 master-0 kubenswrapper[29936]: I1205 13:05:11.220038 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:11.220537 master-0 kubenswrapper[29936]: I1205 13:05:11.220159 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-cache\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:11.220537 master-0 kubenswrapper[29936]: I1205 13:05:11.220234 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-63fc0da1-e6b9-464b-bbbe-4b317a31556b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^93739d1f-9dc1-45a8-a69c-de980ae819fc\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:11.220537 master-0 kubenswrapper[29936]: I1205 13:05:11.220291 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2hk6\" (UniqueName: \"kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-kube-api-access-g2hk6\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:11.220537 master-0 kubenswrapper[29936]: I1205 13:05:11.220400 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-lock\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:11.221101 master-0 kubenswrapper[29936]: E1205 13:05:11.221068 29936 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 13:05:11.221101 master-0 kubenswrapper[29936]: E1205 13:05:11.221091 29936 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 13:05:11.221251 master-0 kubenswrapper[29936]: E1205 13:05:11.221168 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift podName:8668c9c4-68f4-4224-9395-0f2ac85b9f1d nodeName:}" failed. No retries permitted until 2025-12-05 13:05:11.721139979 +0000 UTC m=+908.853219650 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift") pod "swift-storage-0" (UID: "8668c9c4-68f4-4224-9395-0f2ac85b9f1d") : configmap "swift-ring-files" not found Dec 05 13:05:11.221683 master-0 kubenswrapper[29936]: I1205 13:05:11.221638 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-cache\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:11.223213 master-0 kubenswrapper[29936]: I1205 13:05:11.223143 29936 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 13:05:11.223311 master-0 kubenswrapper[29936]: I1205 13:05:11.223230 29936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-63fc0da1-e6b9-464b-bbbe-4b317a31556b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^93739d1f-9dc1-45a8-a69c-de980ae819fc\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/e1621a7a02b9c8ee871cf7006ea4b3d3930a3504a6e554abe8686e1f2f937847/globalmount\"" pod="openstack/swift-storage-0" Dec 05 13:05:11.233495 master-0 kubenswrapper[29936]: I1205 13:05:11.233168 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-lock\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:11.247724 master-0 kubenswrapper[29936]: I1205 13:05:11.247658 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2hk6\" (UniqueName: \"kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-kube-api-access-g2hk6\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:11.412354 master-0 kubenswrapper[29936]: I1205 13:05:11.412286 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jj4cf" event={"ID":"fdf8b04c-25f3-49a7-be24-553190212e0d","Type":"ContainerStarted","Data":"87b24a9c058e7d25492464d04b46a70345aa5917f5fe7895919961c17d6ccfed"} Dec 05 13:05:11.413894 master-0 kubenswrapper[29936]: I1205 13:05:11.413858 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:05:11.413894 master-0 kubenswrapper[29936]: I1205 13:05:11.413891 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:05:11.419890 master-0 kubenswrapper[29936]: I1205 13:05:11.419812 29936 generic.go:334] "Generic (PLEG): container finished" podID="259e2263-d738-441d-adc4-e0aab272b64a" containerID="5b1fcfc50f531fdf41b481ccab7eff00ec8b2cb88e14202d7095dd1327a2f472" exitCode=0 Dec 05 13:05:11.420018 master-0 kubenswrapper[29936]: I1205 13:05:11.419906 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" event={"ID":"259e2263-d738-441d-adc4-e0aab272b64a","Type":"ContainerDied","Data":"5b1fcfc50f531fdf41b481ccab7eff00ec8b2cb88e14202d7095dd1327a2f472"} Dec 05 13:05:11.420018 master-0 kubenswrapper[29936]: I1205 13:05:11.419950 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" event={"ID":"259e2263-d738-441d-adc4-e0aab272b64a","Type":"ContainerStarted","Data":"f6490271d9f5a822ef63775fc6a84b405fea934aa9587bcc2623bf8b75549502"} Dec 05 13:05:11.422516 master-0 kubenswrapper[29936]: I1205 13:05:11.422422 29936 generic.go:334] "Generic (PLEG): container finished" podID="9509cac3-43a8-4592-bd58-a61a7e48001e" containerID="c6fca627dadf152b8fa22464e94343441e052627e2ce308fa6fc05e5793640c3" exitCode=0 Dec 05 13:05:11.422516 master-0 kubenswrapper[29936]: I1205 13:05:11.422476 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-574b77955f-m4xb5" event={"ID":"9509cac3-43a8-4592-bd58-a61a7e48001e","Type":"ContainerDied","Data":"c6fca627dadf152b8fa22464e94343441e052627e2ce308fa6fc05e5793640c3"} Dec 05 13:05:11.422963 master-0 kubenswrapper[29936]: I1205 
13:05:11.422941 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 05 13:05:11.448740 master-0 kubenswrapper[29936]: I1205 13:05:11.448610 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-jj4cf" podStartSLOduration=22.44188018 podStartE2EDuration="33.44858628s" podCreationTimestamp="2025-12-05 13:04:38 +0000 UTC" firstStartedPulling="2025-12-05 13:04:55.10656911 +0000 UTC m=+892.238648791" lastFinishedPulling="2025-12-05 13:05:06.11327521 +0000 UTC m=+903.245354891" observedRunningTime="2025-12-05 13:05:11.438767732 +0000 UTC m=+908.570847413" watchObservedRunningTime="2025-12-05 13:05:11.44858628 +0000 UTC m=+908.580665961" Dec 05 13:05:11.505225 master-0 kubenswrapper[29936]: I1205 13:05:11.503923 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 05 13:05:11.748576 master-0 kubenswrapper[29936]: I1205 13:05:11.742267 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:11.748576 master-0 kubenswrapper[29936]: E1205 13:05:11.742444 29936 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 13:05:11.748576 master-0 kubenswrapper[29936]: E1205 13:05:11.742512 29936 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 13:05:11.748576 master-0 kubenswrapper[29936]: E1205 13:05:11.742624 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift podName:8668c9c4-68f4-4224-9395-0f2ac85b9f1d nodeName:}" failed. No retries permitted until 2025-12-05 13:05:12.742601668 +0000 UTC m=+909.874681349 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift") pod "swift-storage-0" (UID: "8668c9c4-68f4-4224-9395-0f2ac85b9f1d") : configmap "swift-ring-files" not found Dec 05 13:05:11.928467 master-0 kubenswrapper[29936]: I1205 13:05:11.928388 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mb6t5"] Dec 05 13:05:11.930616 master-0 kubenswrapper[29936]: I1205 13:05:11.930392 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:11.934718 master-0 kubenswrapper[29936]: I1205 13:05:11.933795 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 05 13:05:11.934718 master-0 kubenswrapper[29936]: I1205 13:05:11.933845 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 05 13:05:11.934718 master-0 kubenswrapper[29936]: I1205 13:05:11.934218 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 13:05:11.942416 master-0 kubenswrapper[29936]: I1205 13:05:11.942365 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-574b77955f-m4xb5" Dec 05 13:05:11.957349 master-0 kubenswrapper[29936]: I1205 13:05:11.954281 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mb6t5"] Dec 05 13:05:12.057862 master-0 kubenswrapper[29936]: I1205 13:05:12.057668 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-ovsdbserver-nb\") pod \"9509cac3-43a8-4592-bd58-a61a7e48001e\" (UID: \"9509cac3-43a8-4592-bd58-a61a7e48001e\") " Dec 05 13:05:12.057862 master-0 kubenswrapper[29936]: I1205 13:05:12.057830 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-config\") pod \"9509cac3-43a8-4592-bd58-a61a7e48001e\" (UID: \"9509cac3-43a8-4592-bd58-a61a7e48001e\") " Dec 05 13:05:12.058218 master-0 kubenswrapper[29936]: I1205 13:05:12.058154 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-dns-svc\") pod \"9509cac3-43a8-4592-bd58-a61a7e48001e\" (UID: \"9509cac3-43a8-4592-bd58-a61a7e48001e\") " Dec 05 13:05:12.058285 master-0 kubenswrapper[29936]: I1205 13:05:12.058216 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lv58\" (UniqueName: \"kubernetes.io/projected/9509cac3-43a8-4592-bd58-a61a7e48001e-kube-api-access-9lv58\") pod \"9509cac3-43a8-4592-bd58-a61a7e48001e\" (UID: \"9509cac3-43a8-4592-bd58-a61a7e48001e\") " Dec 05 13:05:12.060051 master-0 kubenswrapper[29936]: I1205 13:05:12.059979 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e2997639-45a9-4e46-9bf1-f011f91eeab2-ring-data-devices\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.060266 master-0 kubenswrapper[29936]: I1205 13:05:12.060172 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-swiftconf\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.060360 master-0 kubenswrapper[29936]: I1205 13:05:12.060334 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-dispersionconf\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.060428 master-0 kubenswrapper[29936]: I1205 13:05:12.060403 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89vbs\" (UniqueName: \"kubernetes.io/projected/e2997639-45a9-4e46-9bf1-f011f91eeab2-kube-api-access-89vbs\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.060551 master-0 kubenswrapper[29936]: I1205 13:05:12.060517 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/e2997639-45a9-4e46-9bf1-f011f91eeab2-scripts\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.060909 master-0 kubenswrapper[29936]: I1205 13:05:12.060859 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-combined-ca-bundle\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.061100 master-0 kubenswrapper[29936]: I1205 13:05:12.061063 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e2997639-45a9-4e46-9bf1-f011f91eeab2-etc-swift\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.064455 master-0 kubenswrapper[29936]: I1205 13:05:12.064384 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9509cac3-43a8-4592-bd58-a61a7e48001e-kube-api-access-9lv58" (OuterVolumeSpecName: "kube-api-access-9lv58") pod "9509cac3-43a8-4592-bd58-a61a7e48001e" (UID: "9509cac3-43a8-4592-bd58-a61a7e48001e"). InnerVolumeSpecName "kube-api-access-9lv58". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:05:12.082995 master-0 kubenswrapper[29936]: I1205 13:05:12.082901 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9509cac3-43a8-4592-bd58-a61a7e48001e" (UID: "9509cac3-43a8-4592-bd58-a61a7e48001e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:12.085686 master-0 kubenswrapper[29936]: I1205 13:05:12.085619 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9509cac3-43a8-4592-bd58-a61a7e48001e" (UID: "9509cac3-43a8-4592-bd58-a61a7e48001e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:12.087601 master-0 kubenswrapper[29936]: I1205 13:05:12.087513 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-config" (OuterVolumeSpecName: "config") pod "9509cac3-43a8-4592-bd58-a61a7e48001e" (UID: "9509cac3-43a8-4592-bd58-a61a7e48001e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:12.164008 master-0 kubenswrapper[29936]: I1205 13:05:12.163878 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e2997639-45a9-4e46-9bf1-f011f91eeab2-ring-data-devices\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.164008 master-0 kubenswrapper[29936]: I1205 13:05:12.164015 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-swiftconf\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.164418 master-0 kubenswrapper[29936]: I1205 13:05:12.164058 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-dispersionconf\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.164418 master-0 kubenswrapper[29936]: I1205 13:05:12.164090 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89vbs\" (UniqueName: \"kubernetes.io/projected/e2997639-45a9-4e46-9bf1-f011f91eeab2-kube-api-access-89vbs\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.164418 master-0 kubenswrapper[29936]: I1205 13:05:12.164125 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2997639-45a9-4e46-9bf1-f011f91eeab2-scripts\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.164418 master-0 kubenswrapper[29936]: I1205 13:05:12.164239 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-combined-ca-bundle\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.164418 master-0 kubenswrapper[29936]: I1205 13:05:12.164305 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e2997639-45a9-4e46-9bf1-f011f91eeab2-etc-swift\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.164418 master-0 kubenswrapper[29936]: I1205 13:05:12.164401 29936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:12.164418 master-0 kubenswrapper[29936]: I1205 13:05:12.164418 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lv58\" (UniqueName: \"kubernetes.io/projected/9509cac3-43a8-4592-bd58-a61a7e48001e-kube-api-access-9lv58\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:12.164418 master-0 kubenswrapper[29936]: I1205 13:05:12.164434 29936 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:12.164785 master-0 kubenswrapper[29936]: I1205 13:05:12.164447 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9509cac3-43a8-4592-bd58-a61a7e48001e-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:12.165114 master-0 kubenswrapper[29936]: I1205 13:05:12.165047 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e2997639-45a9-4e46-9bf1-f011f91eeab2-etc-swift\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.165736 master-0 kubenswrapper[29936]: I1205 13:05:12.165657 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e2997639-45a9-4e46-9bf1-f011f91eeab2-ring-data-devices\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.166003 master-0 kubenswrapper[29936]: I1205 13:05:12.165951 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2997639-45a9-4e46-9bf1-f011f91eeab2-scripts\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.169214 master-0 kubenswrapper[29936]: I1205 13:05:12.169163 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-swiftconf\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.169750 master-0 kubenswrapper[29936]: I1205 13:05:12.169662 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-dispersionconf\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.171444 master-0 kubenswrapper[29936]: I1205 13:05:12.171370 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-combined-ca-bundle\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.191360 master-0 kubenswrapper[29936]: I1205 13:05:12.191274 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89vbs\" (UniqueName: \"kubernetes.io/projected/e2997639-45a9-4e46-9bf1-f011f91eeab2-kube-api-access-89vbs\") pod \"swift-ring-rebalance-mb6t5\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.266921 master-0 kubenswrapper[29936]: I1205 13:05:12.266827 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:12.476630 master-0 kubenswrapper[29936]: I1205 13:05:12.474680 29936 generic.go:334] "Generic (PLEG): container finished" podID="e05f116e-3a9f-481a-8ca7-e9f4715f5d7f" containerID="75fbeb5117c8b5642489a0b17dc96cc2044af7fbf03cd7f13f15156fe8c97673" exitCode=0 Dec 05 13:05:12.476630 master-0 kubenswrapper[29936]: I1205 13:05:12.474884 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f","Type":"ContainerDied","Data":"75fbeb5117c8b5642489a0b17dc96cc2044af7fbf03cd7f13f15156fe8c97673"} Dec 05 13:05:12.486839 master-0 kubenswrapper[29936]: I1205 13:05:12.486727 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" event={"ID":"259e2263-d738-441d-adc4-e0aab272b64a","Type":"ContainerStarted","Data":"0547e2b8393ec7541feb08e2b2861a9fd2d114c8f44418cce551dbe6163a957a"} Dec 05 13:05:12.488036 master-0 kubenswrapper[29936]: I1205 13:05:12.487979 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:12.493248 master-0 kubenswrapper[29936]: I1205 13:05:12.490664 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-574b77955f-m4xb5" event={"ID":"9509cac3-43a8-4592-bd58-a61a7e48001e","Type":"ContainerDied","Data":"06fd4079ba8fd586ee22efcdbd4ac09f812b6becc20bb7807697db1954f8b9c4"} Dec 05 13:05:12.493248 master-0 kubenswrapper[29936]: I1205 13:05:12.490730 29936 scope.go:117] "RemoveContainer" containerID="c6fca627dadf152b8fa22464e94343441e052627e2ce308fa6fc05e5793640c3" Dec 05 13:05:12.493248 master-0 kubenswrapper[29936]: I1205 13:05:12.492534 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-574b77955f-m4xb5" Dec 05 13:05:12.623324 master-0 kubenswrapper[29936]: I1205 13:05:12.623253 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-63fc0da1-e6b9-464b-bbbe-4b317a31556b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^93739d1f-9dc1-45a8-a69c-de980ae819fc\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:12.790562 master-0 kubenswrapper[29936]: I1205 13:05:12.790469 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:12.790924 master-0 kubenswrapper[29936]: E1205 13:05:12.790791 29936 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 13:05:12.790924 master-0 kubenswrapper[29936]: E1205 13:05:12.790861 29936 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 13:05:12.791001 master-0 kubenswrapper[29936]: E1205 13:05:12.790946 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift podName:8668c9c4-68f4-4224-9395-0f2ac85b9f1d nodeName:}" failed. No retries permitted until 2025-12-05 13:05:14.790918403 +0000 UTC m=+911.922998084 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift") pod "swift-storage-0" (UID: "8668c9c4-68f4-4224-9395-0f2ac85b9f1d") : configmap "swift-ring-files" not found Dec 05 13:05:13.511513 master-0 kubenswrapper[29936]: I1205 13:05:13.511430 29936 generic.go:334] "Generic (PLEG): container finished" podID="fb7bb661-3aba-4919-a9ee-32b999cb4f05" containerID="00eabf937a5ccd8e8e53f21e57f115d320ba84227f3bf1ec583da86cc7ec9a41" exitCode=0 Dec 05 13:05:13.512262 master-0 kubenswrapper[29936]: I1205 13:05:13.511661 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fb7bb661-3aba-4919-a9ee-32b999cb4f05","Type":"ContainerDied","Data":"00eabf937a5ccd8e8e53f21e57f115d320ba84227f3bf1ec583da86cc7ec9a41"} Dec 05 13:05:13.970047 master-0 kubenswrapper[29936]: I1205 13:05:13.969931 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" podStartSLOduration=4.969897147 podStartE2EDuration="4.969897147s" podCreationTimestamp="2025-12-05 13:05:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:05:13.958637412 +0000 UTC m=+911.090717103" watchObservedRunningTime="2025-12-05 13:05:13.969897147 +0000 UTC m=+911.101976828" Dec 05 13:05:14.003291 master-0 kubenswrapper[29936]: I1205 13:05:14.003001 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mb6t5"] Dec 05 13:05:14.065676 master-0 kubenswrapper[29936]: I1205 13:05:14.063680 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-574b77955f-m4xb5"] Dec 05 13:05:14.099583 master-0 kubenswrapper[29936]: I1205 13:05:14.099520 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-574b77955f-m4xb5"] Dec 05 13:05:14.525756 master-0 kubenswrapper[29936]: I1205 13:05:14.525596 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e05f116e-3a9f-481a-8ca7-e9f4715f5d7f","Type":"ContainerStarted","Data":"4bc19ea5821deb24f36ed5c7d8518d6d0c5481b6ac679ba48feccb16d959ca0d"} Dec 05 13:05:14.527877 master-0 kubenswrapper[29936]: I1205 13:05:14.527794 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mb6t5" event={"ID":"e2997639-45a9-4e46-9bf1-f011f91eeab2","Type":"ContainerStarted","Data":"641dafa31ace0b6e86d1dcbbc2b1cc975b707dd47a1b21218c56728d91bc7b19"} Dec 05 13:05:14.531119 master-0 kubenswrapper[29936]: I1205 13:05:14.531057 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fb7bb661-3aba-4919-a9ee-32b999cb4f05","Type":"ContainerStarted","Data":"d4e72147d05260d64440769d816c3d2078bf2e8ad05e0032c6dc5cd672b88e90"} Dec 05 13:05:14.743504 master-0 kubenswrapper[29936]: I1205 13:05:14.743420 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 05 13:05:14.843971 master-0 kubenswrapper[29936]: I1205 13:05:14.843803 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:14.844252 master-0 kubenswrapper[29936]: E1205 13:05:14.844026 
29936 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 13:05:14.844252 master-0 kubenswrapper[29936]: E1205 13:05:14.844052 29936 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 13:05:14.844252 master-0 kubenswrapper[29936]: E1205 13:05:14.844124 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift podName:8668c9c4-68f4-4224-9395-0f2ac85b9f1d nodeName:}" failed. No retries permitted until 2025-12-05 13:05:18.844103982 +0000 UTC m=+915.976183673 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift") pod "swift-storage-0" (UID: "8668c9c4-68f4-4224-9395-0f2ac85b9f1d") : configmap "swift-ring-files" not found Dec 05 13:05:15.091790 master-0 kubenswrapper[29936]: I1205 13:05:15.091667 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=39.646924132 podStartE2EDuration="53.091633077s" podCreationTimestamp="2025-12-05 13:04:22 +0000 UTC" firstStartedPulling="2025-12-05 13:04:52.734412215 +0000 UTC m=+889.866491896" lastFinishedPulling="2025-12-05 13:05:06.17912116 +0000 UTC m=+903.311200841" observedRunningTime="2025-12-05 13:05:15.081370278 +0000 UTC m=+912.213449969" watchObservedRunningTime="2025-12-05 13:05:15.091633077 +0000 UTC m=+912.223712758" Dec 05 13:05:15.123304 master-0 kubenswrapper[29936]: I1205 13:05:15.123057 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=40.976827299 podStartE2EDuration="52.12303069s" podCreationTimestamp="2025-12-05 13:04:23 +0000 UTC" firstStartedPulling="2025-12-05 13:04:55.032935219 +0000 UTC m=+892.165014890" lastFinishedPulling="2025-12-05 13:05:06.17913861 +0000 UTC m=+903.311218281" observedRunningTime="2025-12-05 13:05:15.111071795 +0000 UTC m=+912.243151476" watchObservedRunningTime="2025-12-05 13:05:15.12303069 +0000 UTC m=+912.255110371" Dec 05 13:05:15.221939 master-0 kubenswrapper[29936]: I1205 13:05:15.221504 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9509cac3-43a8-4592-bd58-a61a7e48001e" path="/var/lib/kubelet/pods/9509cac3-43a8-4592-bd58-a61a7e48001e/volumes" Dec 05 13:05:15.409568 master-0 kubenswrapper[29936]: I1205 13:05:15.405133 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 05 13:05:15.409568 master-0 kubenswrapper[29936]: E1205 13:05:15.405752 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9509cac3-43a8-4592-bd58-a61a7e48001e" containerName="init" Dec 05 13:05:15.409568 master-0 kubenswrapper[29936]: I1205 13:05:15.405770 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9509cac3-43a8-4592-bd58-a61a7e48001e" containerName="init" Dec 05 13:05:15.409568 master-0 kubenswrapper[29936]: I1205 13:05:15.406073 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="9509cac3-43a8-4592-bd58-a61a7e48001e" containerName="init" Dec 05 13:05:15.409568 master-0 kubenswrapper[29936]: I1205 13:05:15.407509 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 13:05:15.416888 master-0 kubenswrapper[29936]: I1205 13:05:15.412002 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 05 13:05:15.416888 master-0 kubenswrapper[29936]: I1205 13:05:15.412205 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 05 13:05:15.416888 master-0 kubenswrapper[29936]: I1205 13:05:15.412331 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 05 13:05:15.432477 master-0 kubenswrapper[29936]: I1205 13:05:15.428130 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 13:05:15.517881 master-0 kubenswrapper[29936]: I1205 13:05:15.509471 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.517881 master-0 kubenswrapper[29936]: I1205 13:05:15.509612 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5v67\" (UniqueName: \"kubernetes.io/projected/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-kube-api-access-l5v67\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.517881 master-0 kubenswrapper[29936]: I1205 13:05:15.509722 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-scripts\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.517881 master-0 kubenswrapper[29936]: I1205 13:05:15.509768 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-config\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.517881 master-0 kubenswrapper[29936]: I1205 13:05:15.509805 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.517881 master-0 kubenswrapper[29936]: I1205 13:05:15.509835 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.517881 master-0 kubenswrapper[29936]: I1205 13:05:15.509860 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.615686 master-0 kubenswrapper[29936]: I1205 
13:05:15.613221 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.615686 master-0 kubenswrapper[29936]: I1205 13:05:15.613377 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5v67\" (UniqueName: \"kubernetes.io/projected/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-kube-api-access-l5v67\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.615686 master-0 kubenswrapper[29936]: I1205 13:05:15.613484 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-scripts\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.615686 master-0 kubenswrapper[29936]: I1205 13:05:15.613524 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-config\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.615686 master-0 kubenswrapper[29936]: I1205 13:05:15.613558 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.615686 master-0 kubenswrapper[29936]: I1205 13:05:15.613587 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.615686 master-0 kubenswrapper[29936]: I1205 13:05:15.613613 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.615686 master-0 kubenswrapper[29936]: I1205 13:05:15.615273 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.615686 master-0 kubenswrapper[29936]: I1205 13:05:15.615345 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-scripts\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.616740 master-0 kubenswrapper[29936]: I1205 13:05:15.616255 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-config\") pod \"ovn-northd-0\" (UID: 
\"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.619097 master-0 kubenswrapper[29936]: I1205 13:05:15.619044 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.619995 master-0 kubenswrapper[29936]: I1205 13:05:15.619948 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.621934 master-0 kubenswrapper[29936]: I1205 13:05:15.621889 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.634249 master-0 kubenswrapper[29936]: I1205 13:05:15.634171 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5v67\" (UniqueName: \"kubernetes.io/projected/ff8becf8-c9ff-42a5-b28b-4221dfa3489d-kube-api-access-l5v67\") pod \"ovn-northd-0\" (UID: \"ff8becf8-c9ff-42a5-b28b-4221dfa3489d\") " pod="openstack/ovn-northd-0" Dec 05 13:05:15.766258 master-0 kubenswrapper[29936]: I1205 13:05:15.760172 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 13:05:16.297974 master-0 kubenswrapper[29936]: I1205 13:05:16.297890 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 13:05:16.787315 master-0 kubenswrapper[29936]: I1205 13:05:16.787240 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 05 13:05:16.789790 master-0 kubenswrapper[29936]: I1205 13:05:16.789722 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 05 13:05:17.862723 master-0 kubenswrapper[29936]: W1205 13:05:17.862622 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff8becf8_c9ff_42a5_b28b_4221dfa3489d.slice/crio-cf4edff9770facdb89552bc07368058fc7108f1d352d4c56b561d0f9dbfa120e WatchSource:0}: Error finding container cf4edff9770facdb89552bc07368058fc7108f1d352d4c56b561d0f9dbfa120e: Status 404 returned error can't find the container with id cf4edff9770facdb89552bc07368058fc7108f1d352d4c56b561d0f9dbfa120e Dec 05 13:05:18.626487 master-0 kubenswrapper[29936]: I1205 13:05:18.626416 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mb6t5" event={"ID":"e2997639-45a9-4e46-9bf1-f011f91eeab2","Type":"ContainerStarted","Data":"dc988c5accbfb796aa670cb2a07656c2d2e2b503d10255271c1b6e22f68adeea"} Dec 05 13:05:18.630096 master-0 kubenswrapper[29936]: I1205 13:05:18.630045 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ff8becf8-c9ff-42a5-b28b-4221dfa3489d","Type":"ContainerStarted","Data":"cf4edff9770facdb89552bc07368058fc7108f1d352d4c56b561d0f9dbfa120e"} Dec 05 13:05:18.669861 master-0 kubenswrapper[29936]: I1205 
13:05:18.669734 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mb6t5" podStartSLOduration=3.762804621 podStartE2EDuration="7.669698999s" podCreationTimestamp="2025-12-05 13:05:11 +0000 UTC" firstStartedPulling="2025-12-05 13:05:14.011843017 +0000 UTC m=+911.143922698" lastFinishedPulling="2025-12-05 13:05:17.918737395 +0000 UTC m=+915.050817076" observedRunningTime="2025-12-05 13:05:18.649531352 +0000 UTC m=+915.781611043" watchObservedRunningTime="2025-12-05 13:05:18.669698999 +0000 UTC m=+915.801778680" Dec 05 13:05:18.902873 master-0 kubenswrapper[29936]: I1205 13:05:18.902643 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:18.903698 master-0 kubenswrapper[29936]: E1205 13:05:18.902959 29936 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 13:05:18.903698 master-0 kubenswrapper[29936]: E1205 13:05:18.902992 29936 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 13:05:18.903698 master-0 kubenswrapper[29936]: E1205 13:05:18.903072 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift podName:8668c9c4-68f4-4224-9395-0f2ac85b9f1d nodeName:}" failed. No retries permitted until 2025-12-05 13:05:26.90304855 +0000 UTC m=+924.035128241 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift") pod "swift-storage-0" (UID: "8668c9c4-68f4-4224-9395-0f2ac85b9f1d") : configmap "swift-ring-files" not found Dec 05 13:05:19.645859 master-0 kubenswrapper[29936]: I1205 13:05:19.645784 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ff8becf8-c9ff-42a5-b28b-4221dfa3489d","Type":"ContainerStarted","Data":"96710c473fa3fca84aebddbb1c9714eea8ba3bc765967bde96c40d3d1b89aaef"} Dec 05 13:05:19.645859 master-0 kubenswrapper[29936]: I1205 13:05:19.645862 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ff8becf8-c9ff-42a5-b28b-4221dfa3489d","Type":"ContainerStarted","Data":"3b3b35de3a5a7eee0556cea6b72fe92c650bebf43e175e9ed5e1765959ded5a0"} Dec 05 13:05:19.687283 master-0 kubenswrapper[29936]: I1205 13:05:19.686087 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.460410094 podStartE2EDuration="4.686060776s" podCreationTimestamp="2025-12-05 13:05:15 +0000 UTC" firstStartedPulling="2025-12-05 13:05:17.865970701 +0000 UTC m=+914.998050382" lastFinishedPulling="2025-12-05 13:05:19.091621383 +0000 UTC m=+916.223701064" observedRunningTime="2025-12-05 13:05:19.674611395 +0000 UTC m=+916.806691076" watchObservedRunningTime="2025-12-05 13:05:19.686060776 +0000 UTC m=+916.818140457" Dec 05 13:05:19.851041 master-0 kubenswrapper[29936]: E1205 13:05:19.850812 29936 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:59986->192.168.32.10:35211: write tcp 192.168.32.10:59986->192.168.32.10:35211: write: broken pipe Dec 05 13:05:20.004567 master-0 
kubenswrapper[29936]: I1205 13:05:20.004481 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:05:20.658112 master-0 kubenswrapper[29936]: I1205 13:05:20.658037 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 05 13:05:20.903509 master-0 kubenswrapper[29936]: I1205 13:05:20.903268 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 05 13:05:20.986803 master-0 kubenswrapper[29936]: I1205 13:05:20.986737 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 05 13:05:22.479259 master-0 kubenswrapper[29936]: I1205 13:05:22.479131 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 05 13:05:22.479259 master-0 kubenswrapper[29936]: I1205 13:05:22.479232 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 05 13:05:22.562147 master-0 kubenswrapper[29936]: I1205 13:05:22.562077 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 05 13:05:22.775998 master-0 kubenswrapper[29936]: I1205 13:05:22.775832 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 05 13:05:22.851556 master-0 kubenswrapper[29936]: I1205 13:05:22.851472 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b4bcf6ff-kbdm6"] Dec 05 13:05:22.851888 master-0 kubenswrapper[29936]: I1205 13:05:22.851847 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" podUID="38517b6c-32f7-4853-adb2-e40488c6bb56" containerName="dnsmasq-dns" containerID="cri-o://f5fec10059bed8c9bdc5f2978a0abe16f49285cde3d46eeb107322bdb5684193" gracePeriod=10 Dec 05 13:05:23.737381 master-0 kubenswrapper[29936]: I1205 13:05:23.737286 29936 generic.go:334] "Generic (PLEG): container finished" podID="38517b6c-32f7-4853-adb2-e40488c6bb56" containerID="f5fec10059bed8c9bdc5f2978a0abe16f49285cde3d46eeb107322bdb5684193" exitCode=0 Dec 05 13:05:23.738070 master-0 kubenswrapper[29936]: I1205 13:05:23.737408 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" event={"ID":"38517b6c-32f7-4853-adb2-e40488c6bb56","Type":"ContainerDied","Data":"f5fec10059bed8c9bdc5f2978a0abe16f49285cde3d46eeb107322bdb5684193"} Dec 05 13:05:24.135570 master-0 kubenswrapper[29936]: I1205 13:05:24.135512 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:05:24.250466 master-0 kubenswrapper[29936]: I1205 13:05:24.250407 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-dns-svc\") pod \"38517b6c-32f7-4853-adb2-e40488c6bb56\" (UID: \"38517b6c-32f7-4853-adb2-e40488c6bb56\") " Dec 05 13:05:24.250742 master-0 kubenswrapper[29936]: I1205 13:05:24.250476 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-ovsdbserver-nb\") pod \"38517b6c-32f7-4853-adb2-e40488c6bb56\" (UID: \"38517b6c-32f7-4853-adb2-e40488c6bb56\") " Dec 05 13:05:24.250742 master-0 kubenswrapper[29936]: I1205 13:05:24.250589 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-config\") pod \"38517b6c-32f7-4853-adb2-e40488c6bb56\" (UID: \"38517b6c-32f7-4853-adb2-e40488c6bb56\") " Dec 05 13:05:24.250742 master-0 kubenswrapper[29936]: I1205 13:05:24.250630 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcshv\" (UniqueName: \"kubernetes.io/projected/38517b6c-32f7-4853-adb2-e40488c6bb56-kube-api-access-fcshv\") pod \"38517b6c-32f7-4853-adb2-e40488c6bb56\" (UID: \"38517b6c-32f7-4853-adb2-e40488c6bb56\") " Dec 05 13:05:24.254237 master-0 kubenswrapper[29936]: I1205 13:05:24.254163 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38517b6c-32f7-4853-adb2-e40488c6bb56-kube-api-access-fcshv" (OuterVolumeSpecName: "kube-api-access-fcshv") pod "38517b6c-32f7-4853-adb2-e40488c6bb56" (UID: "38517b6c-32f7-4853-adb2-e40488c6bb56"). InnerVolumeSpecName "kube-api-access-fcshv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:05:24.295153 master-0 kubenswrapper[29936]: I1205 13:05:24.295070 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38517b6c-32f7-4853-adb2-e40488c6bb56" (UID: "38517b6c-32f7-4853-adb2-e40488c6bb56"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:24.306207 master-0 kubenswrapper[29936]: I1205 13:05:24.306120 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-config" (OuterVolumeSpecName: "config") pod "38517b6c-32f7-4853-adb2-e40488c6bb56" (UID: "38517b6c-32f7-4853-adb2-e40488c6bb56"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:24.309039 master-0 kubenswrapper[29936]: I1205 13:05:24.308938 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "38517b6c-32f7-4853-adb2-e40488c6bb56" (UID: "38517b6c-32f7-4853-adb2-e40488c6bb56"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:24.354634 master-0 kubenswrapper[29936]: I1205 13:05:24.354532 29936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:24.354634 master-0 kubenswrapper[29936]: I1205 13:05:24.354583 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:24.354634 master-0 kubenswrapper[29936]: I1205 13:05:24.354596 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38517b6c-32f7-4853-adb2-e40488c6bb56-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:24.354634 master-0 kubenswrapper[29936]: I1205 13:05:24.354614 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcshv\" (UniqueName: \"kubernetes.io/projected/38517b6c-32f7-4853-adb2-e40488c6bb56-kube-api-access-fcshv\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:24.749945 master-0 kubenswrapper[29936]: I1205 13:05:24.749854 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" event={"ID":"38517b6c-32f7-4853-adb2-e40488c6bb56","Type":"ContainerDied","Data":"6047516470de64baa1c046efb19f13e8a15c78dbff8d038439b270e80aec5514"} Dec 05 13:05:24.750694 master-0 kubenswrapper[29936]: I1205 13:05:24.749963 29936 scope.go:117] "RemoveContainer" containerID="f5fec10059bed8c9bdc5f2978a0abe16f49285cde3d46eeb107322bdb5684193" Dec 05 13:05:24.750694 master-0 kubenswrapper[29936]: I1205 13:05:24.749971 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b4bcf6ff-kbdm6" Dec 05 13:05:24.783646 master-0 kubenswrapper[29936]: I1205 13:05:24.783594 29936 scope.go:117] "RemoveContainer" containerID="7150a5ed22d252b19d646ed30056deb182aadec565f86fcb83b20ced5792a7a9" Dec 05 13:05:26.032060 master-0 kubenswrapper[29936]: I1205 13:05:26.031960 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b4bcf6ff-kbdm6"] Dec 05 13:05:26.047774 master-0 kubenswrapper[29936]: I1205 13:05:26.047691 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b4bcf6ff-kbdm6"] Dec 05 13:05:26.079017 master-0 kubenswrapper[29936]: I1205 13:05:26.078914 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-80a7-account-create-update-j252l"] Dec 05 13:05:26.085138 master-0 kubenswrapper[29936]: E1205 13:05:26.080391 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38517b6c-32f7-4853-adb2-e40488c6bb56" containerName="init" Dec 05 13:05:26.085138 master-0 kubenswrapper[29936]: I1205 13:05:26.080440 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="38517b6c-32f7-4853-adb2-e40488c6bb56" containerName="init" Dec 05 13:05:26.085138 master-0 kubenswrapper[29936]: E1205 13:05:26.080530 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38517b6c-32f7-4853-adb2-e40488c6bb56" containerName="dnsmasq-dns" Dec 05 13:05:26.085138 master-0 kubenswrapper[29936]: I1205 13:05:26.080543 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="38517b6c-32f7-4853-adb2-e40488c6bb56" containerName="dnsmasq-dns" Dec 05 13:05:26.085138 master-0 kubenswrapper[29936]: I1205 13:05:26.080833 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="38517b6c-32f7-4853-adb2-e40488c6bb56" containerName="dnsmasq-dns" Dec 05 13:05:26.085138 master-0 kubenswrapper[29936]: I1205 13:05:26.083835 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-80a7-account-create-update-j252l" Dec 05 13:05:26.092140 master-0 kubenswrapper[29936]: I1205 13:05:26.092068 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 05 13:05:26.134429 master-0 kubenswrapper[29936]: I1205 13:05:26.134344 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-80a7-account-create-update-j252l"] Dec 05 13:05:26.151446 master-0 kubenswrapper[29936]: I1205 13:05:26.151364 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-tlq59"] Dec 05 13:05:26.166503 master-0 kubenswrapper[29936]: I1205 13:05:26.166417 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-tlq59" Dec 05 13:05:26.211386 master-0 kubenswrapper[29936]: I1205 13:05:26.211302 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmz27\" (UniqueName: \"kubernetes.io/projected/64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff-kube-api-access-vmz27\") pod \"keystone-80a7-account-create-update-j252l\" (UID: \"64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff\") " pod="openstack/keystone-80a7-account-create-update-j252l" Dec 05 13:05:26.212430 master-0 kubenswrapper[29936]: I1205 13:05:26.211477 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff-operator-scripts\") pod \"keystone-80a7-account-create-update-j252l\" (UID: \"64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff\") " pod="openstack/keystone-80a7-account-create-update-j252l" Dec 05 13:05:26.231844 master-0 kubenswrapper[29936]: I1205 13:05:26.231758 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-53e0-account-create-update-2xtnl"] Dec 05 13:05:26.235847 master-0 kubenswrapper[29936]: I1205 13:05:26.234027 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-53e0-account-create-update-2xtnl" Dec 05 13:05:26.236061 master-0 kubenswrapper[29936]: I1205 13:05:26.235892 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 05 13:05:26.242262 master-0 kubenswrapper[29936]: I1205 13:05:26.242205 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-tlq59"] Dec 05 13:05:26.252612 master-0 kubenswrapper[29936]: I1205 13:05:26.252554 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-53e0-account-create-update-2xtnl"] Dec 05 13:05:26.279333 master-0 kubenswrapper[29936]: I1205 13:05:26.278613 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-cbcfg"] Dec 05 13:05:26.280913 master-0 kubenswrapper[29936]: I1205 13:05:26.280531 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-cbcfg" Dec 05 13:05:26.291054 master-0 kubenswrapper[29936]: I1205 13:05:26.290470 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-cbcfg"] Dec 05 13:05:26.314298 master-0 kubenswrapper[29936]: I1205 13:05:26.314205 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75l4t\" (UniqueName: \"kubernetes.io/projected/44320335-848c-4aa2-b78b-672d29137770-kube-api-access-75l4t\") pod \"placement-53e0-account-create-update-2xtnl\" (UID: \"44320335-848c-4aa2-b78b-672d29137770\") " pod="openstack/placement-53e0-account-create-update-2xtnl" Dec 05 13:05:26.314654 master-0 kubenswrapper[29936]: I1205 13:05:26.314357 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmz27\" (UniqueName: \"kubernetes.io/projected/64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff-kube-api-access-vmz27\") pod \"keystone-80a7-account-create-update-j252l\" (UID: \"64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff\") " pod="openstack/keystone-80a7-account-create-update-j252l" Dec 05 13:05:26.314654 master-0 kubenswrapper[29936]: I1205 13:05:26.314505 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44320335-848c-4aa2-b78b-672d29137770-operator-scripts\") pod \"placement-53e0-account-create-update-2xtnl\" (UID: \"44320335-848c-4aa2-b78b-672d29137770\") " pod="openstack/placement-53e0-account-create-update-2xtnl" Dec 05 13:05:26.315121 master-0 kubenswrapper[29936]: I1205 13:05:26.315058 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff-operator-scripts\") pod \"keystone-80a7-account-create-update-j252l\" (UID: \"64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff\") " pod="openstack/keystone-80a7-account-create-update-j252l" Dec 05 13:05:26.315299 master-0 kubenswrapper[29936]: I1205 13:05:26.315165 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8c6d8ef-5d6f-475e-8533-e1879fc64f74-operator-scripts\") pod \"keystone-db-create-tlq59\" (UID: \"b8c6d8ef-5d6f-475e-8533-e1879fc64f74\") " pod="openstack/keystone-db-create-tlq59" Dec 05 13:05:26.315299 master-0 kubenswrapper[29936]: I1205 13:05:26.315246 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhptt\" (UniqueName: \"kubernetes.io/projected/b8c6d8ef-5d6f-475e-8533-e1879fc64f74-kube-api-access-dhptt\") pod \"keystone-db-create-tlq59\" (UID: \"b8c6d8ef-5d6f-475e-8533-e1879fc64f74\") " pod="openstack/keystone-db-create-tlq59" Dec 05 13:05:26.315937 master-0 kubenswrapper[29936]: I1205 13:05:26.315893 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff-operator-scripts\") pod \"keystone-80a7-account-create-update-j252l\" (UID: \"64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff\") " pod="openstack/keystone-80a7-account-create-update-j252l" Dec 05 13:05:26.332650 master-0 kubenswrapper[29936]: I1205 13:05:26.332521 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmz27\" (UniqueName: 
\"kubernetes.io/projected/64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff-kube-api-access-vmz27\") pod \"keystone-80a7-account-create-update-j252l\" (UID: \"64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff\") " pod="openstack/keystone-80a7-account-create-update-j252l" Dec 05 13:05:26.417919 master-0 kubenswrapper[29936]: I1205 13:05:26.417844 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75l4t\" (UniqueName: \"kubernetes.io/projected/44320335-848c-4aa2-b78b-672d29137770-kube-api-access-75l4t\") pod \"placement-53e0-account-create-update-2xtnl\" (UID: \"44320335-848c-4aa2-b78b-672d29137770\") " pod="openstack/placement-53e0-account-create-update-2xtnl" Dec 05 13:05:26.418322 master-0 kubenswrapper[29936]: I1205 13:05:26.417950 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35238950-d610-4820-bd1f-2aa4ded2c93b-operator-scripts\") pod \"placement-db-create-cbcfg\" (UID: \"35238950-d610-4820-bd1f-2aa4ded2c93b\") " pod="openstack/placement-db-create-cbcfg" Dec 05 13:05:26.418322 master-0 kubenswrapper[29936]: I1205 13:05:26.418004 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44320335-848c-4aa2-b78b-672d29137770-operator-scripts\") pod \"placement-53e0-account-create-update-2xtnl\" (UID: \"44320335-848c-4aa2-b78b-672d29137770\") " pod="openstack/placement-53e0-account-create-update-2xtnl" Dec 05 13:05:26.418322 master-0 kubenswrapper[29936]: I1205 13:05:26.418075 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8c6d8ef-5d6f-475e-8533-e1879fc64f74-operator-scripts\") pod \"keystone-db-create-tlq59\" (UID: \"b8c6d8ef-5d6f-475e-8533-e1879fc64f74\") " pod="openstack/keystone-db-create-tlq59" Dec 05 13:05:26.418322 master-0 kubenswrapper[29936]: I1205 13:05:26.418098 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhptt\" (UniqueName: \"kubernetes.io/projected/b8c6d8ef-5d6f-475e-8533-e1879fc64f74-kube-api-access-dhptt\") pod \"keystone-db-create-tlq59\" (UID: \"b8c6d8ef-5d6f-475e-8533-e1879fc64f74\") " pod="openstack/keystone-db-create-tlq59" Dec 05 13:05:26.418322 master-0 kubenswrapper[29936]: I1205 13:05:26.418160 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6k72\" (UniqueName: \"kubernetes.io/projected/35238950-d610-4820-bd1f-2aa4ded2c93b-kube-api-access-m6k72\") pod \"placement-db-create-cbcfg\" (UID: \"35238950-d610-4820-bd1f-2aa4ded2c93b\") " pod="openstack/placement-db-create-cbcfg" Dec 05 13:05:26.419557 master-0 kubenswrapper[29936]: I1205 13:05:26.419520 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44320335-848c-4aa2-b78b-672d29137770-operator-scripts\") pod \"placement-53e0-account-create-update-2xtnl\" (UID: \"44320335-848c-4aa2-b78b-672d29137770\") " pod="openstack/placement-53e0-account-create-update-2xtnl" Dec 05 13:05:26.420207 master-0 kubenswrapper[29936]: I1205 13:05:26.420156 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8c6d8ef-5d6f-475e-8533-e1879fc64f74-operator-scripts\") pod \"keystone-db-create-tlq59\" (UID: 
\"b8c6d8ef-5d6f-475e-8533-e1879fc64f74\") " pod="openstack/keystone-db-create-tlq59" Dec 05 13:05:26.433224 master-0 kubenswrapper[29936]: I1205 13:05:26.431546 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-80a7-account-create-update-j252l" Dec 05 13:05:26.435409 master-0 kubenswrapper[29936]: I1205 13:05:26.435357 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75l4t\" (UniqueName: \"kubernetes.io/projected/44320335-848c-4aa2-b78b-672d29137770-kube-api-access-75l4t\") pod \"placement-53e0-account-create-update-2xtnl\" (UID: \"44320335-848c-4aa2-b78b-672d29137770\") " pod="openstack/placement-53e0-account-create-update-2xtnl" Dec 05 13:05:26.439345 master-0 kubenswrapper[29936]: I1205 13:05:26.437758 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhptt\" (UniqueName: \"kubernetes.io/projected/b8c6d8ef-5d6f-475e-8533-e1879fc64f74-kube-api-access-dhptt\") pod \"keystone-db-create-tlq59\" (UID: \"b8c6d8ef-5d6f-475e-8533-e1879fc64f74\") " pod="openstack/keystone-db-create-tlq59" Dec 05 13:05:26.494218 master-0 kubenswrapper[29936]: I1205 13:05:26.493648 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-tlq59" Dec 05 13:05:26.520099 master-0 kubenswrapper[29936]: I1205 13:05:26.520036 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6k72\" (UniqueName: \"kubernetes.io/projected/35238950-d610-4820-bd1f-2aa4ded2c93b-kube-api-access-m6k72\") pod \"placement-db-create-cbcfg\" (UID: \"35238950-d610-4820-bd1f-2aa4ded2c93b\") " pod="openstack/placement-db-create-cbcfg" Dec 05 13:05:26.520468 master-0 kubenswrapper[29936]: I1205 13:05:26.520145 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35238950-d610-4820-bd1f-2aa4ded2c93b-operator-scripts\") pod \"placement-db-create-cbcfg\" (UID: \"35238950-d610-4820-bd1f-2aa4ded2c93b\") " pod="openstack/placement-db-create-cbcfg" Dec 05 13:05:26.521746 master-0 kubenswrapper[29936]: I1205 13:05:26.520859 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35238950-d610-4820-bd1f-2aa4ded2c93b-operator-scripts\") pod \"placement-db-create-cbcfg\" (UID: \"35238950-d610-4820-bd1f-2aa4ded2c93b\") " pod="openstack/placement-db-create-cbcfg" Dec 05 13:05:26.549564 master-0 kubenswrapper[29936]: I1205 13:05:26.549351 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6k72\" (UniqueName: \"kubernetes.io/projected/35238950-d610-4820-bd1f-2aa4ded2c93b-kube-api-access-m6k72\") pod \"placement-db-create-cbcfg\" (UID: \"35238950-d610-4820-bd1f-2aa4ded2c93b\") " pod="openstack/placement-db-create-cbcfg" Dec 05 13:05:26.564077 master-0 kubenswrapper[29936]: I1205 13:05:26.563986 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-53e0-account-create-update-2xtnl" Dec 05 13:05:26.606480 master-0 kubenswrapper[29936]: I1205 13:05:26.606398 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-cbcfg" Dec 05 13:05:26.938346 master-0 kubenswrapper[29936]: I1205 13:05:26.938264 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:26.939409 master-0 kubenswrapper[29936]: E1205 13:05:26.938900 29936 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 13:05:26.939409 master-0 kubenswrapper[29936]: E1205 13:05:26.938946 29936 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 13:05:26.939409 master-0 kubenswrapper[29936]: E1205 13:05:26.939013 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift podName:8668c9c4-68f4-4224-9395-0f2ac85b9f1d nodeName:}" failed. No retries permitted until 2025-12-05 13:05:42.938990949 +0000 UTC m=+940.071070630 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift") pod "swift-storage-0" (UID: "8668c9c4-68f4-4224-9395-0f2ac85b9f1d") : configmap "swift-ring-files" not found Dec 05 13:05:27.203110 master-0 kubenswrapper[29936]: I1205 13:05:27.203014 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38517b6c-32f7-4853-adb2-e40488c6bb56" path="/var/lib/kubelet/pods/38517b6c-32f7-4853-adb2-e40488c6bb56/volumes" Dec 05 13:05:28.485785 master-0 kubenswrapper[29936]: I1205 13:05:28.485703 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-tlq59"] Dec 05 13:05:28.502327 master-0 kubenswrapper[29936]: W1205 13:05:28.498475 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8c6d8ef_5d6f_475e_8533_e1879fc64f74.slice/crio-e614dd32ed0724c96630f806cd164549c90159b110ed196f50a25531a27f8cd4 WatchSource:0}: Error finding container e614dd32ed0724c96630f806cd164549c90159b110ed196f50a25531a27f8cd4: Status 404 returned error can't find the container with id e614dd32ed0724c96630f806cd164549c90159b110ed196f50a25531a27f8cd4 Dec 05 13:05:28.504661 master-0 kubenswrapper[29936]: W1205 13:05:28.504494 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35238950_d610_4820_bd1f_2aa4ded2c93b.slice/crio-05287615d39169fe8c1956b16a2b699b109656fecfb8c4cb5973d754604fc789 WatchSource:0}: Error finding container 05287615d39169fe8c1956b16a2b699b109656fecfb8c4cb5973d754604fc789: Status 404 returned error can't find the container with id 05287615d39169fe8c1956b16a2b699b109656fecfb8c4cb5973d754604fc789 Dec 05 13:05:28.507095 master-0 kubenswrapper[29936]: I1205 13:05:28.507034 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-53e0-account-create-update-2xtnl"] Dec 05 13:05:28.521446 master-0 kubenswrapper[29936]: W1205 13:05:28.521382 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44320335_848c_4aa2_b78b_672d29137770.slice/crio-8f89f14d3d8dd0e9cafdd637600652c3ea31b4258e19f71fd8252ab7f742b11b 
WatchSource:0}: Error finding container 8f89f14d3d8dd0e9cafdd637600652c3ea31b4258e19f71fd8252ab7f742b11b: Status 404 returned error can't find the container with id 8f89f14d3d8dd0e9cafdd637600652c3ea31b4258e19f71fd8252ab7f742b11b Dec 05 13:05:28.539385 master-0 kubenswrapper[29936]: I1205 13:05:28.539334 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-cbcfg"] Dec 05 13:05:28.549676 master-0 kubenswrapper[29936]: I1205 13:05:28.549596 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-80a7-account-create-update-j252l"] Dec 05 13:05:28.835792 master-0 kubenswrapper[29936]: I1205 13:05:28.834871 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tlq59" event={"ID":"b8c6d8ef-5d6f-475e-8533-e1879fc64f74","Type":"ContainerStarted","Data":"e614dd32ed0724c96630f806cd164549c90159b110ed196f50a25531a27f8cd4"} Dec 05 13:05:28.837364 master-0 kubenswrapper[29936]: I1205 13:05:28.836711 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-80a7-account-create-update-j252l" event={"ID":"64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff","Type":"ContainerStarted","Data":"98eeb8b01f45036ff94bebe1cd86bad2c29ba09c45f59416773cd8837cd5f134"} Dec 05 13:05:28.840412 master-0 kubenswrapper[29936]: I1205 13:05:28.838532 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cbcfg" event={"ID":"35238950-d610-4820-bd1f-2aa4ded2c93b","Type":"ContainerStarted","Data":"05287615d39169fe8c1956b16a2b699b109656fecfb8c4cb5973d754604fc789"} Dec 05 13:05:28.840412 master-0 kubenswrapper[29936]: I1205 13:05:28.839948 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-53e0-account-create-update-2xtnl" event={"ID":"44320335-848c-4aa2-b78b-672d29137770","Type":"ContainerStarted","Data":"8f89f14d3d8dd0e9cafdd637600652c3ea31b4258e19f71fd8252ab7f742b11b"} Dec 05 13:05:29.870866 master-0 kubenswrapper[29936]: I1205 13:05:29.870759 29936 generic.go:334] "Generic (PLEG): container finished" podID="ac9589fd-c5fd-41de-8018-16c6e272d05e" containerID="de7220b30bf57ca1dde3f7faa8205588136c8638d8e6486d140db7ee4b6fdd36" exitCode=0 Dec 05 13:05:29.870866 master-0 kubenswrapper[29936]: I1205 13:05:29.870839 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ac9589fd-c5fd-41de-8018-16c6e272d05e","Type":"ContainerDied","Data":"de7220b30bf57ca1dde3f7faa8205588136c8638d8e6486d140db7ee4b6fdd36"} Dec 05 13:05:30.832240 master-0 kubenswrapper[29936]: I1205 13:05:30.830237 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 05 13:05:30.884963 master-0 kubenswrapper[29936]: I1205 13:05:30.884875 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ac9589fd-c5fd-41de-8018-16c6e272d05e","Type":"ContainerStarted","Data":"acf1002b4816e52199f818c3e60136c420d1ca78a9a1ac201886ec3bcc1e825d"} Dec 05 13:05:30.886457 master-0 kubenswrapper[29936]: I1205 13:05:30.886419 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tlq59" event={"ID":"b8c6d8ef-5d6f-475e-8533-e1879fc64f74","Type":"ContainerStarted","Data":"aad82ce45003a366b65f76f7465e5da00d59d6cb694847622cd837271d0bd783"} Dec 05 13:05:30.888515 master-0 kubenswrapper[29936]: I1205 13:05:30.888455 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-80a7-account-create-update-j252l" 
event={"ID":"64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff","Type":"ContainerStarted","Data":"5526388e8dcb2c46558dfa38fb13dcfba8dace91895cb24abc65d005040f0351"} Dec 05 13:05:30.890974 master-0 kubenswrapper[29936]: I1205 13:05:30.890915 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cbcfg" event={"ID":"35238950-d610-4820-bd1f-2aa4ded2c93b","Type":"ContainerStarted","Data":"0301efa77391c1e2edf5f153ae70ca5bb69dea80043c28ef95573231dea6dbbf"} Dec 05 13:05:30.893512 master-0 kubenswrapper[29936]: I1205 13:05:30.893477 29936 generic.go:334] "Generic (PLEG): container finished" podID="a34de370-e400-45de-adbb-09995d9c1953" containerID="16fecadc541306e1fe67fb00b33b73ad354dfeaef29fb1b7118c9cc281d7c540" exitCode=0 Dec 05 13:05:30.893593 master-0 kubenswrapper[29936]: I1205 13:05:30.893552 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a34de370-e400-45de-adbb-09995d9c1953","Type":"ContainerDied","Data":"16fecadc541306e1fe67fb00b33b73ad354dfeaef29fb1b7118c9cc281d7c540"} Dec 05 13:05:30.896230 master-0 kubenswrapper[29936]: I1205 13:05:30.896145 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-53e0-account-create-update-2xtnl" event={"ID":"44320335-848c-4aa2-b78b-672d29137770","Type":"ContainerStarted","Data":"96c4c247cf4c258d3fbcffc77932796c618ab7cc8392fe3caae8f89e52bb86cb"} Dec 05 13:05:31.911640 master-0 kubenswrapper[29936]: I1205 13:05:31.911535 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a34de370-e400-45de-adbb-09995d9c1953","Type":"ContainerStarted","Data":"f777716158bf6291e9dc688be554ac06e366fddc442827807798cc27f7a03b13"} Dec 05 13:05:32.786272 master-0 kubenswrapper[29936]: I1205 13:05:32.784976 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-2k6sd"] Dec 05 13:05:32.788025 master-0 kubenswrapper[29936]: I1205 13:05:32.787042 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2k6sd" Dec 05 13:05:32.788025 master-0 kubenswrapper[29936]: I1205 13:05:32.787600 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-80a7-account-create-update-j252l" podStartSLOduration=6.787587995 podStartE2EDuration="6.787587995s" podCreationTimestamp="2025-12-05 13:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:05:32.76899082 +0000 UTC m=+929.901070501" watchObservedRunningTime="2025-12-05 13:05:32.787587995 +0000 UTC m=+929.919667676" Dec 05 13:05:32.816046 master-0 kubenswrapper[29936]: I1205 13:05:32.814914 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2k6sd"] Dec 05 13:05:32.820970 master-0 kubenswrapper[29936]: I1205 13:05:32.820885 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=70.225363926 podStartE2EDuration="1m12.820850719s" podCreationTimestamp="2025-12-05 13:04:20 +0000 UTC" firstStartedPulling="2025-12-05 13:04:52.605715178 +0000 UTC m=+889.737794859" lastFinishedPulling="2025-12-05 13:04:55.201201971 +0000 UTC m=+892.333281652" observedRunningTime="2025-12-05 13:05:32.81353892 +0000 UTC m=+929.945618601" watchObservedRunningTime="2025-12-05 13:05:32.820850719 +0000 UTC m=+929.952930400" Dec 05 13:05:32.849319 master-0 kubenswrapper[29936]: I1205 13:05:32.849235 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-cbcfg" podStartSLOduration=6.849212609 podStartE2EDuration="6.849212609s" podCreationTimestamp="2025-12-05 13:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:05:32.841353305 +0000 UTC m=+929.973432986" watchObservedRunningTime="2025-12-05 13:05:32.849212609 +0000 UTC m=+929.981292290" Dec 05 13:05:32.906688 master-0 kubenswrapper[29936]: I1205 13:05:32.906332 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60f33781-dd68-4b4b-8ca7-7b271a1aa195-operator-scripts\") pod \"glance-db-create-2k6sd\" (UID: \"60f33781-dd68-4b4b-8ca7-7b271a1aa195\") " pod="openstack/glance-db-create-2k6sd" Dec 05 13:05:32.906688 master-0 kubenswrapper[29936]: I1205 13:05:32.906409 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbsbm\" (UniqueName: \"kubernetes.io/projected/60f33781-dd68-4b4b-8ca7-7b271a1aa195-kube-api-access-cbsbm\") pod \"glance-db-create-2k6sd\" (UID: \"60f33781-dd68-4b4b-8ca7-7b271a1aa195\") " pod="openstack/glance-db-create-2k6sd" Dec 05 13:05:32.909198 master-0 kubenswrapper[29936]: I1205 13:05:32.909149 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-4269-account-create-update-rmqxq"] Dec 05 13:05:32.913263 master-0 kubenswrapper[29936]: I1205 13:05:32.913224 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4269-account-create-update-rmqxq" Dec 05 13:05:32.923532 master-0 kubenswrapper[29936]: I1205 13:05:32.922760 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 05 13:05:32.929101 master-0 kubenswrapper[29936]: I1205 13:05:32.929030 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4269-account-create-update-rmqxq"] Dec 05 13:05:32.933503 master-0 kubenswrapper[29936]: I1205 13:05:32.933205 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-tlq59" podStartSLOduration=6.93316758 podStartE2EDuration="6.93316758s" podCreationTimestamp="2025-12-05 13:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:05:32.922806519 +0000 UTC m=+930.054886220" watchObservedRunningTime="2025-12-05 13:05:32.93316758 +0000 UTC m=+930.065247261" Dec 05 13:05:32.939478 master-0 kubenswrapper[29936]: I1205 13:05:32.937988 29936 generic.go:334] "Generic (PLEG): container finished" podID="e2997639-45a9-4e46-9bf1-f011f91eeab2" containerID="dc988c5accbfb796aa670cb2a07656c2d2e2b503d10255271c1b6e22f68adeea" exitCode=0 Dec 05 13:05:32.939478 master-0 kubenswrapper[29936]: I1205 13:05:32.938670 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mb6t5" event={"ID":"e2997639-45a9-4e46-9bf1-f011f91eeab2","Type":"ContainerDied","Data":"dc988c5accbfb796aa670cb2a07656c2d2e2b503d10255271c1b6e22f68adeea"} Dec 05 13:05:32.939478 master-0 kubenswrapper[29936]: I1205 13:05:32.939364 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:05:32.997602 master-0 kubenswrapper[29936]: I1205 13:05:32.997512 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-53e0-account-create-update-2xtnl" podStartSLOduration=6.997486778 podStartE2EDuration="6.997486778s" podCreationTimestamp="2025-12-05 13:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:05:32.98871889 +0000 UTC m=+930.120798581" watchObservedRunningTime="2025-12-05 13:05:32.997486778 +0000 UTC m=+930.129566459" Dec 05 13:05:33.009371 master-0 kubenswrapper[29936]: I1205 13:05:33.009285 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87999eba-73aa-43cf-be9e-1e07b1dc22e0-operator-scripts\") pod \"glance-4269-account-create-update-rmqxq\" (UID: \"87999eba-73aa-43cf-be9e-1e07b1dc22e0\") " pod="openstack/glance-4269-account-create-update-rmqxq" Dec 05 13:05:33.009634 master-0 kubenswrapper[29936]: I1205 13:05:33.009508 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60f33781-dd68-4b4b-8ca7-7b271a1aa195-operator-scripts\") pod \"glance-db-create-2k6sd\" (UID: \"60f33781-dd68-4b4b-8ca7-7b271a1aa195\") " pod="openstack/glance-db-create-2k6sd" Dec 05 13:05:33.009634 master-0 kubenswrapper[29936]: I1205 13:05:33.009546 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbsbm\" (UniqueName: \"kubernetes.io/projected/60f33781-dd68-4b4b-8ca7-7b271a1aa195-kube-api-access-cbsbm\") pod \"glance-db-create-2k6sd\" 
(UID: \"60f33781-dd68-4b4b-8ca7-7b271a1aa195\") " pod="openstack/glance-db-create-2k6sd" Dec 05 13:05:33.011344 master-0 kubenswrapper[29936]: I1205 13:05:33.010253 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dwjv\" (UniqueName: \"kubernetes.io/projected/87999eba-73aa-43cf-be9e-1e07b1dc22e0-kube-api-access-2dwjv\") pod \"glance-4269-account-create-update-rmqxq\" (UID: \"87999eba-73aa-43cf-be9e-1e07b1dc22e0\") " pod="openstack/glance-4269-account-create-update-rmqxq" Dec 05 13:05:33.011344 master-0 kubenswrapper[29936]: I1205 13:05:33.010962 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60f33781-dd68-4b4b-8ca7-7b271a1aa195-operator-scripts\") pod \"glance-db-create-2k6sd\" (UID: \"60f33781-dd68-4b4b-8ca7-7b271a1aa195\") " pod="openstack/glance-db-create-2k6sd" Dec 05 13:05:33.035830 master-0 kubenswrapper[29936]: I1205 13:05:33.035773 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbsbm\" (UniqueName: \"kubernetes.io/projected/60f33781-dd68-4b4b-8ca7-7b271a1aa195-kube-api-access-cbsbm\") pod \"glance-db-create-2k6sd\" (UID: \"60f33781-dd68-4b4b-8ca7-7b271a1aa195\") " pod="openstack/glance-db-create-2k6sd" Dec 05 13:05:33.036858 master-0 kubenswrapper[29936]: I1205 13:05:33.036725 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=69.439230802 podStartE2EDuration="1m14.036692533s" podCreationTimestamp="2025-12-05 13:04:19 +0000 UTC" firstStartedPulling="2025-12-05 13:04:50.577851227 +0000 UTC m=+887.709930908" lastFinishedPulling="2025-12-05 13:04:55.175312958 +0000 UTC m=+892.307392639" observedRunningTime="2025-12-05 13:05:33.028533621 +0000 UTC m=+930.160613302" watchObservedRunningTime="2025-12-05 13:05:33.036692533 +0000 UTC m=+930.168772224" Dec 05 13:05:33.118238 master-0 kubenswrapper[29936]: I1205 13:05:33.114993 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dwjv\" (UniqueName: \"kubernetes.io/projected/87999eba-73aa-43cf-be9e-1e07b1dc22e0-kube-api-access-2dwjv\") pod \"glance-4269-account-create-update-rmqxq\" (UID: \"87999eba-73aa-43cf-be9e-1e07b1dc22e0\") " pod="openstack/glance-4269-account-create-update-rmqxq" Dec 05 13:05:33.118238 master-0 kubenswrapper[29936]: I1205 13:05:33.115115 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87999eba-73aa-43cf-be9e-1e07b1dc22e0-operator-scripts\") pod \"glance-4269-account-create-update-rmqxq\" (UID: \"87999eba-73aa-43cf-be9e-1e07b1dc22e0\") " pod="openstack/glance-4269-account-create-update-rmqxq" Dec 05 13:05:33.118238 master-0 kubenswrapper[29936]: I1205 13:05:33.116368 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87999eba-73aa-43cf-be9e-1e07b1dc22e0-operator-scripts\") pod \"glance-4269-account-create-update-rmqxq\" (UID: \"87999eba-73aa-43cf-be9e-1e07b1dc22e0\") " pod="openstack/glance-4269-account-create-update-rmqxq" Dec 05 13:05:33.121599 master-0 kubenswrapper[29936]: I1205 13:05:33.120541 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2k6sd" Dec 05 13:05:33.143860 master-0 kubenswrapper[29936]: I1205 13:05:33.143670 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dwjv\" (UniqueName: \"kubernetes.io/projected/87999eba-73aa-43cf-be9e-1e07b1dc22e0-kube-api-access-2dwjv\") pod \"glance-4269-account-create-update-rmqxq\" (UID: \"87999eba-73aa-43cf-be9e-1e07b1dc22e0\") " pod="openstack/glance-4269-account-create-update-rmqxq" Dec 05 13:05:33.261079 master-0 kubenswrapper[29936]: I1205 13:05:33.259893 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4269-account-create-update-rmqxq" Dec 05 13:05:33.698261 master-0 kubenswrapper[29936]: I1205 13:05:33.698092 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2k6sd"] Dec 05 13:05:33.886469 master-0 kubenswrapper[29936]: I1205 13:05:33.886373 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4269-account-create-update-rmqxq"] Dec 05 13:05:33.891878 master-0 kubenswrapper[29936]: W1205 13:05:33.891776 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87999eba_73aa_43cf_be9e_1e07b1dc22e0.slice/crio-34cf66cab6903bea3ee00d91fd173ee2e94f78bb1e9eca171392e26eeabff6df WatchSource:0}: Error finding container 34cf66cab6903bea3ee00d91fd173ee2e94f78bb1e9eca171392e26eeabff6df: Status 404 returned error can't find the container with id 34cf66cab6903bea3ee00d91fd173ee2e94f78bb1e9eca171392e26eeabff6df Dec 05 13:05:33.957128 master-0 kubenswrapper[29936]: I1205 13:05:33.957021 29936 generic.go:334] "Generic (PLEG): container finished" podID="64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff" containerID="5526388e8dcb2c46558dfa38fb13dcfba8dace91895cb24abc65d005040f0351" exitCode=0 Dec 05 13:05:33.958085 master-0 kubenswrapper[29936]: I1205 13:05:33.957840 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-80a7-account-create-update-j252l" event={"ID":"64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff","Type":"ContainerDied","Data":"5526388e8dcb2c46558dfa38fb13dcfba8dace91895cb24abc65d005040f0351"} Dec 05 13:05:33.961368 master-0 kubenswrapper[29936]: I1205 13:05:33.961309 29936 generic.go:334] "Generic (PLEG): container finished" podID="35238950-d610-4820-bd1f-2aa4ded2c93b" containerID="0301efa77391c1e2edf5f153ae70ca5bb69dea80043c28ef95573231dea6dbbf" exitCode=0 Dec 05 13:05:33.961561 master-0 kubenswrapper[29936]: I1205 13:05:33.961525 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cbcfg" event={"ID":"35238950-d610-4820-bd1f-2aa4ded2c93b","Type":"ContainerDied","Data":"0301efa77391c1e2edf5f153ae70ca5bb69dea80043c28ef95573231dea6dbbf"} Dec 05 13:05:33.963751 master-0 kubenswrapper[29936]: I1205 13:05:33.963682 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2k6sd" event={"ID":"60f33781-dd68-4b4b-8ca7-7b271a1aa195","Type":"ContainerStarted","Data":"5597c5ceb379707a5b08ea4ec7140b5c3a59c44f4f347cc766263ae420cace91"} Dec 05 13:05:33.968131 master-0 kubenswrapper[29936]: I1205 13:05:33.968069 29936 generic.go:334] "Generic (PLEG): container finished" podID="44320335-848c-4aa2-b78b-672d29137770" containerID="96c4c247cf4c258d3fbcffc77932796c618ab7cc8392fe3caae8f89e52bb86cb" exitCode=0 Dec 05 13:05:33.968324 master-0 kubenswrapper[29936]: I1205 13:05:33.968271 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-53e0-account-create-update-2xtnl" event={"ID":"44320335-848c-4aa2-b78b-672d29137770","Type":"ContainerDied","Data":"96c4c247cf4c258d3fbcffc77932796c618ab7cc8392fe3caae8f89e52bb86cb"} Dec 05 13:05:33.970763 master-0 kubenswrapper[29936]: I1205 13:05:33.970664 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4269-account-create-update-rmqxq" event={"ID":"87999eba-73aa-43cf-be9e-1e07b1dc22e0","Type":"ContainerStarted","Data":"34cf66cab6903bea3ee00d91fd173ee2e94f78bb1e9eca171392e26eeabff6df"} Dec 05 13:05:33.976213 master-0 kubenswrapper[29936]: I1205 13:05:33.976079 29936 generic.go:334] "Generic (PLEG): container finished" podID="b8c6d8ef-5d6f-475e-8533-e1879fc64f74" containerID="aad82ce45003a366b65f76f7465e5da00d59d6cb694847622cd837271d0bd783" exitCode=0 Dec 05 13:05:33.976500 master-0 kubenswrapper[29936]: I1205 13:05:33.976454 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tlq59" event={"ID":"b8c6d8ef-5d6f-475e-8533-e1879fc64f74","Type":"ContainerDied","Data":"aad82ce45003a366b65f76f7465e5da00d59d6cb694847622cd837271d0bd783"} Dec 05 13:05:34.556674 master-0 kubenswrapper[29936]: I1205 13:05:34.556607 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:34.672803 master-0 kubenswrapper[29936]: I1205 13:05:34.671814 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89vbs\" (UniqueName: \"kubernetes.io/projected/e2997639-45a9-4e46-9bf1-f011f91eeab2-kube-api-access-89vbs\") pod \"e2997639-45a9-4e46-9bf1-f011f91eeab2\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " Dec 05 13:05:34.672803 master-0 kubenswrapper[29936]: I1205 13:05:34.672014 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e2997639-45a9-4e46-9bf1-f011f91eeab2-ring-data-devices\") pod \"e2997639-45a9-4e46-9bf1-f011f91eeab2\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " Dec 05 13:05:34.672803 master-0 kubenswrapper[29936]: I1205 13:05:34.672081 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e2997639-45a9-4e46-9bf1-f011f91eeab2-etc-swift\") pod \"e2997639-45a9-4e46-9bf1-f011f91eeab2\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " Dec 05 13:05:34.672803 master-0 kubenswrapper[29936]: I1205 13:05:34.672139 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-swiftconf\") pod \"e2997639-45a9-4e46-9bf1-f011f91eeab2\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " Dec 05 13:05:34.672803 master-0 kubenswrapper[29936]: I1205 13:05:34.672599 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2997639-45a9-4e46-9bf1-f011f91eeab2-scripts\") pod \"e2997639-45a9-4e46-9bf1-f011f91eeab2\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " Dec 05 13:05:34.672803 master-0 kubenswrapper[29936]: I1205 13:05:34.672745 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-dispersionconf\") pod \"e2997639-45a9-4e46-9bf1-f011f91eeab2\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " Dec 05 13:05:34.672803 
master-0 kubenswrapper[29936]: I1205 13:05:34.672753 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2997639-45a9-4e46-9bf1-f011f91eeab2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e2997639-45a9-4e46-9bf1-f011f91eeab2" (UID: "e2997639-45a9-4e46-9bf1-f011f91eeab2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:34.673402 master-0 kubenswrapper[29936]: I1205 13:05:34.673014 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-combined-ca-bundle\") pod \"e2997639-45a9-4e46-9bf1-f011f91eeab2\" (UID: \"e2997639-45a9-4e46-9bf1-f011f91eeab2\") " Dec 05 13:05:34.673402 master-0 kubenswrapper[29936]: I1205 13:05:34.673196 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2997639-45a9-4e46-9bf1-f011f91eeab2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e2997639-45a9-4e46-9bf1-f011f91eeab2" (UID: "e2997639-45a9-4e46-9bf1-f011f91eeab2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:05:34.674672 master-0 kubenswrapper[29936]: I1205 13:05:34.674621 29936 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e2997639-45a9-4e46-9bf1-f011f91eeab2-ring-data-devices\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:34.674672 master-0 kubenswrapper[29936]: I1205 13:05:34.674667 29936 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e2997639-45a9-4e46-9bf1-f011f91eeab2-etc-swift\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:34.679382 master-0 kubenswrapper[29936]: I1205 13:05:34.679167 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2997639-45a9-4e46-9bf1-f011f91eeab2-kube-api-access-89vbs" (OuterVolumeSpecName: "kube-api-access-89vbs") pod "e2997639-45a9-4e46-9bf1-f011f91eeab2" (UID: "e2997639-45a9-4e46-9bf1-f011f91eeab2"). InnerVolumeSpecName "kube-api-access-89vbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:05:34.682746 master-0 kubenswrapper[29936]: I1205 13:05:34.682602 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e2997639-45a9-4e46-9bf1-f011f91eeab2" (UID: "e2997639-45a9-4e46-9bf1-f011f91eeab2"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:05:34.706586 master-0 kubenswrapper[29936]: I1205 13:05:34.706494 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2997639-45a9-4e46-9bf1-f011f91eeab2-scripts" (OuterVolumeSpecName: "scripts") pod "e2997639-45a9-4e46-9bf1-f011f91eeab2" (UID: "e2997639-45a9-4e46-9bf1-f011f91eeab2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:34.711198 master-0 kubenswrapper[29936]: I1205 13:05:34.709977 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2997639-45a9-4e46-9bf1-f011f91eeab2" (UID: "e2997639-45a9-4e46-9bf1-f011f91eeab2"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:05:34.711452 master-0 kubenswrapper[29936]: I1205 13:05:34.711375 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e2997639-45a9-4e46-9bf1-f011f91eeab2" (UID: "e2997639-45a9-4e46-9bf1-f011f91eeab2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:05:34.776466 master-0 kubenswrapper[29936]: I1205 13:05:34.776410 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2997639-45a9-4e46-9bf1-f011f91eeab2-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:34.776466 master-0 kubenswrapper[29936]: I1205 13:05:34.776458 29936 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-dispersionconf\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:34.776466 master-0 kubenswrapper[29936]: I1205 13:05:34.776474 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:34.776785 master-0 kubenswrapper[29936]: I1205 13:05:34.776484 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89vbs\" (UniqueName: \"kubernetes.io/projected/e2997639-45a9-4e46-9bf1-f011f91eeab2-kube-api-access-89vbs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:34.776785 master-0 kubenswrapper[29936]: I1205 13:05:34.776494 29936 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e2997639-45a9-4e46-9bf1-f011f91eeab2-swiftconf\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:34.993474 master-0 kubenswrapper[29936]: I1205 13:05:34.993419 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mb6t5" Dec 05 13:05:34.994422 master-0 kubenswrapper[29936]: I1205 13:05:34.993402 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mb6t5" event={"ID":"e2997639-45a9-4e46-9bf1-f011f91eeab2","Type":"ContainerDied","Data":"641dafa31ace0b6e86d1dcbbc2b1cc975b707dd47a1b21218c56728d91bc7b19"} Dec 05 13:05:34.994422 master-0 kubenswrapper[29936]: I1205 13:05:34.993631 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="641dafa31ace0b6e86d1dcbbc2b1cc975b707dd47a1b21218c56728d91bc7b19" Dec 05 13:05:34.997298 master-0 kubenswrapper[29936]: I1205 13:05:34.997254 29936 generic.go:334] "Generic (PLEG): container finished" podID="60f33781-dd68-4b4b-8ca7-7b271a1aa195" containerID="096fabf329918aaa19f0dd01ada94924b6d9ef114f7c552fe8a648c12cd8583b" exitCode=0 Dec 05 13:05:34.997558 master-0 kubenswrapper[29936]: I1205 13:05:34.997412 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2k6sd" event={"ID":"60f33781-dd68-4b4b-8ca7-7b271a1aa195","Type":"ContainerDied","Data":"096fabf329918aaa19f0dd01ada94924b6d9ef114f7c552fe8a648c12cd8583b"} Dec 05 13:05:35.003414 master-0 kubenswrapper[29936]: I1205 13:05:35.001732 29936 generic.go:334] "Generic (PLEG): container finished" podID="87999eba-73aa-43cf-be9e-1e07b1dc22e0" containerID="46d0bd9a939bd564c8f213558390311e1364aa3f24da86fb352ef1405f8ab17d" exitCode=0 Dec 05 13:05:35.003414 master-0 kubenswrapper[29936]: I1205 13:05:35.001814 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4269-account-create-update-rmqxq" event={"ID":"87999eba-73aa-43cf-be9e-1e07b1dc22e0","Type":"ContainerDied","Data":"46d0bd9a939bd564c8f213558390311e1364aa3f24da86fb352ef1405f8ab17d"} Dec 05 13:05:35.635325 master-0 kubenswrapper[29936]: I1205 13:05:35.635283 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-53e0-account-create-update-2xtnl" Dec 05 13:05:35.729529 master-0 kubenswrapper[29936]: I1205 13:05:35.728359 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75l4t\" (UniqueName: \"kubernetes.io/projected/44320335-848c-4aa2-b78b-672d29137770-kube-api-access-75l4t\") pod \"44320335-848c-4aa2-b78b-672d29137770\" (UID: \"44320335-848c-4aa2-b78b-672d29137770\") " Dec 05 13:05:35.729529 master-0 kubenswrapper[29936]: I1205 13:05:35.728453 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44320335-848c-4aa2-b78b-672d29137770-operator-scripts\") pod \"44320335-848c-4aa2-b78b-672d29137770\" (UID: \"44320335-848c-4aa2-b78b-672d29137770\") " Dec 05 13:05:35.730163 master-0 kubenswrapper[29936]: I1205 13:05:35.730097 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44320335-848c-4aa2-b78b-672d29137770-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44320335-848c-4aa2-b78b-672d29137770" (UID: "44320335-848c-4aa2-b78b-672d29137770"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:35.730905 master-0 kubenswrapper[29936]: I1205 13:05:35.730877 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44320335-848c-4aa2-b78b-672d29137770-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:35.749475 master-0 kubenswrapper[29936]: I1205 13:05:35.735087 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44320335-848c-4aa2-b78b-672d29137770-kube-api-access-75l4t" (OuterVolumeSpecName: "kube-api-access-75l4t") pod "44320335-848c-4aa2-b78b-672d29137770" (UID: "44320335-848c-4aa2-b78b-672d29137770"). InnerVolumeSpecName "kube-api-access-75l4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:05:35.841040 master-0 kubenswrapper[29936]: I1205 13:05:35.840934 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75l4t\" (UniqueName: \"kubernetes.io/projected/44320335-848c-4aa2-b78b-672d29137770-kube-api-access-75l4t\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:35.948817 master-0 kubenswrapper[29936]: I1205 13:05:35.948582 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-tlq59" Dec 05 13:05:35.951878 master-0 kubenswrapper[29936]: I1205 13:05:35.951823 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-80a7-account-create-update-j252l" Dec 05 13:05:35.970280 master-0 kubenswrapper[29936]: I1205 13:05:35.969294 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-cbcfg" Dec 05 13:05:36.045166 master-0 kubenswrapper[29936]: I1205 13:05:36.044709 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35238950-d610-4820-bd1f-2aa4ded2c93b-operator-scripts\") pod \"35238950-d610-4820-bd1f-2aa4ded2c93b\" (UID: \"35238950-d610-4820-bd1f-2aa4ded2c93b\") " Dec 05 13:05:36.045166 master-0 kubenswrapper[29936]: I1205 13:05:36.045079 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff-operator-scripts\") pod \"64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff\" (UID: \"64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff\") " Dec 05 13:05:36.045904 master-0 kubenswrapper[29936]: I1205 13:05:36.045202 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6k72\" (UniqueName: \"kubernetes.io/projected/35238950-d610-4820-bd1f-2aa4ded2c93b-kube-api-access-m6k72\") pod \"35238950-d610-4820-bd1f-2aa4ded2c93b\" (UID: \"35238950-d610-4820-bd1f-2aa4ded2c93b\") " Dec 05 13:05:36.047389 master-0 kubenswrapper[29936]: I1205 13:05:36.047318 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmz27\" (UniqueName: \"kubernetes.io/projected/64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff-kube-api-access-vmz27\") pod \"64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff\" (UID: \"64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff\") " Dec 05 13:05:36.047389 master-0 kubenswrapper[29936]: I1205 13:05:36.047421 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhptt\" (UniqueName: \"kubernetes.io/projected/b8c6d8ef-5d6f-475e-8533-e1879fc64f74-kube-api-access-dhptt\") pod 
\"b8c6d8ef-5d6f-475e-8533-e1879fc64f74\" (UID: \"b8c6d8ef-5d6f-475e-8533-e1879fc64f74\") " Dec 05 13:05:36.047892 master-0 kubenswrapper[29936]: I1205 13:05:36.047506 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8c6d8ef-5d6f-475e-8533-e1879fc64f74-operator-scripts\") pod \"b8c6d8ef-5d6f-475e-8533-e1879fc64f74\" (UID: \"b8c6d8ef-5d6f-475e-8533-e1879fc64f74\") " Dec 05 13:05:36.049508 master-0 kubenswrapper[29936]: I1205 13:05:36.049450 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff" (UID: "64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:36.049978 master-0 kubenswrapper[29936]: I1205 13:05:36.046921 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35238950-d610-4820-bd1f-2aa4ded2c93b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "35238950-d610-4820-bd1f-2aa4ded2c93b" (UID: "35238950-d610-4820-bd1f-2aa4ded2c93b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:36.050029 master-0 kubenswrapper[29936]: I1205 13:05:36.049968 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8c6d8ef-5d6f-475e-8533-e1879fc64f74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b8c6d8ef-5d6f-475e-8533-e1879fc64f74" (UID: "b8c6d8ef-5d6f-475e-8533-e1879fc64f74"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:36.055070 master-0 kubenswrapper[29936]: I1205 13:05:36.055008 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35238950-d610-4820-bd1f-2aa4ded2c93b-kube-api-access-m6k72" (OuterVolumeSpecName: "kube-api-access-m6k72") pod "35238950-d610-4820-bd1f-2aa4ded2c93b" (UID: "35238950-d610-4820-bd1f-2aa4ded2c93b"). InnerVolumeSpecName "kube-api-access-m6k72". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:05:36.055493 master-0 kubenswrapper[29936]: I1205 13:05:36.054949 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff-kube-api-access-vmz27" (OuterVolumeSpecName: "kube-api-access-vmz27") pod "64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff" (UID: "64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff"). InnerVolumeSpecName "kube-api-access-vmz27". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:05:36.055881 master-0 kubenswrapper[29936]: I1205 13:05:36.055430 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8c6d8ef-5d6f-475e-8533-e1879fc64f74-kube-api-access-dhptt" (OuterVolumeSpecName: "kube-api-access-dhptt") pod "b8c6d8ef-5d6f-475e-8533-e1879fc64f74" (UID: "b8c6d8ef-5d6f-475e-8533-e1879fc64f74"). InnerVolumeSpecName "kube-api-access-dhptt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:05:36.055881 master-0 kubenswrapper[29936]: I1205 13:05:36.055875 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmz27\" (UniqueName: \"kubernetes.io/projected/64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff-kube-api-access-vmz27\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:36.055978 master-0 kubenswrapper[29936]: I1205 13:05:36.055894 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8c6d8ef-5d6f-475e-8533-e1879fc64f74-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:36.055978 master-0 kubenswrapper[29936]: I1205 13:05:36.055907 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/35238950-d610-4820-bd1f-2aa4ded2c93b-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:36.055978 master-0 kubenswrapper[29936]: I1205 13:05:36.055917 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:36.055978 master-0 kubenswrapper[29936]: I1205 13:05:36.055927 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6k72\" (UniqueName: \"kubernetes.io/projected/35238950-d610-4820-bd1f-2aa4ded2c93b-kube-api-access-m6k72\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:36.062137 master-0 kubenswrapper[29936]: I1205 13:05:36.061814 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-80a7-account-create-update-j252l" event={"ID":"64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff","Type":"ContainerDied","Data":"98eeb8b01f45036ff94bebe1cd86bad2c29ba09c45f59416773cd8837cd5f134"} Dec 05 13:05:36.062137 master-0 kubenswrapper[29936]: I1205 13:05:36.061881 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98eeb8b01f45036ff94bebe1cd86bad2c29ba09c45f59416773cd8837cd5f134" Dec 05 13:05:36.062137 master-0 kubenswrapper[29936]: I1205 13:05:36.061980 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-80a7-account-create-update-j252l" Dec 05 13:05:36.068001 master-0 kubenswrapper[29936]: I1205 13:05:36.067867 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cbcfg" event={"ID":"35238950-d610-4820-bd1f-2aa4ded2c93b","Type":"ContainerDied","Data":"05287615d39169fe8c1956b16a2b699b109656fecfb8c4cb5973d754604fc789"} Dec 05 13:05:36.068001 master-0 kubenswrapper[29936]: I1205 13:05:36.067948 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05287615d39169fe8c1956b16a2b699b109656fecfb8c4cb5973d754604fc789" Dec 05 13:05:36.068230 master-0 kubenswrapper[29936]: I1205 13:05:36.068064 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-cbcfg" Dec 05 13:05:36.070927 master-0 kubenswrapper[29936]: I1205 13:05:36.070881 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-53e0-account-create-update-2xtnl" event={"ID":"44320335-848c-4aa2-b78b-672d29137770","Type":"ContainerDied","Data":"8f89f14d3d8dd0e9cafdd637600652c3ea31b4258e19f71fd8252ab7f742b11b"} Dec 05 13:05:36.070990 master-0 kubenswrapper[29936]: I1205 13:05:36.070929 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f89f14d3d8dd0e9cafdd637600652c3ea31b4258e19f71fd8252ab7f742b11b" Dec 05 13:05:36.071040 master-0 kubenswrapper[29936]: I1205 13:05:36.070994 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-53e0-account-create-update-2xtnl" Dec 05 13:05:36.076312 master-0 kubenswrapper[29936]: I1205 13:05:36.076263 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-tlq59" Dec 05 13:05:36.076426 master-0 kubenswrapper[29936]: I1205 13:05:36.076314 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-tlq59" event={"ID":"b8c6d8ef-5d6f-475e-8533-e1879fc64f74","Type":"ContainerDied","Data":"e614dd32ed0724c96630f806cd164549c90159b110ed196f50a25531a27f8cd4"} Dec 05 13:05:36.076426 master-0 kubenswrapper[29936]: I1205 13:05:36.076350 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e614dd32ed0724c96630f806cd164549c90159b110ed196f50a25531a27f8cd4" Dec 05 13:05:36.171302 master-0 kubenswrapper[29936]: I1205 13:05:36.170969 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhptt\" (UniqueName: \"kubernetes.io/projected/b8c6d8ef-5d6f-475e-8533-e1879fc64f74-kube-api-access-dhptt\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:36.469172 master-0 kubenswrapper[29936]: I1205 13:05:36.469088 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4269-account-create-update-rmqxq" Dec 05 13:05:36.598216 master-0 kubenswrapper[29936]: I1205 13:05:36.595489 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dwjv\" (UniqueName: \"kubernetes.io/projected/87999eba-73aa-43cf-be9e-1e07b1dc22e0-kube-api-access-2dwjv\") pod \"87999eba-73aa-43cf-be9e-1e07b1dc22e0\" (UID: \"87999eba-73aa-43cf-be9e-1e07b1dc22e0\") " Dec 05 13:05:36.598216 master-0 kubenswrapper[29936]: I1205 13:05:36.595604 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87999eba-73aa-43cf-be9e-1e07b1dc22e0-operator-scripts\") pod \"87999eba-73aa-43cf-be9e-1e07b1dc22e0\" (UID: \"87999eba-73aa-43cf-be9e-1e07b1dc22e0\") " Dec 05 13:05:36.608446 master-0 kubenswrapper[29936]: I1205 13:05:36.607750 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87999eba-73aa-43cf-be9e-1e07b1dc22e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87999eba-73aa-43cf-be9e-1e07b1dc22e0" (UID: "87999eba-73aa-43cf-be9e-1e07b1dc22e0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:36.612232 master-0 kubenswrapper[29936]: I1205 13:05:36.610574 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87999eba-73aa-43cf-be9e-1e07b1dc22e0-kube-api-access-2dwjv" (OuterVolumeSpecName: "kube-api-access-2dwjv") pod "87999eba-73aa-43cf-be9e-1e07b1dc22e0" (UID: "87999eba-73aa-43cf-be9e-1e07b1dc22e0"). InnerVolumeSpecName "kube-api-access-2dwjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:05:36.618456 master-0 kubenswrapper[29936]: I1205 13:05:36.617766 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dwjv\" (UniqueName: \"kubernetes.io/projected/87999eba-73aa-43cf-be9e-1e07b1dc22e0-kube-api-access-2dwjv\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:36.618456 master-0 kubenswrapper[29936]: I1205 13:05:36.617843 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87999eba-73aa-43cf-be9e-1e07b1dc22e0-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:36.708962 master-0 kubenswrapper[29936]: I1205 13:05:36.708893 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2k6sd" Dec 05 13:05:36.824894 master-0 kubenswrapper[29936]: I1205 13:05:36.822702 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbsbm\" (UniqueName: \"kubernetes.io/projected/60f33781-dd68-4b4b-8ca7-7b271a1aa195-kube-api-access-cbsbm\") pod \"60f33781-dd68-4b4b-8ca7-7b271a1aa195\" (UID: \"60f33781-dd68-4b4b-8ca7-7b271a1aa195\") " Dec 05 13:05:36.824894 master-0 kubenswrapper[29936]: I1205 13:05:36.823156 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60f33781-dd68-4b4b-8ca7-7b271a1aa195-operator-scripts\") pod \"60f33781-dd68-4b4b-8ca7-7b271a1aa195\" (UID: \"60f33781-dd68-4b4b-8ca7-7b271a1aa195\") " Dec 05 13:05:36.824894 master-0 kubenswrapper[29936]: I1205 13:05:36.824051 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60f33781-dd68-4b4b-8ca7-7b271a1aa195-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60f33781-dd68-4b4b-8ca7-7b271a1aa195" (UID: "60f33781-dd68-4b4b-8ca7-7b271a1aa195"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:36.826785 master-0 kubenswrapper[29936]: I1205 13:05:36.826698 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60f33781-dd68-4b4b-8ca7-7b271a1aa195-kube-api-access-cbsbm" (OuterVolumeSpecName: "kube-api-access-cbsbm") pod "60f33781-dd68-4b4b-8ca7-7b271a1aa195" (UID: "60f33781-dd68-4b4b-8ca7-7b271a1aa195"). InnerVolumeSpecName "kube-api-access-cbsbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:05:36.926751 master-0 kubenswrapper[29936]: I1205 13:05:36.926583 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60f33781-dd68-4b4b-8ca7-7b271a1aa195-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:36.926751 master-0 kubenswrapper[29936]: I1205 13:05:36.926649 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbsbm\" (UniqueName: \"kubernetes.io/projected/60f33781-dd68-4b4b-8ca7-7b271a1aa195-kube-api-access-cbsbm\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:37.089678 master-0 kubenswrapper[29936]: I1205 13:05:37.089626 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2k6sd" Dec 05 13:05:37.090312 master-0 kubenswrapper[29936]: I1205 13:05:37.089624 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2k6sd" event={"ID":"60f33781-dd68-4b4b-8ca7-7b271a1aa195","Type":"ContainerDied","Data":"5597c5ceb379707a5b08ea4ec7140b5c3a59c44f4f347cc766263ae420cace91"} Dec 05 13:05:37.090312 master-0 kubenswrapper[29936]: I1205 13:05:37.089827 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5597c5ceb379707a5b08ea4ec7140b5c3a59c44f4f347cc766263ae420cace91" Dec 05 13:05:37.092506 master-0 kubenswrapper[29936]: I1205 13:05:37.092449 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4269-account-create-update-rmqxq" event={"ID":"87999eba-73aa-43cf-be9e-1e07b1dc22e0","Type":"ContainerDied","Data":"34cf66cab6903bea3ee00d91fd173ee2e94f78bb1e9eca171392e26eeabff6df"} Dec 05 13:05:37.092566 master-0 kubenswrapper[29936]: I1205 13:05:37.092509 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34cf66cab6903bea3ee00d91fd173ee2e94f78bb1e9eca171392e26eeabff6df" Dec 05 13:05:37.092566 master-0 kubenswrapper[29936]: I1205 13:05:37.092509 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4269-account-create-update-rmqxq" Dec 05 13:05:38.977369 master-0 kubenswrapper[29936]: I1205 13:05:38.977208 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 13:05:39.516878 master-0 kubenswrapper[29936]: I1205 13:05:39.516805 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-f8fxj" podUID="1aaf3eff-076d-42cd-a86c-9e5af7a38664" containerName="ovn-controller" probeResult="failure" output=< Dec 05 13:05:39.516878 master-0 kubenswrapper[29936]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 13:05:39.516878 master-0 kubenswrapper[29936]: > Dec 05 13:05:42.988451 master-0 kubenswrapper[29936]: I1205 13:05:42.987882 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:42.999074 master-0 kubenswrapper[29936]: I1205 13:05:42.998996 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8668c9c4-68f4-4224-9395-0f2ac85b9f1d-etc-swift\") pod \"swift-storage-0\" (UID: \"8668c9c4-68f4-4224-9395-0f2ac85b9f1d\") " pod="openstack/swift-storage-0" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.011148 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-448s4"] Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: E1205 13:05:43.011828 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60f33781-dd68-4b4b-8ca7-7b271a1aa195" containerName="mariadb-database-create" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.011850 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="60f33781-dd68-4b4b-8ca7-7b271a1aa195" containerName="mariadb-database-create" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: E1205 13:05:43.011883 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2997639-45a9-4e46-9bf1-f011f91eeab2" containerName="swift-ring-rebalance" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.011892 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2997639-45a9-4e46-9bf1-f011f91eeab2" containerName="swift-ring-rebalance" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: E1205 13:05:43.011936 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44320335-848c-4aa2-b78b-672d29137770" containerName="mariadb-account-create-update" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.011946 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="44320335-848c-4aa2-b78b-672d29137770" containerName="mariadb-account-create-update" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: E1205 13:05:43.011970 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87999eba-73aa-43cf-be9e-1e07b1dc22e0" containerName="mariadb-account-create-update" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.011978 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="87999eba-73aa-43cf-be9e-1e07b1dc22e0" containerName="mariadb-account-create-update" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: E1205 13:05:43.011993 29936 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="35238950-d610-4820-bd1f-2aa4ded2c93b" containerName="mariadb-database-create" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.012001 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="35238950-d610-4820-bd1f-2aa4ded2c93b" containerName="mariadb-database-create" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: E1205 13:05:43.012016 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8c6d8ef-5d6f-475e-8533-e1879fc64f74" containerName="mariadb-database-create" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.012024 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8c6d8ef-5d6f-475e-8533-e1879fc64f74" containerName="mariadb-database-create" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: E1205 13:05:43.012051 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff" containerName="mariadb-account-create-update" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.012060 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff" containerName="mariadb-account-create-update" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.012367 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="87999eba-73aa-43cf-be9e-1e07b1dc22e0" containerName="mariadb-account-create-update" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.012428 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2997639-45a9-4e46-9bf1-f011f91eeab2" containerName="swift-ring-rebalance" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.012445 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="60f33781-dd68-4b4b-8ca7-7b271a1aa195" containerName="mariadb-database-create" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.012462 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="35238950-d610-4820-bd1f-2aa4ded2c93b" containerName="mariadb-database-create" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.012477 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="44320335-848c-4aa2-b78b-672d29137770" containerName="mariadb-account-create-update" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.012488 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8c6d8ef-5d6f-475e-8533-e1879fc64f74" containerName="mariadb-database-create" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.012509 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff" containerName="mariadb-account-create-update" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.013502 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-448s4" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.016351 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b46d8-config-data" Dec 05 13:05:43.046376 master-0 kubenswrapper[29936]: I1205 13:05:43.031702 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-448s4"] Dec 05 13:05:43.091803 master-0 kubenswrapper[29936]: I1205 13:05:43.091678 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-config-data\") pod \"glance-db-sync-448s4\" (UID: \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\") " pod="openstack/glance-db-sync-448s4" Dec 05 13:05:43.091803 master-0 kubenswrapper[29936]: I1205 13:05:43.091806 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-combined-ca-bundle\") pod \"glance-db-sync-448s4\" (UID: \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\") " pod="openstack/glance-db-sync-448s4" Dec 05 13:05:43.092071 master-0 kubenswrapper[29936]: I1205 13:05:43.091898 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc92v\" (UniqueName: \"kubernetes.io/projected/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-kube-api-access-mc92v\") pod \"glance-db-sync-448s4\" (UID: \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\") " pod="openstack/glance-db-sync-448s4" Dec 05 13:05:43.092071 master-0 kubenswrapper[29936]: I1205 13:05:43.091943 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-db-sync-config-data\") pod \"glance-db-sync-448s4\" (UID: \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\") " pod="openstack/glance-db-sync-448s4" Dec 05 13:05:43.185457 master-0 kubenswrapper[29936]: I1205 13:05:43.185254 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 05 13:05:43.194983 master-0 kubenswrapper[29936]: I1205 13:05:43.194904 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc92v\" (UniqueName: \"kubernetes.io/projected/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-kube-api-access-mc92v\") pod \"glance-db-sync-448s4\" (UID: \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\") " pod="openstack/glance-db-sync-448s4" Dec 05 13:05:43.195282 master-0 kubenswrapper[29936]: I1205 13:05:43.195011 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-db-sync-config-data\") pod \"glance-db-sync-448s4\" (UID: \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\") " pod="openstack/glance-db-sync-448s4" Dec 05 13:05:43.195282 master-0 kubenswrapper[29936]: I1205 13:05:43.195252 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-config-data\") pod \"glance-db-sync-448s4\" (UID: \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\") " pod="openstack/glance-db-sync-448s4" Dec 05 13:05:43.195436 master-0 kubenswrapper[29936]: I1205 13:05:43.195311 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-combined-ca-bundle\") pod \"glance-db-sync-448s4\" (UID: \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\") " pod="openstack/glance-db-sync-448s4" Dec 05 13:05:43.202577 master-0 kubenswrapper[29936]: I1205 13:05:43.200772 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-combined-ca-bundle\") pod \"glance-db-sync-448s4\" (UID: \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\") " pod="openstack/glance-db-sync-448s4" Dec 05 13:05:43.202577 master-0 kubenswrapper[29936]: I1205 13:05:43.200769 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-config-data\") pod \"glance-db-sync-448s4\" (UID: \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\") " pod="openstack/glance-db-sync-448s4" Dec 05 13:05:43.202577 master-0 kubenswrapper[29936]: I1205 13:05:43.200806 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-db-sync-config-data\") pod \"glance-db-sync-448s4\" (UID: \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\") " pod="openstack/glance-db-sync-448s4" Dec 05 13:05:43.218839 master-0 kubenswrapper[29936]: I1205 13:05:43.218769 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc92v\" (UniqueName: \"kubernetes.io/projected/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-kube-api-access-mc92v\") pod \"glance-db-sync-448s4\" (UID: \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\") " pod="openstack/glance-db-sync-448s4" Dec 05 13:05:43.388133 master-0 kubenswrapper[29936]: I1205 13:05:43.388050 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-448s4" Dec 05 13:05:43.704275 master-0 kubenswrapper[29936]: I1205 13:05:43.704100 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 05 13:05:44.022141 master-0 kubenswrapper[29936]: I1205 13:05:44.022064 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-448s4"] Dec 05 13:05:44.179030 master-0 kubenswrapper[29936]: I1205 13:05:44.178955 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-448s4" event={"ID":"5a7bb352-8943-448f-ad3f-a06ebd4b8b30","Type":"ContainerStarted","Data":"f693960a409f314c337dbfd68ddeeda1f87b2fc4da586635bee71a4b605c4379"} Dec 05 13:05:44.181532 master-0 kubenswrapper[29936]: I1205 13:05:44.181460 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8668c9c4-68f4-4224-9395-0f2ac85b9f1d","Type":"ContainerStarted","Data":"4d9eefcdcc024bb62acb8357b5649c26036728dfcee7a17ecd7d2402af77872d"} Dec 05 13:05:44.513375 master-0 kubenswrapper[29936]: I1205 13:05:44.513222 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:05:44.513891 master-0 kubenswrapper[29936]: I1205 13:05:44.513213 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-f8fxj" podUID="1aaf3eff-076d-42cd-a86c-9e5af7a38664" containerName="ovn-controller" probeResult="failure" output=< Dec 05 13:05:44.513891 master-0 kubenswrapper[29936]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 13:05:44.513891 master-0 kubenswrapper[29936]: > Dec 05 13:05:44.528558 master-0 kubenswrapper[29936]: I1205 13:05:44.528502 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jj4cf" Dec 05 13:05:44.831642 master-0 kubenswrapper[29936]: I1205 13:05:44.831533 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-f8fxj-config-bmczj"] Dec 05 13:05:44.850401 master-0 kubenswrapper[29936]: I1205 13:05:44.850342 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:44.853691 master-0 kubenswrapper[29936]: I1205 13:05:44.853627 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 05 13:05:44.862774 master-0 kubenswrapper[29936]: I1205 13:05:44.862717 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-f8fxj-config-bmczj"] Dec 05 13:05:44.968350 master-0 kubenswrapper[29936]: I1205 13:05:44.968259 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r546l\" (UniqueName: \"kubernetes.io/projected/2d8a18e3-137c-460b-947a-912dd42df73b-kube-api-access-r546l\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:44.968700 master-0 kubenswrapper[29936]: I1205 13:05:44.968399 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-run\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:44.968700 master-0 kubenswrapper[29936]: I1205 13:05:44.968467 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-log-ovn\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:44.968700 master-0 kubenswrapper[29936]: I1205 13:05:44.968496 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d8a18e3-137c-460b-947a-912dd42df73b-scripts\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:44.968700 master-0 kubenswrapper[29936]: I1205 13:05:44.968550 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2d8a18e3-137c-460b-947a-912dd42df73b-additional-scripts\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:44.968700 master-0 kubenswrapper[29936]: I1205 13:05:44.968639 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-run-ovn\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:45.071282 master-0 kubenswrapper[29936]: I1205 13:05:45.071126 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r546l\" (UniqueName: \"kubernetes.io/projected/2d8a18e3-137c-460b-947a-912dd42df73b-kube-api-access-r546l\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:45.071282 master-0 kubenswrapper[29936]: I1205 13:05:45.071235 29936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-run\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:45.071899 master-0 kubenswrapper[29936]: I1205 13:05:45.071303 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-log-ovn\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:45.071899 master-0 kubenswrapper[29936]: I1205 13:05:45.071332 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d8a18e3-137c-460b-947a-912dd42df73b-scripts\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:45.071899 master-0 kubenswrapper[29936]: I1205 13:05:45.071380 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2d8a18e3-137c-460b-947a-912dd42df73b-additional-scripts\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:45.071899 master-0 kubenswrapper[29936]: I1205 13:05:45.071418 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-run-ovn\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:45.071899 master-0 kubenswrapper[29936]: I1205 13:05:45.071648 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-run\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:45.071899 master-0 kubenswrapper[29936]: I1205 13:05:45.071829 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-run-ovn\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:45.072439 master-0 kubenswrapper[29936]: I1205 13:05:45.072409 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2d8a18e3-137c-460b-947a-912dd42df73b-additional-scripts\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:45.072524 master-0 kubenswrapper[29936]: I1205 13:05:45.072420 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-log-ovn\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" 
Dec 05 13:05:45.075916 master-0 kubenswrapper[29936]: I1205 13:05:45.075893 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d8a18e3-137c-460b-947a-912dd42df73b-scripts\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:45.092725 master-0 kubenswrapper[29936]: I1205 13:05:45.092669 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r546l\" (UniqueName: \"kubernetes.io/projected/2d8a18e3-137c-460b-947a-912dd42df73b-kube-api-access-r546l\") pod \"ovn-controller-f8fxj-config-bmczj\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:45.197757 master-0 kubenswrapper[29936]: I1205 13:05:45.197641 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:45.209274 master-0 kubenswrapper[29936]: I1205 13:05:45.209210 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8668c9c4-68f4-4224-9395-0f2ac85b9f1d","Type":"ContainerStarted","Data":"0cfb009bc87e2ec9266f882a7a60f32a8258aba5e95d10d780b9c32fe67e4835"} Dec 05 13:05:46.147102 master-0 kubenswrapper[29936]: I1205 13:05:46.147008 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-f8fxj-config-bmczj"] Dec 05 13:05:46.241002 master-0 kubenswrapper[29936]: I1205 13:05:46.240881 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8668c9c4-68f4-4224-9395-0f2ac85b9f1d","Type":"ContainerStarted","Data":"9f060a64150381d460c129cd974fe6cc4b16161d6e1b98d4c5a1bffb5d7f9a66"} Dec 05 13:05:46.241002 master-0 kubenswrapper[29936]: I1205 13:05:46.240962 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8668c9c4-68f4-4224-9395-0f2ac85b9f1d","Type":"ContainerStarted","Data":"ea0c91b26bae3def20756ae211fd935a2f53cb77a17e9aba0860729367597c64"} Dec 05 13:05:46.243123 master-0 kubenswrapper[29936]: I1205 13:05:46.243080 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f8fxj-config-bmczj" event={"ID":"2d8a18e3-137c-460b-947a-912dd42df73b","Type":"ContainerStarted","Data":"8f06d8d938e48c091c4539b0d7a456a0b4d51ef301a427c644badf5209773cb3"} Dec 05 13:05:46.366658 master-0 kubenswrapper[29936]: I1205 13:05:46.366501 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 13:05:47.260402 master-0 kubenswrapper[29936]: I1205 13:05:47.259996 29936 generic.go:334] "Generic (PLEG): container finished" podID="2d8a18e3-137c-460b-947a-912dd42df73b" containerID="c4182bf062e159005d362d853f2c17d8d8e737de8526286a95ead21fc8d8861f" exitCode=0 Dec 05 13:05:47.260402 master-0 kubenswrapper[29936]: I1205 13:05:47.260108 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f8fxj-config-bmczj" event={"ID":"2d8a18e3-137c-460b-947a-912dd42df73b","Type":"ContainerDied","Data":"c4182bf062e159005d362d853f2c17d8d8e737de8526286a95ead21fc8d8861f"} Dec 05 13:05:47.268535 master-0 kubenswrapper[29936]: I1205 13:05:47.268471 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8668c9c4-68f4-4224-9395-0f2ac85b9f1d","Type":"ContainerStarted","Data":"9ab996109b647b65ff21faf0dd375275d8dd89bd037e3220fc23139d1540e809"} Dec 05 13:05:48.297946 master-0 kubenswrapper[29936]: I1205 13:05:48.297842 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8668c9c4-68f4-4224-9395-0f2ac85b9f1d","Type":"ContainerStarted","Data":"ae4384cc3dd2c903f9ebc39a926c44ae44d7ed1b880295ec3754c597d8dda733"} Dec 05 13:05:48.794021 master-0 kubenswrapper[29936]: I1205 13:05:48.793827 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:48.897221 master-0 kubenswrapper[29936]: I1205 13:05:48.896416 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-run-ovn\") pod \"2d8a18e3-137c-460b-947a-912dd42df73b\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " Dec 05 13:05:48.897221 master-0 kubenswrapper[29936]: I1205 13:05:48.896542 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2d8a18e3-137c-460b-947a-912dd42df73b-additional-scripts\") pod \"2d8a18e3-137c-460b-947a-912dd42df73b\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " Dec 05 13:05:48.897221 master-0 kubenswrapper[29936]: I1205 13:05:48.896742 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-log-ovn\") pod \"2d8a18e3-137c-460b-947a-912dd42df73b\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " Dec 05 13:05:48.897221 master-0 kubenswrapper[29936]: I1205 13:05:48.896926 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r546l\" (UniqueName: \"kubernetes.io/projected/2d8a18e3-137c-460b-947a-912dd42df73b-kube-api-access-r546l\") pod \"2d8a18e3-137c-460b-947a-912dd42df73b\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " Dec 05 13:05:48.897221 master-0 kubenswrapper[29936]: I1205 13:05:48.896966 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d8a18e3-137c-460b-947a-912dd42df73b-scripts\") pod \"2d8a18e3-137c-460b-947a-912dd42df73b\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " Dec 05 13:05:48.897221 master-0 kubenswrapper[29936]: I1205 13:05:48.897055 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-run\") pod \"2d8a18e3-137c-460b-947a-912dd42df73b\" (UID: \"2d8a18e3-137c-460b-947a-912dd42df73b\") " Dec 05 13:05:48.897762 master-0 kubenswrapper[29936]: I1205 13:05:48.897719 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-run" (OuterVolumeSpecName: "var-run") pod "2d8a18e3-137c-460b-947a-912dd42df73b" (UID: "2d8a18e3-137c-460b-947a-912dd42df73b"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:05:48.897835 master-0 kubenswrapper[29936]: I1205 13:05:48.897773 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "2d8a18e3-137c-460b-947a-912dd42df73b" (UID: "2d8a18e3-137c-460b-947a-912dd42df73b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:05:48.898105 master-0 kubenswrapper[29936]: I1205 13:05:48.898039 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "2d8a18e3-137c-460b-947a-912dd42df73b" (UID: "2d8a18e3-137c-460b-947a-912dd42df73b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:05:48.899504 master-0 kubenswrapper[29936]: I1205 13:05:48.899003 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8a18e3-137c-460b-947a-912dd42df73b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "2d8a18e3-137c-460b-947a-912dd42df73b" (UID: "2d8a18e3-137c-460b-947a-912dd42df73b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:48.899637 master-0 kubenswrapper[29936]: I1205 13:05:48.899378 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8a18e3-137c-460b-947a-912dd42df73b-scripts" (OuterVolumeSpecName: "scripts") pod "2d8a18e3-137c-460b-947a-912dd42df73b" (UID: "2d8a18e3-137c-460b-947a-912dd42df73b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:05:48.925789 master-0 kubenswrapper[29936]: I1205 13:05:48.925695 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d8a18e3-137c-460b-947a-912dd42df73b-kube-api-access-r546l" (OuterVolumeSpecName: "kube-api-access-r546l") pod "2d8a18e3-137c-460b-947a-912dd42df73b" (UID: "2d8a18e3-137c-460b-947a-912dd42df73b"). InnerVolumeSpecName "kube-api-access-r546l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:05:48.981495 master-0 kubenswrapper[29936]: I1205 13:05:48.981436 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 13:05:48.999663 master-0 kubenswrapper[29936]: I1205 13:05:48.999554 29936 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-run-ovn\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:49.000003 master-0 kubenswrapper[29936]: I1205 13:05:48.999985 29936 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2d8a18e3-137c-460b-947a-912dd42df73b-additional-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:49.000081 master-0 kubenswrapper[29936]: I1205 13:05:49.000071 29936 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-log-ovn\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:49.000145 master-0 kubenswrapper[29936]: I1205 13:05:49.000136 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r546l\" (UniqueName: \"kubernetes.io/projected/2d8a18e3-137c-460b-947a-912dd42df73b-kube-api-access-r546l\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:49.000236 master-0 kubenswrapper[29936]: I1205 13:05:49.000226 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d8a18e3-137c-460b-947a-912dd42df73b-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:49.000305 master-0 kubenswrapper[29936]: I1205 13:05:49.000296 29936 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d8a18e3-137c-460b-947a-912dd42df73b-var-run\") on node \"master-0\" DevicePath \"\"" Dec 05 13:05:49.314994 master-0 kubenswrapper[29936]: I1205 13:05:49.314822 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-f8fxj-config-bmczj" Dec 05 13:05:49.314994 master-0 kubenswrapper[29936]: I1205 13:05:49.314899 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f8fxj-config-bmczj" event={"ID":"2d8a18e3-137c-460b-947a-912dd42df73b","Type":"ContainerDied","Data":"8f06d8d938e48c091c4539b0d7a456a0b4d51ef301a427c644badf5209773cb3"} Dec 05 13:05:49.316056 master-0 kubenswrapper[29936]: I1205 13:05:49.315026 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f06d8d938e48c091c4539b0d7a456a0b4d51ef301a427c644badf5209773cb3" Dec 05 13:05:49.323224 master-0 kubenswrapper[29936]: I1205 13:05:49.322996 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8668c9c4-68f4-4224-9395-0f2ac85b9f1d","Type":"ContainerStarted","Data":"065c94da837d8d5b57154eef7c01bd8700d23d419c13a8301e79904258dfcbd8"} Dec 05 13:05:49.323224 master-0 kubenswrapper[29936]: I1205 13:05:49.323093 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8668c9c4-68f4-4224-9395-0f2ac85b9f1d","Type":"ContainerStarted","Data":"5aa7774c9742fff1dd2cfb8ea8e4b68b9587cf2dbbf5b4cb0b2eee897b2fe486"} Dec 05 13:05:49.555454 master-0 kubenswrapper[29936]: I1205 13:05:49.555275 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-f8fxj" Dec 05 13:05:49.795329 master-0 kubenswrapper[29936]: I1205 13:05:49.793124 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-92nh8"] Dec 05 13:05:49.795329 master-0 kubenswrapper[29936]: E1205 13:05:49.793867 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8a18e3-137c-460b-947a-912dd42df73b" containerName="ovn-config" Dec 05 13:05:49.795329 master-0 kubenswrapper[29936]: I1205 13:05:49.793884 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8a18e3-137c-460b-947a-912dd42df73b" containerName="ovn-config" Dec 05 13:05:49.795329 master-0 kubenswrapper[29936]: I1205 13:05:49.794150 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d8a18e3-137c-460b-947a-912dd42df73b" containerName="ovn-config" Dec 05 13:05:49.795329 master-0 kubenswrapper[29936]: I1205 13:05:49.795045 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-92nh8" Dec 05 13:05:49.816592 master-0 kubenswrapper[29936]: I1205 13:05:49.816524 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-92nh8"] Dec 05 13:05:49.917451 master-0 kubenswrapper[29936]: I1205 13:05:49.916972 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-73e4-account-create-update-z9x2z"] Dec 05 13:05:49.918931 master-0 kubenswrapper[29936]: I1205 13:05:49.918888 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-73e4-account-create-update-z9x2z" Dec 05 13:05:49.921722 master-0 kubenswrapper[29936]: I1205 13:05:49.921633 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 05 13:05:49.952224 master-0 kubenswrapper[29936]: I1205 13:05:49.952100 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-73e4-account-create-update-z9x2z"] Dec 05 13:05:49.984041 master-0 kubenswrapper[29936]: I1205 13:05:49.983948 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n8g5\" (UniqueName: \"kubernetes.io/projected/02a61d97-31ab-485e-9f4c-f18097ce33c7-kube-api-access-8n8g5\") pod \"cinder-db-create-92nh8\" (UID: \"02a61d97-31ab-485e-9f4c-f18097ce33c7\") " pod="openstack/cinder-db-create-92nh8" Dec 05 13:05:49.984492 master-0 kubenswrapper[29936]: I1205 13:05:49.984291 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02a61d97-31ab-485e-9f4c-f18097ce33c7-operator-scripts\") pod \"cinder-db-create-92nh8\" (UID: \"02a61d97-31ab-485e-9f4c-f18097ce33c7\") " pod="openstack/cinder-db-create-92nh8" Dec 05 13:05:49.995576 master-0 kubenswrapper[29936]: I1205 13:05:49.995468 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-f8fxj-config-bmczj"] Dec 05 13:05:50.001356 master-0 kubenswrapper[29936]: I1205 13:05:50.001257 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-f8fxj-config-bmczj"] Dec 05 13:05:50.035889 master-0 kubenswrapper[29936]: I1205 13:05:50.035801 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-4x574"] Dec 05 13:05:50.038381 master-0 kubenswrapper[29936]: I1205 13:05:50.037979 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-4x574" Dec 05 13:05:50.064865 master-0 kubenswrapper[29936]: I1205 13:05:50.064802 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4x574"] Dec 05 13:05:50.089149 master-0 kubenswrapper[29936]: I1205 13:05:50.088446 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zm69\" (UniqueName: \"kubernetes.io/projected/4c67a1ed-b95a-414f-8f7d-972c98a55a88-kube-api-access-6zm69\") pod \"neutron-db-create-4x574\" (UID: \"4c67a1ed-b95a-414f-8f7d-972c98a55a88\") " pod="openstack/neutron-db-create-4x574" Dec 05 13:05:50.089149 master-0 kubenswrapper[29936]: I1205 13:05:50.088558 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02a61d97-31ab-485e-9f4c-f18097ce33c7-operator-scripts\") pod \"cinder-db-create-92nh8\" (UID: \"02a61d97-31ab-485e-9f4c-f18097ce33c7\") " pod="openstack/cinder-db-create-92nh8" Dec 05 13:05:50.089149 master-0 kubenswrapper[29936]: I1205 13:05:50.088654 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n8g5\" (UniqueName: \"kubernetes.io/projected/02a61d97-31ab-485e-9f4c-f18097ce33c7-kube-api-access-8n8g5\") pod \"cinder-db-create-92nh8\" (UID: \"02a61d97-31ab-485e-9f4c-f18097ce33c7\") " pod="openstack/cinder-db-create-92nh8" Dec 05 13:05:50.089149 master-0 kubenswrapper[29936]: I1205 13:05:50.088685 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx7dm\" (UniqueName: \"kubernetes.io/projected/376c0716-8fd1-432f-9e4b-a3b21373a7cc-kube-api-access-kx7dm\") pod \"cinder-73e4-account-create-update-z9x2z\" (UID: \"376c0716-8fd1-432f-9e4b-a3b21373a7cc\") " pod="openstack/cinder-73e4-account-create-update-z9x2z" Dec 05 13:05:50.089149 master-0 kubenswrapper[29936]: I1205 13:05:50.088704 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c67a1ed-b95a-414f-8f7d-972c98a55a88-operator-scripts\") pod \"neutron-db-create-4x574\" (UID: \"4c67a1ed-b95a-414f-8f7d-972c98a55a88\") " pod="openstack/neutron-db-create-4x574" Dec 05 13:05:50.089149 master-0 kubenswrapper[29936]: I1205 13:05:50.088733 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/376c0716-8fd1-432f-9e4b-a3b21373a7cc-operator-scripts\") pod \"cinder-73e4-account-create-update-z9x2z\" (UID: \"376c0716-8fd1-432f-9e4b-a3b21373a7cc\") " pod="openstack/cinder-73e4-account-create-update-z9x2z" Dec 05 13:05:50.090700 master-0 kubenswrapper[29936]: I1205 13:05:50.090672 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02a61d97-31ab-485e-9f4c-f18097ce33c7-operator-scripts\") pod \"cinder-db-create-92nh8\" (UID: \"02a61d97-31ab-485e-9f4c-f18097ce33c7\") " pod="openstack/cinder-db-create-92nh8" Dec 05 13:05:50.112334 master-0 kubenswrapper[29936]: I1205 13:05:50.112226 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n8g5\" (UniqueName: \"kubernetes.io/projected/02a61d97-31ab-485e-9f4c-f18097ce33c7-kube-api-access-8n8g5\") pod \"cinder-db-create-92nh8\" (UID: \"02a61d97-31ab-485e-9f4c-f18097ce33c7\") " 
pod="openstack/cinder-db-create-92nh8" Dec 05 13:05:50.130859 master-0 kubenswrapper[29936]: I1205 13:05:50.130771 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c9b1-account-create-update-6mbv8"] Dec 05 13:05:50.144993 master-0 kubenswrapper[29936]: I1205 13:05:50.135767 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-92nh8" Dec 05 13:05:50.144993 master-0 kubenswrapper[29936]: I1205 13:05:50.138385 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c9b1-account-create-update-6mbv8" Dec 05 13:05:50.151342 master-0 kubenswrapper[29936]: I1205 13:05:50.150868 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 05 13:05:50.151342 master-0 kubenswrapper[29936]: I1205 13:05:50.150950 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c9b1-account-create-update-6mbv8"] Dec 05 13:05:50.168656 master-0 kubenswrapper[29936]: I1205 13:05:50.168526 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-f8fxj-config-rvmj7"] Dec 05 13:05:50.172034 master-0 kubenswrapper[29936]: I1205 13:05:50.172002 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.183487 master-0 kubenswrapper[29936]: I1205 13:05:50.183387 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-f8fxj-config-rvmj7"] Dec 05 13:05:50.196832 master-0 kubenswrapper[29936]: I1205 13:05:50.189625 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 05 13:05:50.196832 master-0 kubenswrapper[29936]: I1205 13:05:50.194829 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/376c0716-8fd1-432f-9e4b-a3b21373a7cc-operator-scripts\") pod \"cinder-73e4-account-create-update-z9x2z\" (UID: \"376c0716-8fd1-432f-9e4b-a3b21373a7cc\") " pod="openstack/cinder-73e4-account-create-update-z9x2z" Dec 05 13:05:50.196832 master-0 kubenswrapper[29936]: I1205 13:05:50.195245 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zm69\" (UniqueName: \"kubernetes.io/projected/4c67a1ed-b95a-414f-8f7d-972c98a55a88-kube-api-access-6zm69\") pod \"neutron-db-create-4x574\" (UID: \"4c67a1ed-b95a-414f-8f7d-972c98a55a88\") " pod="openstack/neutron-db-create-4x574" Dec 05 13:05:50.196832 master-0 kubenswrapper[29936]: I1205 13:05:50.195568 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx7dm\" (UniqueName: \"kubernetes.io/projected/376c0716-8fd1-432f-9e4b-a3b21373a7cc-kube-api-access-kx7dm\") pod \"cinder-73e4-account-create-update-z9x2z\" (UID: \"376c0716-8fd1-432f-9e4b-a3b21373a7cc\") " pod="openstack/cinder-73e4-account-create-update-z9x2z" Dec 05 13:05:50.196832 master-0 kubenswrapper[29936]: I1205 13:05:50.195602 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c67a1ed-b95a-414f-8f7d-972c98a55a88-operator-scripts\") pod \"neutron-db-create-4x574\" (UID: \"4c67a1ed-b95a-414f-8f7d-972c98a55a88\") " pod="openstack/neutron-db-create-4x574" Dec 05 13:05:50.196832 master-0 kubenswrapper[29936]: I1205 13:05:50.196512 29936 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c67a1ed-b95a-414f-8f7d-972c98a55a88-operator-scripts\") pod \"neutron-db-create-4x574\" (UID: \"4c67a1ed-b95a-414f-8f7d-972c98a55a88\") " pod="openstack/neutron-db-create-4x574" Dec 05 13:05:50.197500 master-0 kubenswrapper[29936]: I1205 13:05:50.197183 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/376c0716-8fd1-432f-9e4b-a3b21373a7cc-operator-scripts\") pod \"cinder-73e4-account-create-update-z9x2z\" (UID: \"376c0716-8fd1-432f-9e4b-a3b21373a7cc\") " pod="openstack/cinder-73e4-account-create-update-z9x2z" Dec 05 13:05:50.235564 master-0 kubenswrapper[29936]: I1205 13:05:50.232436 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx7dm\" (UniqueName: \"kubernetes.io/projected/376c0716-8fd1-432f-9e4b-a3b21373a7cc-kube-api-access-kx7dm\") pod \"cinder-73e4-account-create-update-z9x2z\" (UID: \"376c0716-8fd1-432f-9e4b-a3b21373a7cc\") " pod="openstack/cinder-73e4-account-create-update-z9x2z" Dec 05 13:05:50.235564 master-0 kubenswrapper[29936]: I1205 13:05:50.233771 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zm69\" (UniqueName: \"kubernetes.io/projected/4c67a1ed-b95a-414f-8f7d-972c98a55a88-kube-api-access-6zm69\") pod \"neutron-db-create-4x574\" (UID: \"4c67a1ed-b95a-414f-8f7d-972c98a55a88\") " pod="openstack/neutron-db-create-4x574" Dec 05 13:05:50.281666 master-0 kubenswrapper[29936]: I1205 13:05:50.281595 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-73e4-account-create-update-z9x2z" Dec 05 13:05:50.302959 master-0 kubenswrapper[29936]: I1205 13:05:50.302838 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-run\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.303664 master-0 kubenswrapper[29936]: I1205 13:05:50.303012 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/148db2bc-447d-44d5-beb6-b94bca8b9e22-additional-scripts\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.303664 master-0 kubenswrapper[29936]: I1205 13:05:50.303075 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-run-ovn\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.303664 master-0 kubenswrapper[29936]: I1205 13:05:50.303120 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/148db2bc-447d-44d5-beb6-b94bca8b9e22-scripts\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.303664 master-0 kubenswrapper[29936]: I1205 13:05:50.303174 29936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njhlh\" (UniqueName: \"kubernetes.io/projected/80a34c12-cc34-47c1-af11-6c935c757db4-kube-api-access-njhlh\") pod \"neutron-c9b1-account-create-update-6mbv8\" (UID: \"80a34c12-cc34-47c1-af11-6c935c757db4\") " pod="openstack/neutron-c9b1-account-create-update-6mbv8" Dec 05 13:05:50.303664 master-0 kubenswrapper[29936]: I1205 13:05:50.303301 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-log-ovn\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.303664 master-0 kubenswrapper[29936]: I1205 13:05:50.303422 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82v7v\" (UniqueName: \"kubernetes.io/projected/148db2bc-447d-44d5-beb6-b94bca8b9e22-kube-api-access-82v7v\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.303664 master-0 kubenswrapper[29936]: I1205 13:05:50.303486 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80a34c12-cc34-47c1-af11-6c935c757db4-operator-scripts\") pod \"neutron-c9b1-account-create-update-6mbv8\" (UID: \"80a34c12-cc34-47c1-af11-6c935c757db4\") " pod="openstack/neutron-c9b1-account-create-update-6mbv8" Dec 05 13:05:50.307037 master-0 kubenswrapper[29936]: I1205 13:05:50.306997 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-dp8qv"] Dec 05 13:05:50.310951 master-0 kubenswrapper[29936]: I1205 13:05:50.310868 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dp8qv" Dec 05 13:05:50.314294 master-0 kubenswrapper[29936]: I1205 13:05:50.314171 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 13:05:50.314671 master-0 kubenswrapper[29936]: I1205 13:05:50.314646 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 13:05:50.317166 master-0 kubenswrapper[29936]: I1205 13:05:50.317086 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 13:05:50.344862 master-0 kubenswrapper[29936]: I1205 13:05:50.344792 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8668c9c4-68f4-4224-9395-0f2ac85b9f1d","Type":"ContainerStarted","Data":"7ac37c179cfab8b64b4029ca9bfbf00a03fc2e3720b20ad5b87853721b241895"} Dec 05 13:05:50.370011 master-0 kubenswrapper[29936]: I1205 13:05:50.369897 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-4x574" Dec 05 13:05:50.382000 master-0 kubenswrapper[29936]: I1205 13:05:50.381919 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dp8qv"] Dec 05 13:05:50.405395 master-0 kubenswrapper[29936]: I1205 13:05:50.405324 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njhlh\" (UniqueName: \"kubernetes.io/projected/80a34c12-cc34-47c1-af11-6c935c757db4-kube-api-access-njhlh\") pod \"neutron-c9b1-account-create-update-6mbv8\" (UID: \"80a34c12-cc34-47c1-af11-6c935c757db4\") " pod="openstack/neutron-c9b1-account-create-update-6mbv8" Dec 05 13:05:50.405903 master-0 kubenswrapper[29936]: I1205 13:05:50.405482 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-log-ovn\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.405903 master-0 kubenswrapper[29936]: I1205 13:05:50.405549 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82v7v\" (UniqueName: \"kubernetes.io/projected/148db2bc-447d-44d5-beb6-b94bca8b9e22-kube-api-access-82v7v\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.405903 master-0 kubenswrapper[29936]: I1205 13:05:50.405604 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80a34c12-cc34-47c1-af11-6c935c757db4-operator-scripts\") pod \"neutron-c9b1-account-create-update-6mbv8\" (UID: \"80a34c12-cc34-47c1-af11-6c935c757db4\") " pod="openstack/neutron-c9b1-account-create-update-6mbv8" Dec 05 13:05:50.405903 master-0 kubenswrapper[29936]: I1205 13:05:50.405641 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-run\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.405903 master-0 kubenswrapper[29936]: I1205 13:05:50.405697 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/148db2bc-447d-44d5-beb6-b94bca8b9e22-additional-scripts\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.406288 master-0 kubenswrapper[29936]: I1205 13:05:50.405963 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-run-ovn\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.406288 master-0 kubenswrapper[29936]: I1205 13:05:50.406005 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/148db2bc-447d-44d5-beb6-b94bca8b9e22-scripts\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " 
pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.406510 master-0 kubenswrapper[29936]: I1205 13:05:50.406449 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-log-ovn\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.406975 master-0 kubenswrapper[29936]: I1205 13:05:50.406945 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-run-ovn\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.407112 master-0 kubenswrapper[29936]: I1205 13:05:50.407078 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-run\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.407625 master-0 kubenswrapper[29936]: I1205 13:05:50.407569 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80a34c12-cc34-47c1-af11-6c935c757db4-operator-scripts\") pod \"neutron-c9b1-account-create-update-6mbv8\" (UID: \"80a34c12-cc34-47c1-af11-6c935c757db4\") " pod="openstack/neutron-c9b1-account-create-update-6mbv8" Dec 05 13:05:50.408243 master-0 kubenswrapper[29936]: I1205 13:05:50.408210 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/148db2bc-447d-44d5-beb6-b94bca8b9e22-additional-scripts\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.411655 master-0 kubenswrapper[29936]: I1205 13:05:50.411570 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/148db2bc-447d-44d5-beb6-b94bca8b9e22-scripts\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.510109 master-0 kubenswrapper[29936]: I1205 13:05:50.509741 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1513da7-52be-4f09-8b6d-09e494522e1e-combined-ca-bundle\") pod \"keystone-db-sync-dp8qv\" (UID: \"e1513da7-52be-4f09-8b6d-09e494522e1e\") " pod="openstack/keystone-db-sync-dp8qv" Dec 05 13:05:50.510109 master-0 kubenswrapper[29936]: I1205 13:05:50.509863 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4qtr\" (UniqueName: \"kubernetes.io/projected/e1513da7-52be-4f09-8b6d-09e494522e1e-kube-api-access-n4qtr\") pod \"keystone-db-sync-dp8qv\" (UID: \"e1513da7-52be-4f09-8b6d-09e494522e1e\") " pod="openstack/keystone-db-sync-dp8qv" Dec 05 13:05:50.510109 master-0 kubenswrapper[29936]: I1205 13:05:50.509927 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e1513da7-52be-4f09-8b6d-09e494522e1e-config-data\") pod \"keystone-db-sync-dp8qv\" (UID: \"e1513da7-52be-4f09-8b6d-09e494522e1e\") " pod="openstack/keystone-db-sync-dp8qv" Dec 05 13:05:50.549573 master-0 kubenswrapper[29936]: I1205 13:05:50.548960 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njhlh\" (UniqueName: \"kubernetes.io/projected/80a34c12-cc34-47c1-af11-6c935c757db4-kube-api-access-njhlh\") pod \"neutron-c9b1-account-create-update-6mbv8\" (UID: \"80a34c12-cc34-47c1-af11-6c935c757db4\") " pod="openstack/neutron-c9b1-account-create-update-6mbv8" Dec 05 13:05:50.551419 master-0 kubenswrapper[29936]: I1205 13:05:50.550546 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82v7v\" (UniqueName: \"kubernetes.io/projected/148db2bc-447d-44d5-beb6-b94bca8b9e22-kube-api-access-82v7v\") pod \"ovn-controller-f8fxj-config-rvmj7\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.614507 master-0 kubenswrapper[29936]: I1205 13:05:50.614416 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1513da7-52be-4f09-8b6d-09e494522e1e-combined-ca-bundle\") pod \"keystone-db-sync-dp8qv\" (UID: \"e1513da7-52be-4f09-8b6d-09e494522e1e\") " pod="openstack/keystone-db-sync-dp8qv" Dec 05 13:05:50.614842 master-0 kubenswrapper[29936]: I1205 13:05:50.614569 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4qtr\" (UniqueName: \"kubernetes.io/projected/e1513da7-52be-4f09-8b6d-09e494522e1e-kube-api-access-n4qtr\") pod \"keystone-db-sync-dp8qv\" (UID: \"e1513da7-52be-4f09-8b6d-09e494522e1e\") " pod="openstack/keystone-db-sync-dp8qv" Dec 05 13:05:50.614842 master-0 kubenswrapper[29936]: I1205 13:05:50.614654 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1513da7-52be-4f09-8b6d-09e494522e1e-config-data\") pod \"keystone-db-sync-dp8qv\" (UID: \"e1513da7-52be-4f09-8b6d-09e494522e1e\") " pod="openstack/keystone-db-sync-dp8qv" Dec 05 13:05:50.625789 master-0 kubenswrapper[29936]: I1205 13:05:50.625732 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1513da7-52be-4f09-8b6d-09e494522e1e-config-data\") pod \"keystone-db-sync-dp8qv\" (UID: \"e1513da7-52be-4f09-8b6d-09e494522e1e\") " pod="openstack/keystone-db-sync-dp8qv" Dec 05 13:05:50.642672 master-0 kubenswrapper[29936]: I1205 13:05:50.642601 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4qtr\" (UniqueName: \"kubernetes.io/projected/e1513da7-52be-4f09-8b6d-09e494522e1e-kube-api-access-n4qtr\") pod \"keystone-db-sync-dp8qv\" (UID: \"e1513da7-52be-4f09-8b6d-09e494522e1e\") " pod="openstack/keystone-db-sync-dp8qv" Dec 05 13:05:50.649980 master-0 kubenswrapper[29936]: I1205 13:05:50.649922 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1513da7-52be-4f09-8b6d-09e494522e1e-combined-ca-bundle\") pod \"keystone-db-sync-dp8qv\" (UID: \"e1513da7-52be-4f09-8b6d-09e494522e1e\") " pod="openstack/keystone-db-sync-dp8qv" Dec 05 13:05:50.652944 master-0 kubenswrapper[29936]: I1205 13:05:50.652883 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dp8qv" Dec 05 13:05:50.794007 master-0 kubenswrapper[29936]: I1205 13:05:50.793077 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-92nh8"] Dec 05 13:05:50.795557 master-0 kubenswrapper[29936]: I1205 13:05:50.795158 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c9b1-account-create-update-6mbv8" Dec 05 13:05:50.830365 master-0 kubenswrapper[29936]: I1205 13:05:50.830287 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:05:50.954416 master-0 kubenswrapper[29936]: W1205 13:05:50.954322 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02a61d97_31ab_485e_9f4c_f18097ce33c7.slice/crio-8e6c682f55e8eeb068d50dc538ae4dcb3dd712478460cbdcd2043906dbc44c35 WatchSource:0}: Error finding container 8e6c682f55e8eeb068d50dc538ae4dcb3dd712478460cbdcd2043906dbc44c35: Status 404 returned error can't find the container with id 8e6c682f55e8eeb068d50dc538ae4dcb3dd712478460cbdcd2043906dbc44c35 Dec 05 13:05:50.979613 master-0 kubenswrapper[29936]: I1205 13:05:50.979540 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-73e4-account-create-update-z9x2z"] Dec 05 13:05:50.995083 master-0 kubenswrapper[29936]: I1205 13:05:50.994859 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4x574"] Dec 05 13:05:51.216372 master-0 kubenswrapper[29936]: I1205 13:05:51.216303 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d8a18e3-137c-460b-947a-912dd42df73b" path="/var/lib/kubelet/pods/2d8a18e3-137c-460b-947a-912dd42df73b/volumes" Dec 05 13:05:51.368684 master-0 kubenswrapper[29936]: I1205 13:05:51.368475 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-92nh8" event={"ID":"02a61d97-31ab-485e-9f4c-f18097ce33c7","Type":"ContainerStarted","Data":"8e6c682f55e8eeb068d50dc538ae4dcb3dd712478460cbdcd2043906dbc44c35"} Dec 05 13:05:51.371492 master-0 kubenswrapper[29936]: I1205 13:05:51.370834 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4x574" event={"ID":"4c67a1ed-b95a-414f-8f7d-972c98a55a88","Type":"ContainerStarted","Data":"8638bf0f6751c1c405a4405de7542c20925799d1883a04adcb9ba7cfd31a273c"} Dec 05 13:05:51.377631 master-0 kubenswrapper[29936]: I1205 13:05:51.374843 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-73e4-account-create-update-z9x2z" event={"ID":"376c0716-8fd1-432f-9e4b-a3b21373a7cc","Type":"ContainerStarted","Data":"8617dccf39de1b6bbe270143a2ffff76eb07669b9b5035ccec43a734e4fe7c63"} Dec 05 13:05:51.665211 master-0 kubenswrapper[29936]: I1205 13:05:51.662309 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dp8qv"] Dec 05 13:05:51.679210 master-0 kubenswrapper[29936]: I1205 13:05:51.676580 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c9b1-account-create-update-6mbv8"] Dec 05 13:05:51.857599 master-0 kubenswrapper[29936]: I1205 13:05:51.850740 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-f8fxj-config-rvmj7"] Dec 05 13:06:00.240668 master-0 kubenswrapper[29936]: W1205 13:06:00.240422 29936 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod148db2bc_447d_44d5_beb6_b94bca8b9e22.slice/crio-cb2773509a355567546f49901a4cf9d5f75d70f5818113f01de96fa346d85bac WatchSource:0}: Error finding container cb2773509a355567546f49901a4cf9d5f75d70f5818113f01de96fa346d85bac: Status 404 returned error can't find the container with id cb2773509a355567546f49901a4cf9d5f75d70f5818113f01de96fa346d85bac Dec 05 13:06:00.244514 master-0 kubenswrapper[29936]: W1205 13:06:00.244441 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80a34c12_cc34_47c1_af11_6c935c757db4.slice/crio-48339a39bbc95476dfeac1f93726be8e7671fea1eb221524ef1142f76174ebfd WatchSource:0}: Error finding container 48339a39bbc95476dfeac1f93726be8e7671fea1eb221524ef1142f76174ebfd: Status 404 returned error can't find the container with id 48339a39bbc95476dfeac1f93726be8e7671fea1eb221524ef1142f76174ebfd Dec 05 13:06:00.498567 master-0 kubenswrapper[29936]: I1205 13:06:00.498489 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dp8qv" event={"ID":"e1513da7-52be-4f09-8b6d-09e494522e1e","Type":"ContainerStarted","Data":"61a7aeeb0ed5d7e876b4064d500e3a59193ab01b5f77cd7d215c6b9eec3c4910"} Dec 05 13:06:00.500798 master-0 kubenswrapper[29936]: I1205 13:06:00.500229 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c9b1-account-create-update-6mbv8" event={"ID":"80a34c12-cc34-47c1-af11-6c935c757db4","Type":"ContainerStarted","Data":"48339a39bbc95476dfeac1f93726be8e7671fea1eb221524ef1142f76174ebfd"} Dec 05 13:06:00.502362 master-0 kubenswrapper[29936]: I1205 13:06:00.502319 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f8fxj-config-rvmj7" event={"ID":"148db2bc-447d-44d5-beb6-b94bca8b9e22","Type":"ContainerStarted","Data":"cb2773509a355567546f49901a4cf9d5f75d70f5818113f01de96fa346d85bac"} Dec 05 13:06:00.605702 master-0 kubenswrapper[29936]: E1205 13:06:00.605615 29936 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c67a1ed_b95a_414f_8f7d_972c98a55a88.slice/crio-c35f4ecfc3027159b7f91cbca16cb55907b08161f9b5b6aa2204b60d73b48bd6.scope\": RecentStats: unable to find data in memory cache]" Dec 05 13:06:01.518996 master-0 kubenswrapper[29936]: I1205 13:06:01.518908 29936 generic.go:334] "Generic (PLEG): container finished" podID="4c67a1ed-b95a-414f-8f7d-972c98a55a88" containerID="c35f4ecfc3027159b7f91cbca16cb55907b08161f9b5b6aa2204b60d73b48bd6" exitCode=0 Dec 05 13:06:01.518996 master-0 kubenswrapper[29936]: I1205 13:06:01.518997 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4x574" event={"ID":"4c67a1ed-b95a-414f-8f7d-972c98a55a88","Type":"ContainerDied","Data":"c35f4ecfc3027159b7f91cbca16cb55907b08161f9b5b6aa2204b60d73b48bd6"} Dec 05 13:06:01.525445 master-0 kubenswrapper[29936]: I1205 13:06:01.525324 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8668c9c4-68f4-4224-9395-0f2ac85b9f1d","Type":"ContainerStarted","Data":"5057f29781827bff5ee447f7ad25099fa2fbb401eca9593a694960df81fcdb03"} Dec 05 13:06:01.525445 master-0 kubenswrapper[29936]: I1205 13:06:01.525401 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8668c9c4-68f4-4224-9395-0f2ac85b9f1d","Type":"ContainerStarted","Data":"719961efcabe66b2d4fe0d164bd36f563832cb5714c2ef7ed0be34870d96581b"} Dec 05 13:06:01.528518 master-0 kubenswrapper[29936]: I1205 13:06:01.528418 29936 generic.go:334] "Generic (PLEG): container finished" podID="148db2bc-447d-44d5-beb6-b94bca8b9e22" containerID="403f92aa4818c98bd7c94d4469d3117dd62b47997b7a5ff9b7b1859eeb42bc86" exitCode=0 Dec 05 13:06:01.528656 master-0 kubenswrapper[29936]: I1205 13:06:01.528545 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f8fxj-config-rvmj7" event={"ID":"148db2bc-447d-44d5-beb6-b94bca8b9e22","Type":"ContainerDied","Data":"403f92aa4818c98bd7c94d4469d3117dd62b47997b7a5ff9b7b1859eeb42bc86"} Dec 05 13:06:01.532072 master-0 kubenswrapper[29936]: I1205 13:06:01.531990 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-448s4" event={"ID":"5a7bb352-8943-448f-ad3f-a06ebd4b8b30","Type":"ContainerStarted","Data":"d9dffbcd65d9fd967677c7e200645f9179512261eecadecfb2c00821dffdcb61"} Dec 05 13:06:01.576726 master-0 kubenswrapper[29936]: I1205 13:06:01.541432 29936 generic.go:334] "Generic (PLEG): container finished" podID="376c0716-8fd1-432f-9e4b-a3b21373a7cc" containerID="7bcfd006685c204c32486a1fb0c7e6bfcbc4d18da7525eb75adb1ea9379958fd" exitCode=0 Dec 05 13:06:01.576726 master-0 kubenswrapper[29936]: I1205 13:06:01.541569 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-73e4-account-create-update-z9x2z" event={"ID":"376c0716-8fd1-432f-9e4b-a3b21373a7cc","Type":"ContainerDied","Data":"7bcfd006685c204c32486a1fb0c7e6bfcbc4d18da7525eb75adb1ea9379958fd"} Dec 05 13:06:01.576726 master-0 kubenswrapper[29936]: I1205 13:06:01.544681 29936 generic.go:334] "Generic (PLEG): container finished" podID="80a34c12-cc34-47c1-af11-6c935c757db4" containerID="05d63a82c97a5d89a61d6472e6852f05c37c42d4e10026ed2d27d0e960357d7d" exitCode=0 Dec 05 13:06:01.576726 master-0 kubenswrapper[29936]: I1205 13:06:01.544780 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c9b1-account-create-update-6mbv8" event={"ID":"80a34c12-cc34-47c1-af11-6c935c757db4","Type":"ContainerDied","Data":"05d63a82c97a5d89a61d6472e6852f05c37c42d4e10026ed2d27d0e960357d7d"} Dec 05 13:06:01.576726 master-0 kubenswrapper[29936]: I1205 13:06:01.547000 29936 generic.go:334] "Generic (PLEG): container finished" podID="02a61d97-31ab-485e-9f4c-f18097ce33c7" containerID="82a7d9539ee911401290ec1f18bd564c6fe1dba97b22f92a893c7bd203032802" exitCode=0 Dec 05 13:06:01.576726 master-0 kubenswrapper[29936]: I1205 13:06:01.547028 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-92nh8" event={"ID":"02a61d97-31ab-485e-9f4c-f18097ce33c7","Type":"ContainerDied","Data":"82a7d9539ee911401290ec1f18bd564c6fe1dba97b22f92a893c7bd203032802"} Dec 05 13:06:02.311590 master-0 kubenswrapper[29936]: I1205 13:06:02.311470 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-448s4" podStartSLOduration=3.916979306 podStartE2EDuration="20.311438591s" podCreationTimestamp="2025-12-05 13:05:42 +0000 UTC" firstStartedPulling="2025-12-05 13:05:44.015800793 +0000 UTC m=+941.147880474" lastFinishedPulling="2025-12-05 13:06:00.410260078 +0000 UTC m=+957.542339759" observedRunningTime="2025-12-05 13:06:02.287406981 +0000 UTC m=+959.419486662" watchObservedRunningTime="2025-12-05 13:06:02.311438591 +0000 UTC m=+959.443518272" Dec 05 13:06:02.586008 master-0 
kubenswrapper[29936]: I1205 13:06:02.585808 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8668c9c4-68f4-4224-9395-0f2ac85b9f1d","Type":"ContainerStarted","Data":"b72a16b4502fc43cb4c8bafcaffc93b6306cdd3e0595825cbab18ea453cbe0ab"} Dec 05 13:06:05.303102 master-0 kubenswrapper[29936]: I1205 13:06:05.303023 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-92nh8" Dec 05 13:06:05.379634 master-0 kubenswrapper[29936]: I1205 13:06:05.379553 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n8g5\" (UniqueName: \"kubernetes.io/projected/02a61d97-31ab-485e-9f4c-f18097ce33c7-kube-api-access-8n8g5\") pod \"02a61d97-31ab-485e-9f4c-f18097ce33c7\" (UID: \"02a61d97-31ab-485e-9f4c-f18097ce33c7\") " Dec 05 13:06:05.379940 master-0 kubenswrapper[29936]: I1205 13:06:05.379674 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02a61d97-31ab-485e-9f4c-f18097ce33c7-operator-scripts\") pod \"02a61d97-31ab-485e-9f4c-f18097ce33c7\" (UID: \"02a61d97-31ab-485e-9f4c-f18097ce33c7\") " Dec 05 13:06:05.382609 master-0 kubenswrapper[29936]: I1205 13:06:05.380921 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02a61d97-31ab-485e-9f4c-f18097ce33c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "02a61d97-31ab-485e-9f4c-f18097ce33c7" (UID: "02a61d97-31ab-485e-9f4c-f18097ce33c7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:05.391053 master-0 kubenswrapper[29936]: I1205 13:06:05.388695 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a61d97-31ab-485e-9f4c-f18097ce33c7-kube-api-access-8n8g5" (OuterVolumeSpecName: "kube-api-access-8n8g5") pod "02a61d97-31ab-485e-9f4c-f18097ce33c7" (UID: "02a61d97-31ab-485e-9f4c-f18097ce33c7"). InnerVolumeSpecName "kube-api-access-8n8g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:05.439989 master-0 kubenswrapper[29936]: I1205 13:06:05.439893 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4x574" Dec 05 13:06:05.509608 master-0 kubenswrapper[29936]: I1205 13:06:05.484510 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zm69\" (UniqueName: \"kubernetes.io/projected/4c67a1ed-b95a-414f-8f7d-972c98a55a88-kube-api-access-6zm69\") pod \"4c67a1ed-b95a-414f-8f7d-972c98a55a88\" (UID: \"4c67a1ed-b95a-414f-8f7d-972c98a55a88\") " Dec 05 13:06:05.509608 master-0 kubenswrapper[29936]: I1205 13:06:05.484678 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c67a1ed-b95a-414f-8f7d-972c98a55a88-operator-scripts\") pod \"4c67a1ed-b95a-414f-8f7d-972c98a55a88\" (UID: \"4c67a1ed-b95a-414f-8f7d-972c98a55a88\") " Dec 05 13:06:05.509608 master-0 kubenswrapper[29936]: I1205 13:06:05.485873 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c9b1-account-create-update-6mbv8" Dec 05 13:06:05.509608 master-0 kubenswrapper[29936]: I1205 13:06:05.486435 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n8g5\" (UniqueName: \"kubernetes.io/projected/02a61d97-31ab-485e-9f4c-f18097ce33c7-kube-api-access-8n8g5\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:05.509608 master-0 kubenswrapper[29936]: I1205 13:06:05.486451 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02a61d97-31ab-485e-9f4c-f18097ce33c7-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:05.509608 master-0 kubenswrapper[29936]: I1205 13:06:05.486912 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c67a1ed-b95a-414f-8f7d-972c98a55a88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c67a1ed-b95a-414f-8f7d-972c98a55a88" (UID: "4c67a1ed-b95a-414f-8f7d-972c98a55a88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:05.509608 master-0 kubenswrapper[29936]: I1205 13:06:05.494546 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c67a1ed-b95a-414f-8f7d-972c98a55a88-kube-api-access-6zm69" (OuterVolumeSpecName: "kube-api-access-6zm69") pod "4c67a1ed-b95a-414f-8f7d-972c98a55a88" (UID: "4c67a1ed-b95a-414f-8f7d-972c98a55a88"). InnerVolumeSpecName "kube-api-access-6zm69". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:05.509608 master-0 kubenswrapper[29936]: I1205 13:06:05.496449 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-73e4-account-create-update-z9x2z" Dec 05 13:06:05.520587 master-0 kubenswrapper[29936]: I1205 13:06:05.520512 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:06:05.596144 master-0 kubenswrapper[29936]: I1205 13:06:05.595924 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80a34c12-cc34-47c1-af11-6c935c757db4-operator-scripts\") pod \"80a34c12-cc34-47c1-af11-6c935c757db4\" (UID: \"80a34c12-cc34-47c1-af11-6c935c757db4\") " Dec 05 13:06:05.596144 master-0 kubenswrapper[29936]: I1205 13:06:05.596085 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-log-ovn\") pod \"148db2bc-447d-44d5-beb6-b94bca8b9e22\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " Dec 05 13:06:05.596144 master-0 kubenswrapper[29936]: I1205 13:06:05.596123 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/376c0716-8fd1-432f-9e4b-a3b21373a7cc-operator-scripts\") pod \"376c0716-8fd1-432f-9e4b-a3b21373a7cc\" (UID: \"376c0716-8fd1-432f-9e4b-a3b21373a7cc\") " Dec 05 13:06:05.596552 master-0 kubenswrapper[29936]: I1205 13:06:05.596333 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx7dm\" (UniqueName: \"kubernetes.io/projected/376c0716-8fd1-432f-9e4b-a3b21373a7cc-kube-api-access-kx7dm\") pod \"376c0716-8fd1-432f-9e4b-a3b21373a7cc\" (UID: \"376c0716-8fd1-432f-9e4b-a3b21373a7cc\") " Dec 05 13:06:05.596552 master-0 kubenswrapper[29936]: I1205 13:06:05.596452 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/148db2bc-447d-44d5-beb6-b94bca8b9e22-scripts\") pod \"148db2bc-447d-44d5-beb6-b94bca8b9e22\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " Dec 05 13:06:05.596552 master-0 kubenswrapper[29936]: I1205 13:06:05.596523 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njhlh\" (UniqueName: \"kubernetes.io/projected/80a34c12-cc34-47c1-af11-6c935c757db4-kube-api-access-njhlh\") pod \"80a34c12-cc34-47c1-af11-6c935c757db4\" (UID: \"80a34c12-cc34-47c1-af11-6c935c757db4\") " Dec 05 13:06:05.596651 master-0 kubenswrapper[29936]: I1205 13:06:05.596579 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-run\") pod \"148db2bc-447d-44d5-beb6-b94bca8b9e22\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " Dec 05 13:06:05.596689 master-0 kubenswrapper[29936]: I1205 13:06:05.596638 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80a34c12-cc34-47c1-af11-6c935c757db4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80a34c12-cc34-47c1-af11-6c935c757db4" (UID: "80a34c12-cc34-47c1-af11-6c935c757db4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:05.596839 master-0 kubenswrapper[29936]: I1205 13:06:05.596803 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/148db2bc-447d-44d5-beb6-b94bca8b9e22-additional-scripts\") pod \"148db2bc-447d-44d5-beb6-b94bca8b9e22\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " Dec 05 13:06:05.596884 master-0 kubenswrapper[29936]: I1205 13:06:05.596840 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82v7v\" (UniqueName: \"kubernetes.io/projected/148db2bc-447d-44d5-beb6-b94bca8b9e22-kube-api-access-82v7v\") pod \"148db2bc-447d-44d5-beb6-b94bca8b9e22\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " Dec 05 13:06:05.596884 master-0 kubenswrapper[29936]: I1205 13:06:05.596874 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-run-ovn\") pod \"148db2bc-447d-44d5-beb6-b94bca8b9e22\" (UID: \"148db2bc-447d-44d5-beb6-b94bca8b9e22\") " Dec 05 13:06:05.597541 master-0 kubenswrapper[29936]: I1205 13:06:05.597477 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-run" (OuterVolumeSpecName: "var-run") pod "148db2bc-447d-44d5-beb6-b94bca8b9e22" (UID: "148db2bc-447d-44d5-beb6-b94bca8b9e22"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:06:05.597541 master-0 kubenswrapper[29936]: I1205 13:06:05.597528 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80a34c12-cc34-47c1-af11-6c935c757db4-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:05.597650 master-0 kubenswrapper[29936]: I1205 13:06:05.597538 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "148db2bc-447d-44d5-beb6-b94bca8b9e22" (UID: "148db2bc-447d-44d5-beb6-b94bca8b9e22"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:06:05.597650 master-0 kubenswrapper[29936]: I1205 13:06:05.597555 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zm69\" (UniqueName: \"kubernetes.io/projected/4c67a1ed-b95a-414f-8f7d-972c98a55a88-kube-api-access-6zm69\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:05.597650 master-0 kubenswrapper[29936]: I1205 13:06:05.597626 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "148db2bc-447d-44d5-beb6-b94bca8b9e22" (UID: "148db2bc-447d-44d5-beb6-b94bca8b9e22"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:06:05.597650 master-0 kubenswrapper[29936]: I1205 13:06:05.597645 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c67a1ed-b95a-414f-8f7d-972c98a55a88-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:05.598402 master-0 kubenswrapper[29936]: I1205 13:06:05.598367 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/148db2bc-447d-44d5-beb6-b94bca8b9e22-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "148db2bc-447d-44d5-beb6-b94bca8b9e22" (UID: "148db2bc-447d-44d5-beb6-b94bca8b9e22"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:05.598847 master-0 kubenswrapper[29936]: I1205 13:06:05.598801 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/376c0716-8fd1-432f-9e4b-a3b21373a7cc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "376c0716-8fd1-432f-9e4b-a3b21373a7cc" (UID: "376c0716-8fd1-432f-9e4b-a3b21373a7cc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:05.599148 master-0 kubenswrapper[29936]: I1205 13:06:05.599112 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/148db2bc-447d-44d5-beb6-b94bca8b9e22-scripts" (OuterVolumeSpecName: "scripts") pod "148db2bc-447d-44d5-beb6-b94bca8b9e22" (UID: "148db2bc-447d-44d5-beb6-b94bca8b9e22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:05.601329 master-0 kubenswrapper[29936]: I1205 13:06:05.601295 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/376c0716-8fd1-432f-9e4b-a3b21373a7cc-kube-api-access-kx7dm" (OuterVolumeSpecName: "kube-api-access-kx7dm") pod "376c0716-8fd1-432f-9e4b-a3b21373a7cc" (UID: "376c0716-8fd1-432f-9e4b-a3b21373a7cc"). InnerVolumeSpecName "kube-api-access-kx7dm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:05.603330 master-0 kubenswrapper[29936]: I1205 13:06:05.603271 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80a34c12-cc34-47c1-af11-6c935c757db4-kube-api-access-njhlh" (OuterVolumeSpecName: "kube-api-access-njhlh") pod "80a34c12-cc34-47c1-af11-6c935c757db4" (UID: "80a34c12-cc34-47c1-af11-6c935c757db4"). InnerVolumeSpecName "kube-api-access-njhlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:05.603453 master-0 kubenswrapper[29936]: I1205 13:06:05.603399 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/148db2bc-447d-44d5-beb6-b94bca8b9e22-kube-api-access-82v7v" (OuterVolumeSpecName: "kube-api-access-82v7v") pod "148db2bc-447d-44d5-beb6-b94bca8b9e22" (UID: "148db2bc-447d-44d5-beb6-b94bca8b9e22"). InnerVolumeSpecName "kube-api-access-82v7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:05.625905 master-0 kubenswrapper[29936]: I1205 13:06:05.625843 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c9b1-account-create-update-6mbv8" Dec 05 13:06:05.626310 master-0 kubenswrapper[29936]: I1205 13:06:05.626236 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c9b1-account-create-update-6mbv8" event={"ID":"80a34c12-cc34-47c1-af11-6c935c757db4","Type":"ContainerDied","Data":"48339a39bbc95476dfeac1f93726be8e7671fea1eb221524ef1142f76174ebfd"} Dec 05 13:06:05.626406 master-0 kubenswrapper[29936]: I1205 13:06:05.626319 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48339a39bbc95476dfeac1f93726be8e7671fea1eb221524ef1142f76174ebfd" Dec 05 13:06:05.632225 master-0 kubenswrapper[29936]: I1205 13:06:05.632151 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-92nh8" Dec 05 13:06:05.632618 master-0 kubenswrapper[29936]: I1205 13:06:05.632526 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-92nh8" event={"ID":"02a61d97-31ab-485e-9f4c-f18097ce33c7","Type":"ContainerDied","Data":"8e6c682f55e8eeb068d50dc538ae4dcb3dd712478460cbdcd2043906dbc44c35"} Dec 05 13:06:05.632618 master-0 kubenswrapper[29936]: I1205 13:06:05.632595 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e6c682f55e8eeb068d50dc538ae4dcb3dd712478460cbdcd2043906dbc44c35" Dec 05 13:06:05.636348 master-0 kubenswrapper[29936]: I1205 13:06:05.636154 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4x574" event={"ID":"4c67a1ed-b95a-414f-8f7d-972c98a55a88","Type":"ContainerDied","Data":"8638bf0f6751c1c405a4405de7542c20925799d1883a04adcb9ba7cfd31a273c"} Dec 05 13:06:05.636348 master-0 kubenswrapper[29936]: I1205 13:06:05.636207 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8638bf0f6751c1c405a4405de7542c20925799d1883a04adcb9ba7cfd31a273c" Dec 05 13:06:05.636348 master-0 kubenswrapper[29936]: I1205 13:06:05.636277 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4x574" Dec 05 13:06:05.643303 master-0 kubenswrapper[29936]: I1205 13:06:05.642198 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8668c9c4-68f4-4224-9395-0f2ac85b9f1d","Type":"ContainerStarted","Data":"a182ffa1b892d3870bae04099626b0e1f0ffc99d83f8aef3c5781939033058fc"} Dec 05 13:06:05.651825 master-0 kubenswrapper[29936]: I1205 13:06:05.649029 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-f8fxj-config-rvmj7" event={"ID":"148db2bc-447d-44d5-beb6-b94bca8b9e22","Type":"ContainerDied","Data":"cb2773509a355567546f49901a4cf9d5f75d70f5818113f01de96fa346d85bac"} Dec 05 13:06:05.651825 master-0 kubenswrapper[29936]: I1205 13:06:05.649087 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb2773509a355567546f49901a4cf9d5f75d70f5818113f01de96fa346d85bac" Dec 05 13:06:05.651825 master-0 kubenswrapper[29936]: I1205 13:06:05.649103 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-f8fxj-config-rvmj7" Dec 05 13:06:05.654169 master-0 kubenswrapper[29936]: I1205 13:06:05.653029 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dp8qv" event={"ID":"e1513da7-52be-4f09-8b6d-09e494522e1e","Type":"ContainerStarted","Data":"3ca7a76e0f17797de05b21839a6aa098bc369697e96ab92c0e53276e210ee4d6"} Dec 05 13:06:05.666130 master-0 kubenswrapper[29936]: I1205 13:06:05.665563 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-73e4-account-create-update-z9x2z" event={"ID":"376c0716-8fd1-432f-9e4b-a3b21373a7cc","Type":"ContainerDied","Data":"8617dccf39de1b6bbe270143a2ffff76eb07669b9b5035ccec43a734e4fe7c63"} Dec 05 13:06:05.666130 master-0 kubenswrapper[29936]: I1205 13:06:05.665628 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8617dccf39de1b6bbe270143a2ffff76eb07669b9b5035ccec43a734e4fe7c63" Dec 05 13:06:05.666130 master-0 kubenswrapper[29936]: I1205 13:06:05.665689 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-73e4-account-create-update-z9x2z" Dec 05 13:06:05.687142 master-0 kubenswrapper[29936]: I1205 13:06:05.687033 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-dp8qv" podStartSLOduration=10.772422907 podStartE2EDuration="15.687002906s" podCreationTimestamp="2025-12-05 13:05:50 +0000 UTC" firstStartedPulling="2025-12-05 13:06:00.231814789 +0000 UTC m=+957.363894470" lastFinishedPulling="2025-12-05 13:06:05.146394788 +0000 UTC m=+962.278474469" observedRunningTime="2025-12-05 13:06:05.677408398 +0000 UTC m=+962.809488089" watchObservedRunningTime="2025-12-05 13:06:05.687002906 +0000 UTC m=+962.819082587" Dec 05 13:06:05.700457 master-0 kubenswrapper[29936]: I1205 13:06:05.700266 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx7dm\" (UniqueName: \"kubernetes.io/projected/376c0716-8fd1-432f-9e4b-a3b21373a7cc-kube-api-access-kx7dm\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:05.700457 master-0 kubenswrapper[29936]: I1205 13:06:05.700371 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/148db2bc-447d-44d5-beb6-b94bca8b9e22-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:05.700457 master-0 kubenswrapper[29936]: I1205 13:06:05.700387 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njhlh\" (UniqueName: \"kubernetes.io/projected/80a34c12-cc34-47c1-af11-6c935c757db4-kube-api-access-njhlh\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:05.700457 master-0 kubenswrapper[29936]: I1205 13:06:05.700396 29936 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-run\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:05.701780 master-0 kubenswrapper[29936]: I1205 13:06:05.700929 29936 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/148db2bc-447d-44d5-beb6-b94bca8b9e22-additional-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:05.701780 master-0 kubenswrapper[29936]: I1205 13:06:05.700949 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82v7v\" (UniqueName: \"kubernetes.io/projected/148db2bc-447d-44d5-beb6-b94bca8b9e22-kube-api-access-82v7v\") on node \"master-0\" DevicePath 
\"\"" Dec 05 13:06:05.701780 master-0 kubenswrapper[29936]: I1205 13:06:05.700959 29936 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-run-ovn\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:05.701780 master-0 kubenswrapper[29936]: I1205 13:06:05.700968 29936 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/148db2bc-447d-44d5-beb6-b94bca8b9e22-var-log-ovn\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:05.701780 master-0 kubenswrapper[29936]: I1205 13:06:05.700978 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/376c0716-8fd1-432f-9e4b-a3b21373a7cc-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:06.670538 master-0 kubenswrapper[29936]: I1205 13:06:06.669736 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-f8fxj-config-rvmj7"] Dec 05 13:06:06.686401 master-0 kubenswrapper[29936]: I1205 13:06:06.680520 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-f8fxj-config-rvmj7"] Dec 05 13:06:06.707728 master-0 kubenswrapper[29936]: I1205 13:06:06.707617 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8668c9c4-68f4-4224-9395-0f2ac85b9f1d","Type":"ContainerStarted","Data":"0bbfac77d08689ceacc80c5801a86c3447b2e42160033352b175aad37c519be0"} Dec 05 13:06:06.707728 master-0 kubenswrapper[29936]: I1205 13:06:06.707690 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8668c9c4-68f4-4224-9395-0f2ac85b9f1d","Type":"ContainerStarted","Data":"4e8243ccb4068982416d32db976a293c17d51f4fa442edad78b0b8fa02cd7191"} Dec 05 13:06:06.707728 master-0 kubenswrapper[29936]: I1205 13:06:06.707702 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8668c9c4-68f4-4224-9395-0f2ac85b9f1d","Type":"ContainerStarted","Data":"4e4ad45d81ea435c0b6f640f9ab6a76ecbc35b6da48158c47568da9eab1c028b"} Dec 05 13:06:06.757532 master-0 kubenswrapper[29936]: I1205 13:06:06.757291 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=51.375775583 podStartE2EDuration="58.75726542s" podCreationTimestamp="2025-12-05 13:05:08 +0000 UTC" firstStartedPulling="2025-12-05 13:05:43.709260384 +0000 UTC m=+940.841340065" lastFinishedPulling="2025-12-05 13:05:51.090750221 +0000 UTC m=+948.222829902" observedRunningTime="2025-12-05 13:06:06.743159736 +0000 UTC m=+963.875239417" watchObservedRunningTime="2025-12-05 13:06:06.75726542 +0000 UTC m=+963.889345101" Dec 05 13:06:07.078804 master-0 kubenswrapper[29936]: I1205 13:06:07.078700 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b54cfd657-cd9tt"] Dec 05 13:06:07.080252 master-0 kubenswrapper[29936]: E1205 13:06:07.080215 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a61d97-31ab-485e-9f4c-f18097ce33c7" containerName="mariadb-database-create" Dec 05 13:06:07.080252 master-0 kubenswrapper[29936]: I1205 13:06:07.080250 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a61d97-31ab-485e-9f4c-f18097ce33c7" containerName="mariadb-database-create" Dec 05 13:06:07.080402 master-0 kubenswrapper[29936]: E1205 13:06:07.080289 29936 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="376c0716-8fd1-432f-9e4b-a3b21373a7cc" containerName="mariadb-account-create-update" Dec 05 13:06:07.080402 master-0 kubenswrapper[29936]: I1205 13:06:07.080301 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="376c0716-8fd1-432f-9e4b-a3b21373a7cc" containerName="mariadb-account-create-update" Dec 05 13:06:07.080402 master-0 kubenswrapper[29936]: E1205 13:06:07.080338 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148db2bc-447d-44d5-beb6-b94bca8b9e22" containerName="ovn-config" Dec 05 13:06:07.080402 master-0 kubenswrapper[29936]: I1205 13:06:07.080347 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="148db2bc-447d-44d5-beb6-b94bca8b9e22" containerName="ovn-config" Dec 05 13:06:07.080402 master-0 kubenswrapper[29936]: E1205 13:06:07.080364 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80a34c12-cc34-47c1-af11-6c935c757db4" containerName="mariadb-account-create-update" Dec 05 13:06:07.080402 master-0 kubenswrapper[29936]: I1205 13:06:07.080373 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a34c12-cc34-47c1-af11-6c935c757db4" containerName="mariadb-account-create-update" Dec 05 13:06:07.080402 master-0 kubenswrapper[29936]: E1205 13:06:07.080390 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c67a1ed-b95a-414f-8f7d-972c98a55a88" containerName="mariadb-database-create" Dec 05 13:06:07.080402 master-0 kubenswrapper[29936]: I1205 13:06:07.080400 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c67a1ed-b95a-414f-8f7d-972c98a55a88" containerName="mariadb-database-create" Dec 05 13:06:07.080710 master-0 kubenswrapper[29936]: I1205 13:06:07.080682 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c67a1ed-b95a-414f-8f7d-972c98a55a88" containerName="mariadb-database-create" Dec 05 13:06:07.080752 master-0 kubenswrapper[29936]: I1205 13:06:07.080716 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="80a34c12-cc34-47c1-af11-6c935c757db4" containerName="mariadb-account-create-update" Dec 05 13:06:07.080752 master-0 kubenswrapper[29936]: I1205 13:06:07.080746 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a61d97-31ab-485e-9f4c-f18097ce33c7" containerName="mariadb-database-create" Dec 05 13:06:07.080823 master-0 kubenswrapper[29936]: I1205 13:06:07.080770 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="148db2bc-447d-44d5-beb6-b94bca8b9e22" containerName="ovn-config" Dec 05 13:06:07.080823 master-0 kubenswrapper[29936]: I1205 13:06:07.080793 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="376c0716-8fd1-432f-9e4b-a3b21373a7cc" containerName="mariadb-account-create-update" Dec 05 13:06:07.083054 master-0 kubenswrapper[29936]: I1205 13:06:07.082981 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.091876 master-0 kubenswrapper[29936]: I1205 13:06:07.091603 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 05 13:06:07.120008 master-0 kubenswrapper[29936]: I1205 13:06:07.119946 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b54cfd657-cd9tt"] Dec 05 13:06:07.144607 master-0 kubenswrapper[29936]: I1205 13:06:07.144379 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-ovsdbserver-sb\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.145520 master-0 kubenswrapper[29936]: I1205 13:06:07.144918 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-dns-swift-storage-0\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.145520 master-0 kubenswrapper[29936]: I1205 13:06:07.145257 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-config\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.145520 master-0 kubenswrapper[29936]: I1205 13:06:07.145411 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-ovsdbserver-nb\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.145520 master-0 kubenswrapper[29936]: I1205 13:06:07.145504 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-dns-svc\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.145685 master-0 kubenswrapper[29936]: I1205 13:06:07.145599 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6f4l\" (UniqueName: \"kubernetes.io/projected/988345f9-c29b-4d52-8caa-fcf2711f0eb0-kube-api-access-f6f4l\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.211756 master-0 kubenswrapper[29936]: I1205 13:06:07.211684 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="148db2bc-447d-44d5-beb6-b94bca8b9e22" path="/var/lib/kubelet/pods/148db2bc-447d-44d5-beb6-b94bca8b9e22/volumes" Dec 05 13:06:07.250227 master-0 kubenswrapper[29936]: I1205 13:06:07.247758 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-config\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: 
\"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.250227 master-0 kubenswrapper[29936]: I1205 13:06:07.247869 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-ovsdbserver-nb\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.250227 master-0 kubenswrapper[29936]: I1205 13:06:07.248584 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-dns-svc\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.250227 master-0 kubenswrapper[29936]: I1205 13:06:07.248928 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6f4l\" (UniqueName: \"kubernetes.io/projected/988345f9-c29b-4d52-8caa-fcf2711f0eb0-kube-api-access-f6f4l\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.250227 master-0 kubenswrapper[29936]: I1205 13:06:07.249303 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-ovsdbserver-sb\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.250227 master-0 kubenswrapper[29936]: I1205 13:06:07.249399 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-dns-swift-storage-0\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.251979 master-0 kubenswrapper[29936]: I1205 13:06:07.251932 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-ovsdbserver-sb\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.254069 master-0 kubenswrapper[29936]: I1205 13:06:07.254003 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-dns-svc\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.254160 master-0 kubenswrapper[29936]: I1205 13:06:07.251068 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-ovsdbserver-nb\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.254244 master-0 kubenswrapper[29936]: I1205 13:06:07.254205 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-dns-swift-storage-0\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.254893 master-0 kubenswrapper[29936]: I1205 13:06:07.254853 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-config\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.265830 master-0 kubenswrapper[29936]: I1205 13:06:07.265765 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6f4l\" (UniqueName: \"kubernetes.io/projected/988345f9-c29b-4d52-8caa-fcf2711f0eb0-kube-api-access-f6f4l\") pod \"dnsmasq-dns-6b54cfd657-cd9tt\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.421001 master-0 kubenswrapper[29936]: I1205 13:06:07.420896 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:07.933082 master-0 kubenswrapper[29936]: W1205 13:06:07.933018 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod988345f9_c29b_4d52_8caa_fcf2711f0eb0.slice/crio-1c713c0a692f8f88096daf79f4df37cb59bd850d9e1a3d4a6540dba87dcfb8a5 WatchSource:0}: Error finding container 1c713c0a692f8f88096daf79f4df37cb59bd850d9e1a3d4a6540dba87dcfb8a5: Status 404 returned error can't find the container with id 1c713c0a692f8f88096daf79f4df37cb59bd850d9e1a3d4a6540dba87dcfb8a5 Dec 05 13:06:07.938521 master-0 kubenswrapper[29936]: I1205 13:06:07.938425 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b54cfd657-cd9tt"] Dec 05 13:06:08.739684 master-0 kubenswrapper[29936]: I1205 13:06:08.739571 29936 generic.go:334] "Generic (PLEG): container finished" podID="988345f9-c29b-4d52-8caa-fcf2711f0eb0" containerID="2c1df04a7b39462450760b3c92edfc7ce6e11b2c4a82fab79e462f50beb85d48" exitCode=0 Dec 05 13:06:08.739684 master-0 kubenswrapper[29936]: I1205 13:06:08.739648 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" event={"ID":"988345f9-c29b-4d52-8caa-fcf2711f0eb0","Type":"ContainerDied","Data":"2c1df04a7b39462450760b3c92edfc7ce6e11b2c4a82fab79e462f50beb85d48"} Dec 05 13:06:08.740060 master-0 kubenswrapper[29936]: I1205 13:06:08.739733 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" event={"ID":"988345f9-c29b-4d52-8caa-fcf2711f0eb0","Type":"ContainerStarted","Data":"1c713c0a692f8f88096daf79f4df37cb59bd850d9e1a3d4a6540dba87dcfb8a5"} Dec 05 13:06:09.757327 master-0 kubenswrapper[29936]: I1205 13:06:09.757229 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" event={"ID":"988345f9-c29b-4d52-8caa-fcf2711f0eb0","Type":"ContainerStarted","Data":"bbe166a8d338e001f16a51e9ca9775f665aec07436b355d942d5493703911a99"} Dec 05 13:06:09.758130 master-0 kubenswrapper[29936]: I1205 13:06:09.757424 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:09.798473 master-0 kubenswrapper[29936]: I1205 13:06:09.798356 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" podStartSLOduration=2.7983270989999998 podStartE2EDuration="2.798327099s" podCreationTimestamp="2025-12-05 13:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:06:09.778652321 +0000 UTC m=+966.910732022" watchObservedRunningTime="2025-12-05 13:06:09.798327099 +0000 UTC m=+966.930406780" Dec 05 13:06:10.770471 master-0 kubenswrapper[29936]: I1205 13:06:10.770375 29936 generic.go:334] "Generic (PLEG): container finished" podID="e1513da7-52be-4f09-8b6d-09e494522e1e" containerID="3ca7a76e0f17797de05b21839a6aa098bc369697e96ab92c0e53276e210ee4d6" exitCode=0 Dec 05 13:06:10.770471 master-0 kubenswrapper[29936]: I1205 13:06:10.770445 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dp8qv" event={"ID":"e1513da7-52be-4f09-8b6d-09e494522e1e","Type":"ContainerDied","Data":"3ca7a76e0f17797de05b21839a6aa098bc369697e96ab92c0e53276e210ee4d6"} Dec 05 13:06:11.788792 master-0 kubenswrapper[29936]: I1205 13:06:11.788705 29936 generic.go:334] "Generic (PLEG): container finished" podID="5a7bb352-8943-448f-ad3f-a06ebd4b8b30" containerID="d9dffbcd65d9fd967677c7e200645f9179512261eecadecfb2c00821dffdcb61" exitCode=0 Dec 05 13:06:11.789510 master-0 kubenswrapper[29936]: I1205 13:06:11.789014 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-448s4" event={"ID":"5a7bb352-8943-448f-ad3f-a06ebd4b8b30","Type":"ContainerDied","Data":"d9dffbcd65d9fd967677c7e200645f9179512261eecadecfb2c00821dffdcb61"} Dec 05 13:06:12.211028 master-0 kubenswrapper[29936]: I1205 13:06:12.210967 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-dp8qv" Dec 05 13:06:12.300028 master-0 kubenswrapper[29936]: I1205 13:06:12.299947 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4qtr\" (UniqueName: \"kubernetes.io/projected/e1513da7-52be-4f09-8b6d-09e494522e1e-kube-api-access-n4qtr\") pod \"e1513da7-52be-4f09-8b6d-09e494522e1e\" (UID: \"e1513da7-52be-4f09-8b6d-09e494522e1e\") " Dec 05 13:06:12.300453 master-0 kubenswrapper[29936]: I1205 13:06:12.300358 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1513da7-52be-4f09-8b6d-09e494522e1e-combined-ca-bundle\") pod \"e1513da7-52be-4f09-8b6d-09e494522e1e\" (UID: \"e1513da7-52be-4f09-8b6d-09e494522e1e\") " Dec 05 13:06:12.300640 master-0 kubenswrapper[29936]: I1205 13:06:12.300616 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1513da7-52be-4f09-8b6d-09e494522e1e-config-data\") pod \"e1513da7-52be-4f09-8b6d-09e494522e1e\" (UID: \"e1513da7-52be-4f09-8b6d-09e494522e1e\") " Dec 05 13:06:12.303473 master-0 kubenswrapper[29936]: I1205 13:06:12.303419 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1513da7-52be-4f09-8b6d-09e494522e1e-kube-api-access-n4qtr" (OuterVolumeSpecName: "kube-api-access-n4qtr") pod "e1513da7-52be-4f09-8b6d-09e494522e1e" (UID: "e1513da7-52be-4f09-8b6d-09e494522e1e"). InnerVolumeSpecName "kube-api-access-n4qtr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:12.331670 master-0 kubenswrapper[29936]: I1205 13:06:12.331539 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1513da7-52be-4f09-8b6d-09e494522e1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1513da7-52be-4f09-8b6d-09e494522e1e" (UID: "e1513da7-52be-4f09-8b6d-09e494522e1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:12.356719 master-0 kubenswrapper[29936]: I1205 13:06:12.356544 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1513da7-52be-4f09-8b6d-09e494522e1e-config-data" (OuterVolumeSpecName: "config-data") pod "e1513da7-52be-4f09-8b6d-09e494522e1e" (UID: "e1513da7-52be-4f09-8b6d-09e494522e1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:12.404795 master-0 kubenswrapper[29936]: I1205 13:06:12.404684 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1513da7-52be-4f09-8b6d-09e494522e1e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:12.404795 master-0 kubenswrapper[29936]: I1205 13:06:12.404771 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1513da7-52be-4f09-8b6d-09e494522e1e-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:12.404795 master-0 kubenswrapper[29936]: I1205 13:06:12.404783 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4qtr\" (UniqueName: \"kubernetes.io/projected/e1513da7-52be-4f09-8b6d-09e494522e1e-kube-api-access-n4qtr\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:12.802125 master-0 kubenswrapper[29936]: I1205 13:06:12.802031 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dp8qv" event={"ID":"e1513da7-52be-4f09-8b6d-09e494522e1e","Type":"ContainerDied","Data":"61a7aeeb0ed5d7e876b4064d500e3a59193ab01b5f77cd7d215c6b9eec3c4910"} Dec 05 13:06:12.802125 master-0 kubenswrapper[29936]: I1205 13:06:12.802107 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61a7aeeb0ed5d7e876b4064d500e3a59193ab01b5f77cd7d215c6b9eec3c4910" Dec 05 13:06:12.802865 master-0 kubenswrapper[29936]: I1205 13:06:12.802118 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dp8qv" Dec 05 13:06:13.236848 master-0 kubenswrapper[29936]: I1205 13:06:13.236743 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b54cfd657-cd9tt"] Dec 05 13:06:13.237305 master-0 kubenswrapper[29936]: I1205 13:06:13.237245 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" podUID="988345f9-c29b-4d52-8caa-fcf2711f0eb0" containerName="dnsmasq-dns" containerID="cri-o://bbe166a8d338e001f16a51e9ca9775f665aec07436b355d942d5493703911a99" gracePeriod=10 Dec 05 13:06:13.256366 master-0 kubenswrapper[29936]: I1205 13:06:13.255785 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:13.334240 master-0 kubenswrapper[29936]: I1205 13:06:13.333469 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74c65c7fc-zhsb4"] Dec 05 13:06:13.334240 master-0 kubenswrapper[29936]: E1205 13:06:13.334151 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1513da7-52be-4f09-8b6d-09e494522e1e" containerName="keystone-db-sync" Dec 05 13:06:13.334240 master-0 kubenswrapper[29936]: I1205 13:06:13.334165 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1513da7-52be-4f09-8b6d-09e494522e1e" containerName="keystone-db-sync" Dec 05 13:06:13.334570 master-0 kubenswrapper[29936]: I1205 13:06:13.334444 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1513da7-52be-4f09-8b6d-09e494522e1e" containerName="keystone-db-sync" Dec 05 13:06:13.338211 master-0 kubenswrapper[29936]: I1205 13:06:13.335815 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.351919 master-0 kubenswrapper[29936]: I1205 13:06:13.350553 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hxjkn"] Dec 05 13:06:13.354212 master-0 kubenswrapper[29936]: I1205 13:06:13.352313 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.359220 master-0 kubenswrapper[29936]: I1205 13:06:13.354553 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 13:06:13.359220 master-0 kubenswrapper[29936]: I1205 13:06:13.354825 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 13:06:13.359220 master-0 kubenswrapper[29936]: I1205 13:06:13.355026 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 13:06:13.359220 master-0 kubenswrapper[29936]: I1205 13:06:13.355159 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 13:06:13.421230 master-0 kubenswrapper[29936]: I1205 13:06:13.418474 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hxjkn"] Dec 05 13:06:13.440403 master-0 kubenswrapper[29936]: I1205 13:06:13.440326 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74c65c7fc-zhsb4"] Dec 05 13:06:13.442536 master-0 kubenswrapper[29936]: I1205 13:06:13.441890 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-ovsdbserver-nb\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.442536 master-0 kubenswrapper[29936]: I1205 13:06:13.441967 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-config\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.442536 master-0 kubenswrapper[29936]: I1205 13:06:13.442005 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-credential-keys\") pod \"keystone-bootstrap-hxjkn\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.442536 master-0 kubenswrapper[29936]: I1205 13:06:13.442030 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-scripts\") pod \"keystone-bootstrap-hxjkn\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.442536 master-0 kubenswrapper[29936]: I1205 13:06:13.442063 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-config-data\") pod \"keystone-bootstrap-hxjkn\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.442536 master-0 kubenswrapper[29936]: I1205 13:06:13.442094 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-combined-ca-bundle\") pod \"keystone-bootstrap-hxjkn\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " 
pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.442536 master-0 kubenswrapper[29936]: I1205 13:06:13.442335 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-dns-swift-storage-0\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.442536 master-0 kubenswrapper[29936]: I1205 13:06:13.442419 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t627\" (UniqueName: \"kubernetes.io/projected/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-kube-api-access-2t627\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.442874 master-0 kubenswrapper[29936]: I1205 13:06:13.442559 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-dns-svc\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.442874 master-0 kubenswrapper[29936]: I1205 13:06:13.442649 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-fernet-keys\") pod \"keystone-bootstrap-hxjkn\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.442874 master-0 kubenswrapper[29936]: I1205 13:06:13.442717 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tct9d\" (UniqueName: \"kubernetes.io/projected/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-kube-api-access-tct9d\") pod \"keystone-bootstrap-hxjkn\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.442974 master-0 kubenswrapper[29936]: I1205 13:06:13.442936 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-ovsdbserver-sb\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.474636 master-0 kubenswrapper[29936]: I1205 13:06:13.474568 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-create-9fz4t"] Dec 05 13:06:13.476710 master-0 kubenswrapper[29936]: I1205 13:06:13.476668 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-9fz4t" Dec 05 13:06:13.486244 master-0 kubenswrapper[29936]: I1205 13:06:13.486076 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-9fz4t"] Dec 05 13:06:13.539079 master-0 kubenswrapper[29936]: I1205 13:06:13.536877 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-448s4" Dec 05 13:06:13.547308 master-0 kubenswrapper[29936]: I1205 13:06:13.544567 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-combined-ca-bundle\") pod \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\" (UID: \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\") " Dec 05 13:06:13.547308 master-0 kubenswrapper[29936]: I1205 13:06:13.544639 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-db-sync-config-data\") pod \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\" (UID: \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\") " Dec 05 13:06:13.547308 master-0 kubenswrapper[29936]: I1205 13:06:13.544706 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc92v\" (UniqueName: \"kubernetes.io/projected/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-kube-api-access-mc92v\") pod \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\" (UID: \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\") " Dec 05 13:06:13.547308 master-0 kubenswrapper[29936]: I1205 13:06:13.544948 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-config-data\") pod \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\" (UID: \"5a7bb352-8943-448f-ad3f-a06ebd4b8b30\") " Dec 05 13:06:13.547308 master-0 kubenswrapper[29936]: I1205 13:06:13.545575 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-ovsdbserver-nb\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.547308 master-0 kubenswrapper[29936]: I1205 13:06:13.545612 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-config\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.557386 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-kube-api-access-mc92v" (OuterVolumeSpecName: "kube-api-access-mc92v") pod "5a7bb352-8943-448f-ad3f-a06ebd4b8b30" (UID: "5a7bb352-8943-448f-ad3f-a06ebd4b8b30"). InnerVolumeSpecName "kube-api-access-mc92v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.560353 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wft9t\" (UniqueName: \"kubernetes.io/projected/72d08438-77de-4a6e-81d7-a9f76078a1b6-kube-api-access-wft9t\") pod \"ironic-db-create-9fz4t\" (UID: \"72d08438-77de-4a6e-81d7-a9f76078a1b6\") " pod="openstack/ironic-db-create-9fz4t" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.560420 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-credential-keys\") pod \"keystone-bootstrap-hxjkn\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.560479 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-scripts\") pod \"keystone-bootstrap-hxjkn\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.560914 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-config-data\") pod \"keystone-bootstrap-hxjkn\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.561023 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-combined-ca-bundle\") pod \"keystone-bootstrap-hxjkn\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.561304 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t627\" (UniqueName: \"kubernetes.io/projected/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-kube-api-access-2t627\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.561341 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-dns-swift-storage-0\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.561402 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-dns-svc\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.561451 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-fernet-keys\") pod \"keystone-bootstrap-hxjkn\" (UID: 
\"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.561476 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tct9d\" (UniqueName: \"kubernetes.io/projected/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-kube-api-access-tct9d\") pod \"keystone-bootstrap-hxjkn\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.561563 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d08438-77de-4a6e-81d7-a9f76078a1b6-operator-scripts\") pod \"ironic-db-create-9fz4t\" (UID: \"72d08438-77de-4a6e-81d7-a9f76078a1b6\") " pod="openstack/ironic-db-create-9fz4t" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.562644 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-ovsdbserver-sb\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.563342 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5a7bb352-8943-448f-ad3f-a06ebd4b8b30" (UID: "5a7bb352-8943-448f-ad3f-a06ebd4b8b30"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.564148 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-config\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.564259 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc92v\" (UniqueName: \"kubernetes.io/projected/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-kube-api-access-mc92v\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.568264 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-config-data\") pod \"keystone-bootstrap-hxjkn\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.571477 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-dns-swift-storage-0\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.572280 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-scripts\") pod \"keystone-bootstrap-hxjkn\" (UID: 
\"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.572421 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b46d8-db-sync-6scb5"] Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: E1205 13:06:13.573211 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7bb352-8943-448f-ad3f-a06ebd4b8b30" containerName="glance-db-sync" Dec 05 13:06:13.573883 master-0 kubenswrapper[29936]: I1205 13:06:13.573232 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7bb352-8943-448f-ad3f-a06ebd4b8b30" containerName="glance-db-sync" Dec 05 13:06:13.576029 master-0 kubenswrapper[29936]: I1205 13:06:13.575684 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-dns-svc\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.576594 master-0 kubenswrapper[29936]: I1205 13:06:13.576533 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-ovsdbserver-sb\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.576783 master-0 kubenswrapper[29936]: I1205 13:06:13.576740 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-ovsdbserver-nb\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.581619 master-0 kubenswrapper[29936]: I1205 13:06:13.581563 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-combined-ca-bundle\") pod \"keystone-bootstrap-hxjkn\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.583332 master-0 kubenswrapper[29936]: I1205 13:06:13.583255 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-fernet-keys\") pod \"keystone-bootstrap-hxjkn\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.585517 master-0 kubenswrapper[29936]: I1205 13:06:13.585478 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-credential-keys\") pod \"keystone-bootstrap-hxjkn\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.621790 master-0 kubenswrapper[29936]: I1205 13:06:13.621682 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a7bb352-8943-448f-ad3f-a06ebd4b8b30" containerName="glance-db-sync" Dec 05 13:06:13.625930 master-0 kubenswrapper[29936]: I1205 13:06:13.622632 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-db-sync-6scb5"] Dec 05 13:06:13.625930 master-0 kubenswrapper[29936]: I1205 13:06:13.622764 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.626891 master-0 kubenswrapper[29936]: I1205 13:06:13.626825 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t627\" (UniqueName: \"kubernetes.io/projected/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-kube-api-access-2t627\") pod \"dnsmasq-dns-74c65c7fc-zhsb4\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.636871 master-0 kubenswrapper[29936]: I1205 13:06:13.636797 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b46d8-config-data" Dec 05 13:06:13.637088 master-0 kubenswrapper[29936]: I1205 13:06:13.637021 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b46d8-scripts" Dec 05 13:06:13.637940 master-0 kubenswrapper[29936]: I1205 13:06:13.637894 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tct9d\" (UniqueName: \"kubernetes.io/projected/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-kube-api-access-tct9d\") pod \"keystone-bootstrap-hxjkn\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.662261 master-0 kubenswrapper[29936]: I1205 13:06:13.653302 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-d6bnq"] Dec 05 13:06:13.662261 master-0 kubenswrapper[29936]: I1205 13:06:13.655096 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-d6bnq" Dec 05 13:06:13.662761 master-0 kubenswrapper[29936]: I1205 13:06:13.662709 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a7bb352-8943-448f-ad3f-a06ebd4b8b30" (UID: "5a7bb352-8943-448f-ad3f-a06ebd4b8b30"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:13.677244 master-0 kubenswrapper[29936]: I1205 13:06:13.667777 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-db-sync-config-data\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.677244 master-0 kubenswrapper[29936]: I1205 13:06:13.667940 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wft9t\" (UniqueName: \"kubernetes.io/projected/72d08438-77de-4a6e-81d7-a9f76078a1b6-kube-api-access-wft9t\") pod \"ironic-db-create-9fz4t\" (UID: \"72d08438-77de-4a6e-81d7-a9f76078a1b6\") " pod="openstack/ironic-db-create-9fz4t" Dec 05 13:06:13.677244 master-0 kubenswrapper[29936]: I1205 13:06:13.667976 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxgsx\" (UniqueName: \"kubernetes.io/projected/b82bd984-5f64-433b-a41a-f5186287a0f7-kube-api-access-sxgsx\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.677244 master-0 kubenswrapper[29936]: I1205 13:06:13.668409 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-scripts\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.677244 master-0 kubenswrapper[29936]: I1205 13:06:13.668556 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d08438-77de-4a6e-81d7-a9f76078a1b6-operator-scripts\") pod \"ironic-db-create-9fz4t\" (UID: \"72d08438-77de-4a6e-81d7-a9f76078a1b6\") " pod="openstack/ironic-db-create-9fz4t" Dec 05 13:06:13.677244 master-0 kubenswrapper[29936]: I1205 13:06:13.669542 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-config-data\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.677244 master-0 kubenswrapper[29936]: I1205 13:06:13.669592 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b82bd984-5f64-433b-a41a-f5186287a0f7-etc-machine-id\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.677244 master-0 kubenswrapper[29936]: I1205 13:06:13.669685 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-combined-ca-bundle\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.677244 master-0 kubenswrapper[29936]: I1205 13:06:13.669756 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 
13:06:13.677244 master-0 kubenswrapper[29936]: I1205 13:06:13.670055 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 13:06:13.677923 master-0 kubenswrapper[29936]: I1205 13:06:13.677876 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:13.677972 master-0 kubenswrapper[29936]: I1205 13:06:13.677933 29936 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:13.679034 master-0 kubenswrapper[29936]: I1205 13:06:13.678970 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d08438-77de-4a6e-81d7-a9f76078a1b6-operator-scripts\") pod \"ironic-db-create-9fz4t\" (UID: \"72d08438-77de-4a6e-81d7-a9f76078a1b6\") " pod="openstack/ironic-db-create-9fz4t" Dec 05 13:06:13.685511 master-0 kubenswrapper[29936]: I1205 13:06:13.685445 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-0279-account-create-update-wclm4"] Dec 05 13:06:13.699095 master-0 kubenswrapper[29936]: I1205 13:06:13.688507 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:13.699095 master-0 kubenswrapper[29936]: I1205 13:06:13.694658 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-0279-account-create-update-wclm4" Dec 05 13:06:13.705725 master-0 kubenswrapper[29936]: I1205 13:06:13.701343 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-db-secret" Dec 05 13:06:13.728417 master-0 kubenswrapper[29936]: I1205 13:06:13.717979 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:13.741593 master-0 kubenswrapper[29936]: I1205 13:06:13.740036 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wft9t\" (UniqueName: \"kubernetes.io/projected/72d08438-77de-4a6e-81d7-a9f76078a1b6-kube-api-access-wft9t\") pod \"ironic-db-create-9fz4t\" (UID: \"72d08438-77de-4a6e-81d7-a9f76078a1b6\") " pod="openstack/ironic-db-create-9fz4t" Dec 05 13:06:13.741593 master-0 kubenswrapper[29936]: I1205 13:06:13.741004 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-create-9fz4t" Dec 05 13:06:13.744306 master-0 kubenswrapper[29936]: I1205 13:06:13.744052 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-d6bnq"] Dec 05 13:06:13.783120 master-0 kubenswrapper[29936]: I1205 13:06:13.779500 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-db-sync-config-data\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.783120 master-0 kubenswrapper[29936]: I1205 13:06:13.779619 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxgsx\" (UniqueName: \"kubernetes.io/projected/b82bd984-5f64-433b-a41a-f5186287a0f7-kube-api-access-sxgsx\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.783120 master-0 kubenswrapper[29936]: I1205 13:06:13.779683 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-combined-ca-bundle\") pod \"neutron-db-sync-d6bnq\" (UID: \"549e5366-4e6e-4d97-aeb2-25f74ce81b4b\") " pod="openstack/neutron-db-sync-d6bnq" Dec 05 13:06:13.783120 master-0 kubenswrapper[29936]: I1205 13:06:13.779718 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8wwf\" (UniqueName: \"kubernetes.io/projected/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-kube-api-access-q8wwf\") pod \"neutron-db-sync-d6bnq\" (UID: \"549e5366-4e6e-4d97-aeb2-25f74ce81b4b\") " pod="openstack/neutron-db-sync-d6bnq" Dec 05 13:06:13.783120 master-0 kubenswrapper[29936]: I1205 13:06:13.779746 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-config\") pod \"neutron-db-sync-d6bnq\" (UID: \"549e5366-4e6e-4d97-aeb2-25f74ce81b4b\") " pod="openstack/neutron-db-sync-d6bnq" Dec 05 13:06:13.783120 master-0 kubenswrapper[29936]: I1205 13:06:13.779826 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-scripts\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.783120 master-0 kubenswrapper[29936]: I1205 13:06:13.779849 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2w55\" (UniqueName: \"kubernetes.io/projected/9d76a2c1-3728-4104-b208-67b329e52d70-kube-api-access-r2w55\") pod \"ironic-0279-account-create-update-wclm4\" (UID: \"9d76a2c1-3728-4104-b208-67b329e52d70\") " pod="openstack/ironic-0279-account-create-update-wclm4" Dec 05 13:06:13.783120 master-0 kubenswrapper[29936]: I1205 13:06:13.779892 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d76a2c1-3728-4104-b208-67b329e52d70-operator-scripts\") pod \"ironic-0279-account-create-update-wclm4\" (UID: \"9d76a2c1-3728-4104-b208-67b329e52d70\") " 
pod="openstack/ironic-0279-account-create-update-wclm4" Dec 05 13:06:13.783120 master-0 kubenswrapper[29936]: I1205 13:06:13.779943 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-config-data\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.783120 master-0 kubenswrapper[29936]: I1205 13:06:13.779964 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b82bd984-5f64-433b-a41a-f5186287a0f7-etc-machine-id\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.783120 master-0 kubenswrapper[29936]: I1205 13:06:13.779990 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-combined-ca-bundle\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.784811 master-0 kubenswrapper[29936]: I1205 13:06:13.784697 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b82bd984-5f64-433b-a41a-f5186287a0f7-etc-machine-id\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.791876 master-0 kubenswrapper[29936]: I1205 13:06:13.788645 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-combined-ca-bundle\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.791876 master-0 kubenswrapper[29936]: I1205 13:06:13.791825 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-config-data\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.812257 master-0 kubenswrapper[29936]: I1205 13:06:13.798317 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-scripts\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.821488 master-0 kubenswrapper[29936]: I1205 13:06:13.815563 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-db-sync-config-data\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.822196 master-0 kubenswrapper[29936]: I1205 13:06:13.822131 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxgsx\" (UniqueName: \"kubernetes.io/projected/b82bd984-5f64-433b-a41a-f5186287a0f7-kube-api-access-sxgsx\") pod \"cinder-b46d8-db-sync-6scb5\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " 
pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:13.834295 master-0 kubenswrapper[29936]: I1205 13:06:13.832690 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-448s4" event={"ID":"5a7bb352-8943-448f-ad3f-a06ebd4b8b30","Type":"ContainerDied","Data":"f693960a409f314c337dbfd68ddeeda1f87b2fc4da586635bee71a4b605c4379"} Dec 05 13:06:13.834295 master-0 kubenswrapper[29936]: I1205 13:06:13.832782 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f693960a409f314c337dbfd68ddeeda1f87b2fc4da586635bee71a4b605c4379" Dec 05 13:06:13.834295 master-0 kubenswrapper[29936]: I1205 13:06:13.832885 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-448s4" Dec 05 13:06:13.863350 master-0 kubenswrapper[29936]: I1205 13:06:13.836150 29936 generic.go:334] "Generic (PLEG): container finished" podID="988345f9-c29b-4d52-8caa-fcf2711f0eb0" containerID="bbe166a8d338e001f16a51e9ca9775f665aec07436b355d942d5493703911a99" exitCode=0 Dec 05 13:06:13.863350 master-0 kubenswrapper[29936]: I1205 13:06:13.836210 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-0279-account-create-update-wclm4"] Dec 05 13:06:13.863350 master-0 kubenswrapper[29936]: I1205 13:06:13.836242 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" event={"ID":"988345f9-c29b-4d52-8caa-fcf2711f0eb0","Type":"ContainerDied","Data":"bbe166a8d338e001f16a51e9ca9775f665aec07436b355d942d5493703911a99"} Dec 05 13:06:13.867784 master-0 kubenswrapper[29936]: I1205 13:06:13.867688 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-config-data" (OuterVolumeSpecName: "config-data") pod "5a7bb352-8943-448f-ad3f-a06ebd4b8b30" (UID: "5a7bb352-8943-448f-ad3f-a06ebd4b8b30"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:13.896162 master-0 kubenswrapper[29936]: I1205 13:06:13.885583 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-combined-ca-bundle\") pod \"neutron-db-sync-d6bnq\" (UID: \"549e5366-4e6e-4d97-aeb2-25f74ce81b4b\") " pod="openstack/neutron-db-sync-d6bnq" Dec 05 13:06:13.896162 master-0 kubenswrapper[29936]: I1205 13:06:13.885653 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8wwf\" (UniqueName: \"kubernetes.io/projected/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-kube-api-access-q8wwf\") pod \"neutron-db-sync-d6bnq\" (UID: \"549e5366-4e6e-4d97-aeb2-25f74ce81b4b\") " pod="openstack/neutron-db-sync-d6bnq" Dec 05 13:06:13.896162 master-0 kubenswrapper[29936]: I1205 13:06:13.885675 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-config\") pod \"neutron-db-sync-d6bnq\" (UID: \"549e5366-4e6e-4d97-aeb2-25f74ce81b4b\") " pod="openstack/neutron-db-sync-d6bnq" Dec 05 13:06:13.896162 master-0 kubenswrapper[29936]: I1205 13:06:13.885760 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2w55\" (UniqueName: \"kubernetes.io/projected/9d76a2c1-3728-4104-b208-67b329e52d70-kube-api-access-r2w55\") pod \"ironic-0279-account-create-update-wclm4\" (UID: \"9d76a2c1-3728-4104-b208-67b329e52d70\") " pod="openstack/ironic-0279-account-create-update-wclm4" Dec 05 13:06:13.896162 master-0 kubenswrapper[29936]: I1205 13:06:13.885794 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d76a2c1-3728-4104-b208-67b329e52d70-operator-scripts\") pod \"ironic-0279-account-create-update-wclm4\" (UID: \"9d76a2c1-3728-4104-b208-67b329e52d70\") " pod="openstack/ironic-0279-account-create-update-wclm4" Dec 05 13:06:13.896162 master-0 kubenswrapper[29936]: I1205 13:06:13.885919 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a7bb352-8943-448f-ad3f-a06ebd4b8b30-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:13.896162 master-0 kubenswrapper[29936]: I1205 13:06:13.890040 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-combined-ca-bundle\") pod \"neutron-db-sync-d6bnq\" (UID: \"549e5366-4e6e-4d97-aeb2-25f74ce81b4b\") " pod="openstack/neutron-db-sync-d6bnq" Dec 05 13:06:13.896162 master-0 kubenswrapper[29936]: I1205 13:06:13.890710 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d76a2c1-3728-4104-b208-67b329e52d70-operator-scripts\") pod \"ironic-0279-account-create-update-wclm4\" (UID: \"9d76a2c1-3728-4104-b208-67b329e52d70\") " pod="openstack/ironic-0279-account-create-update-wclm4" Dec 05 13:06:13.896162 master-0 kubenswrapper[29936]: I1205 13:06:13.894985 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-config\") pod \"neutron-db-sync-d6bnq\" (UID: \"549e5366-4e6e-4d97-aeb2-25f74ce81b4b\") " pod="openstack/neutron-db-sync-d6bnq" Dec 05 
13:06:13.921399 master-0 kubenswrapper[29936]: I1205 13:06:13.921295 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-fqqdf"] Dec 05 13:06:13.954224 master-0 kubenswrapper[29936]: I1205 13:06:13.942640 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2w55\" (UniqueName: \"kubernetes.io/projected/9d76a2c1-3728-4104-b208-67b329e52d70-kube-api-access-r2w55\") pod \"ironic-0279-account-create-update-wclm4\" (UID: \"9d76a2c1-3728-4104-b208-67b329e52d70\") " pod="openstack/ironic-0279-account-create-update-wclm4" Dec 05 13:06:13.963215 master-0 kubenswrapper[29936]: I1205 13:06:13.955074 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8wwf\" (UniqueName: \"kubernetes.io/projected/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-kube-api-access-q8wwf\") pod \"neutron-db-sync-d6bnq\" (UID: \"549e5366-4e6e-4d97-aeb2-25f74ce81b4b\") " pod="openstack/neutron-db-sync-d6bnq" Dec 05 13:06:13.975453 master-0 kubenswrapper[29936]: I1205 13:06:13.965237 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:13.975453 master-0 kubenswrapper[29936]: I1205 13:06:13.971944 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 13:06:13.981306 master-0 kubenswrapper[29936]: I1205 13:06:13.981247 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 13:06:14.000333 master-0 kubenswrapper[29936]: I1205 13:06:14.000156 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-config-data\") pod \"placement-db-sync-fqqdf\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:14.000333 master-0 kubenswrapper[29936]: I1205 13:06:14.000322 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21fcc891-90c5-47e7-97e9-e852adfae2bb-logs\") pod \"placement-db-sync-fqqdf\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:14.016052 master-0 kubenswrapper[29936]: I1205 13:06:14.015541 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-scripts\") pod \"placement-db-sync-fqqdf\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:14.016052 master-0 kubenswrapper[29936]: I1205 13:06:14.015680 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-combined-ca-bundle\") pod \"placement-db-sync-fqqdf\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:14.016052 master-0 kubenswrapper[29936]: I1205 13:06:14.015758 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkwhv\" (UniqueName: \"kubernetes.io/projected/21fcc891-90c5-47e7-97e9-e852adfae2bb-kube-api-access-gkwhv\") pod \"placement-db-sync-fqqdf\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " 
pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:14.034276 master-0 kubenswrapper[29936]: I1205 13:06:14.032406 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fqqdf"] Dec 05 13:06:14.056214 master-0 kubenswrapper[29936]: I1205 13:06:14.053054 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:14.071150 master-0 kubenswrapper[29936]: I1205 13:06:14.071071 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:14.120201 master-0 kubenswrapper[29936]: I1205 13:06:14.118273 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-d6bnq" Dec 05 13:06:14.120201 master-0 kubenswrapper[29936]: I1205 13:06:14.119885 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-scripts\") pod \"placement-db-sync-fqqdf\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:14.120201 master-0 kubenswrapper[29936]: I1205 13:06:14.119914 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-combined-ca-bundle\") pod \"placement-db-sync-fqqdf\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:14.120201 master-0 kubenswrapper[29936]: I1205 13:06:14.119940 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkwhv\" (UniqueName: \"kubernetes.io/projected/21fcc891-90c5-47e7-97e9-e852adfae2bb-kube-api-access-gkwhv\") pod \"placement-db-sync-fqqdf\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:14.120201 master-0 kubenswrapper[29936]: I1205 13:06:14.120027 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-config-data\") pod \"placement-db-sync-fqqdf\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:14.120201 master-0 kubenswrapper[29936]: I1205 13:06:14.120052 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21fcc891-90c5-47e7-97e9-e852adfae2bb-logs\") pod \"placement-db-sync-fqqdf\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:14.122583 master-0 kubenswrapper[29936]: I1205 13:06:14.121537 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21fcc891-90c5-47e7-97e9-e852adfae2bb-logs\") pod \"placement-db-sync-fqqdf\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:14.131770 master-0 kubenswrapper[29936]: I1205 13:06:14.131705 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-0279-account-create-update-wclm4" Dec 05 13:06:14.140976 master-0 kubenswrapper[29936]: I1205 13:06:14.140914 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-combined-ca-bundle\") pod \"placement-db-sync-fqqdf\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:14.169653 master-0 kubenswrapper[29936]: I1205 13:06:14.166596 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74c65c7fc-zhsb4"] Dec 05 13:06:14.169653 master-0 kubenswrapper[29936]: I1205 13:06:14.168299 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-config-data\") pod \"placement-db-sync-fqqdf\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:14.187340 master-0 kubenswrapper[29936]: I1205 13:06:14.172795 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-scripts\") pod \"placement-db-sync-fqqdf\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:14.224277 master-0 kubenswrapper[29936]: I1205 13:06:14.222090 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-ovsdbserver-sb\") pod \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " Dec 05 13:06:14.224277 master-0 kubenswrapper[29936]: I1205 13:06:14.222278 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6f4l\" (UniqueName: \"kubernetes.io/projected/988345f9-c29b-4d52-8caa-fcf2711f0eb0-kube-api-access-f6f4l\") pod \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " Dec 05 13:06:14.224277 master-0 kubenswrapper[29936]: I1205 13:06:14.222317 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-dns-svc\") pod \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " Dec 05 13:06:14.224277 master-0 kubenswrapper[29936]: I1205 13:06:14.222416 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-ovsdbserver-nb\") pod \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " Dec 05 13:06:14.224277 master-0 kubenswrapper[29936]: I1205 13:06:14.222491 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-dns-swift-storage-0\") pod \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\" (UID: \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " Dec 05 13:06:14.224277 master-0 kubenswrapper[29936]: I1205 13:06:14.222569 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-config\") pod \"988345f9-c29b-4d52-8caa-fcf2711f0eb0\" (UID: 
\"988345f9-c29b-4d52-8caa-fcf2711f0eb0\") " Dec 05 13:06:14.278596 master-0 kubenswrapper[29936]: I1205 13:06:14.232144 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkwhv\" (UniqueName: \"kubernetes.io/projected/21fcc891-90c5-47e7-97e9-e852adfae2bb-kube-api-access-gkwhv\") pod \"placement-db-sync-fqqdf\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:14.278596 master-0 kubenswrapper[29936]: I1205 13:06:14.259041 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55b4ff44b9-8q6mg"] Dec 05 13:06:14.278596 master-0 kubenswrapper[29936]: E1205 13:06:14.264820 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988345f9-c29b-4d52-8caa-fcf2711f0eb0" containerName="dnsmasq-dns" Dec 05 13:06:14.278596 master-0 kubenswrapper[29936]: I1205 13:06:14.264858 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="988345f9-c29b-4d52-8caa-fcf2711f0eb0" containerName="dnsmasq-dns" Dec 05 13:06:14.278596 master-0 kubenswrapper[29936]: E1205 13:06:14.264894 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="988345f9-c29b-4d52-8caa-fcf2711f0eb0" containerName="init" Dec 05 13:06:14.278596 master-0 kubenswrapper[29936]: I1205 13:06:14.264901 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="988345f9-c29b-4d52-8caa-fcf2711f0eb0" containerName="init" Dec 05 13:06:14.278596 master-0 kubenswrapper[29936]: I1205 13:06:14.265263 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="988345f9-c29b-4d52-8caa-fcf2711f0eb0" containerName="dnsmasq-dns" Dec 05 13:06:14.278596 master-0 kubenswrapper[29936]: I1205 13:06:14.266723 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/988345f9-c29b-4d52-8caa-fcf2711f0eb0-kube-api-access-f6f4l" (OuterVolumeSpecName: "kube-api-access-f6f4l") pod "988345f9-c29b-4d52-8caa-fcf2711f0eb0" (UID: "988345f9-c29b-4d52-8caa-fcf2711f0eb0"). InnerVolumeSpecName "kube-api-access-f6f4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:14.289642 master-0 kubenswrapper[29936]: I1205 13:06:14.288353 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.289642 master-0 kubenswrapper[29936]: I1205 13:06:14.288341 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:14.322910 master-0 kubenswrapper[29936]: I1205 13:06:14.322843 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "988345f9-c29b-4d52-8caa-fcf2711f0eb0" (UID: "988345f9-c29b-4d52-8caa-fcf2711f0eb0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:14.351515 master-0 kubenswrapper[29936]: I1205 13:06:14.348149 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:14.351515 master-0 kubenswrapper[29936]: I1205 13:06:14.351112 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6f4l\" (UniqueName: \"kubernetes.io/projected/988345f9-c29b-4d52-8caa-fcf2711f0eb0-kube-api-access-f6f4l\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:14.351515 master-0 kubenswrapper[29936]: I1205 13:06:14.351036 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55b4ff44b9-8q6mg"] Dec 05 13:06:14.365142 master-0 kubenswrapper[29936]: I1205 13:06:14.361777 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-config" (OuterVolumeSpecName: "config") pod "988345f9-c29b-4d52-8caa-fcf2711f0eb0" (UID: "988345f9-c29b-4d52-8caa-fcf2711f0eb0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:14.369227 master-0 kubenswrapper[29936]: I1205 13:06:14.369106 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "988345f9-c29b-4d52-8caa-fcf2711f0eb0" (UID: "988345f9-c29b-4d52-8caa-fcf2711f0eb0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:14.383859 master-0 kubenswrapper[29936]: I1205 13:06:14.382235 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "988345f9-c29b-4d52-8caa-fcf2711f0eb0" (UID: "988345f9-c29b-4d52-8caa-fcf2711f0eb0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:14.418213 master-0 kubenswrapper[29936]: I1205 13:06:14.408626 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "988345f9-c29b-4d52-8caa-fcf2711f0eb0" (UID: "988345f9-c29b-4d52-8caa-fcf2711f0eb0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:14.456661 master-0 kubenswrapper[29936]: I1205 13:06:14.454040 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmpfh\" (UniqueName: \"kubernetes.io/projected/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-kube-api-access-pmpfh\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.456661 master-0 kubenswrapper[29936]: I1205 13:06:14.454128 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-dns-svc\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.456661 master-0 kubenswrapper[29936]: I1205 13:06:14.454347 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-ovsdbserver-sb\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.456661 master-0 kubenswrapper[29936]: I1205 13:06:14.454459 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-dns-swift-storage-0\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.456661 master-0 kubenswrapper[29936]: I1205 13:06:14.454523 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-config\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.456661 master-0 kubenswrapper[29936]: I1205 13:06:14.454547 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-ovsdbserver-nb\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.456661 master-0 kubenswrapper[29936]: I1205 13:06:14.456019 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:14.456661 master-0 kubenswrapper[29936]: I1205 13:06:14.456056 29936 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:14.456661 master-0 kubenswrapper[29936]: I1205 13:06:14.456076 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:14.456661 master-0 kubenswrapper[29936]: I1205 13:06:14.456089 29936 reconciler_common.go:293] "Volume 
detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/988345f9-c29b-4d52-8caa-fcf2711f0eb0-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:14.570175 master-0 kubenswrapper[29936]: I1205 13:06:14.569983 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-ovsdbserver-sb\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.570175 master-0 kubenswrapper[29936]: I1205 13:06:14.570143 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-dns-swift-storage-0\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.570527 master-0 kubenswrapper[29936]: I1205 13:06:14.570228 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-config\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.570527 master-0 kubenswrapper[29936]: I1205 13:06:14.570255 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-ovsdbserver-nb\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.571018 master-0 kubenswrapper[29936]: I1205 13:06:14.570984 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmpfh\" (UniqueName: \"kubernetes.io/projected/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-kube-api-access-pmpfh\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.571063 master-0 kubenswrapper[29936]: I1205 13:06:14.571051 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-dns-svc\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.571300 master-0 kubenswrapper[29936]: I1205 13:06:14.571252 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-ovsdbserver-sb\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.572087 master-0 kubenswrapper[29936]: I1205 13:06:14.572044 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-config\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.572353 master-0 kubenswrapper[29936]: I1205 13:06:14.572315 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-dns-svc\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.572702 master-0 kubenswrapper[29936]: I1205 13:06:14.572647 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-ovsdbserver-nb\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.573360 master-0 kubenswrapper[29936]: I1205 13:06:14.573324 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-dns-swift-storage-0\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:14.851518 master-0 kubenswrapper[29936]: I1205 13:06:14.851318 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" event={"ID":"988345f9-c29b-4d52-8caa-fcf2711f0eb0","Type":"ContainerDied","Data":"1c713c0a692f8f88096daf79f4df37cb59bd850d9e1a3d4a6540dba87dcfb8a5"} Dec 05 13:06:14.851518 master-0 kubenswrapper[29936]: I1205 13:06:14.851433 29936 scope.go:117] "RemoveContainer" containerID="bbe166a8d338e001f16a51e9ca9775f665aec07436b355d942d5493703911a99" Dec 05 13:06:14.852317 master-0 kubenswrapper[29936]: I1205 13:06:14.852281 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b54cfd657-cd9tt" Dec 05 13:06:14.883689 master-0 kubenswrapper[29936]: I1205 13:06:14.883380 29936 scope.go:117] "RemoveContainer" containerID="2c1df04a7b39462450760b3c92edfc7ce6e11b2c4a82fab79e462f50beb85d48" Dec 05 13:06:15.084201 master-0 kubenswrapper[29936]: I1205 13:06:15.084143 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmpfh\" (UniqueName: \"kubernetes.io/projected/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-kube-api-access-pmpfh\") pod \"dnsmasq-dns-55b4ff44b9-8q6mg\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:15.288796 master-0 kubenswrapper[29936]: I1205 13:06:15.288720 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:15.682941 master-0 kubenswrapper[29936]: W1205 13:06:15.673658 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21fcc891_90c5_47e7_97e9_e852adfae2bb.slice/crio-a26842610f9b59d6310b47d52ca35deb002ea4ccae97fdf45e5df9c05ffd9dc7 WatchSource:0}: Error finding container a26842610f9b59d6310b47d52ca35deb002ea4ccae97fdf45e5df9c05ffd9dc7: Status 404 returned error can't find the container with id a26842610f9b59d6310b47d52ca35deb002ea4ccae97fdf45e5df9c05ffd9dc7 Dec 05 13:06:15.742463 master-0 kubenswrapper[29936]: I1205 13:06:15.739549 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74c65c7fc-zhsb4"] Dec 05 13:06:15.774964 master-0 kubenswrapper[29936]: I1205 13:06:15.772930 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b54cfd657-cd9tt"] Dec 05 13:06:15.832575 master-0 kubenswrapper[29936]: I1205 13:06:15.832511 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hxjkn"] Dec 05 13:06:15.853220 master-0 kubenswrapper[29936]: I1205 13:06:15.853110 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fqqdf"] Dec 05 13:06:15.901104 master-0 kubenswrapper[29936]: I1205 13:06:15.901007 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-0279-account-create-update-wclm4"] Dec 05 13:06:15.920161 master-0 kubenswrapper[29936]: I1205 13:06:15.920082 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-db-sync-6scb5"] Dec 05 13:06:15.951207 master-0 kubenswrapper[29936]: I1205 13:06:15.944569 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-d6bnq"] Dec 05 13:06:15.968929 master-0 kubenswrapper[29936]: I1205 13:06:15.968823 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-9fz4t"] Dec 05 13:06:15.982919 master-0 kubenswrapper[29936]: I1205 13:06:15.982827 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b54cfd657-cd9tt"] Dec 05 13:06:16.028222 master-0 kubenswrapper[29936]: I1205 13:06:16.028160 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55b4ff44b9-8q6mg"] Dec 05 13:06:16.081372 master-0 kubenswrapper[29936]: I1205 13:06:16.081306 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hxjkn" event={"ID":"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5","Type":"ContainerStarted","Data":"4cb7fec689ec48daec3dd85a44e12ae50a8c95d05ed8e8dc2bf1131cd63d4788"} Dec 05 13:06:16.083839 master-0 kubenswrapper[29936]: I1205 13:06:16.083785 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55b4ff44b9-8q6mg"] Dec 05 13:06:16.087373 master-0 kubenswrapper[29936]: I1205 13:06:16.087295 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fqqdf" event={"ID":"21fcc891-90c5-47e7-97e9-e852adfae2bb","Type":"ContainerStarted","Data":"a26842610f9b59d6310b47d52ca35deb002ea4ccae97fdf45e5df9c05ffd9dc7"} Dec 05 13:06:16.088657 master-0 kubenswrapper[29936]: I1205 13:06:16.088551 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-0279-account-create-update-wclm4" 
event={"ID":"9d76a2c1-3728-4104-b208-67b329e52d70","Type":"ContainerStarted","Data":"15e696002b7a51652b0f28b591acf47ffee19a46df16874f81b9ed97ee862298"} Dec 05 13:06:16.094039 master-0 kubenswrapper[29936]: I1205 13:06:16.093954 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-9fz4t" event={"ID":"72d08438-77de-4a6e-81d7-a9f76078a1b6","Type":"ContainerStarted","Data":"73dd0b14b07761d534bd3676cabd5b4513db249a94d040e1f8c984f917822c45"} Dec 05 13:06:16.095366 master-0 kubenswrapper[29936]: I1205 13:06:16.094523 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" event={"ID":"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b","Type":"ContainerStarted","Data":"eae51f214628225d348fda2c6be776cf432b3c41712e009df1ce3b64444f7034"} Dec 05 13:06:16.098648 master-0 kubenswrapper[29936]: I1205 13:06:16.098227 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d6bnq" event={"ID":"549e5366-4e6e-4d97-aeb2-25f74ce81b4b","Type":"ContainerStarted","Data":"ca835c4a093dce5df433a43aa8a98644fb19627d451c3f4f3614978ab61beec4"} Dec 05 13:06:16.099970 master-0 kubenswrapper[29936]: I1205 13:06:16.099801 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" event={"ID":"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc","Type":"ContainerStarted","Data":"ea134aeb20810384fa8ed61606e871ca2fd5e946461cfd99a390bed0c9a2b2a2"} Dec 05 13:06:16.105584 master-0 kubenswrapper[29936]: I1205 13:06:16.101292 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-db-sync-6scb5" event={"ID":"b82bd984-5f64-433b-a41a-f5186287a0f7","Type":"ContainerStarted","Data":"79b83ccd8e3bf4e69a70ccd6f3ac4837419a37e1205366adf6336ac214099e09"} Dec 05 13:06:16.207387 master-0 kubenswrapper[29936]: I1205 13:06:16.207333 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dbf54b6fc-9q2hj"] Dec 05 13:06:16.213685 master-0 kubenswrapper[29936]: I1205 13:06:16.213630 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.243427 master-0 kubenswrapper[29936]: I1205 13:06:16.243333 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dbf54b6fc-9q2hj"] Dec 05 13:06:16.292337 master-0 kubenswrapper[29936]: I1205 13:06:16.292260 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-config\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.292337 master-0 kubenswrapper[29936]: I1205 13:06:16.292340 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-dns-svc\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.292703 master-0 kubenswrapper[29936]: I1205 13:06:16.292417 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-dns-swift-storage-0\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.292703 master-0 kubenswrapper[29936]: I1205 13:06:16.292539 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-ovsdbserver-sb\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.292703 master-0 kubenswrapper[29936]: I1205 13:06:16.292618 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-ovsdbserver-nb\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.292703 master-0 kubenswrapper[29936]: I1205 13:06:16.292641 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv84q\" (UniqueName: \"kubernetes.io/projected/9a285173-334c-409d-87a5-9c8e18c77f50-kube-api-access-bv84q\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.398756 master-0 kubenswrapper[29936]: I1205 13:06:16.395953 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-config\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.398756 master-0 kubenswrapper[29936]: I1205 13:06:16.396254 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-dns-svc\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 
05 13:06:16.398756 master-0 kubenswrapper[29936]: I1205 13:06:16.396627 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-dns-swift-storage-0\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.398756 master-0 kubenswrapper[29936]: I1205 13:06:16.397010 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-ovsdbserver-sb\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.398756 master-0 kubenswrapper[29936]: I1205 13:06:16.397048 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-config\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.398756 master-0 kubenswrapper[29936]: I1205 13:06:16.397527 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-ovsdbserver-nb\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.398756 master-0 kubenswrapper[29936]: I1205 13:06:16.397609 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv84q\" (UniqueName: \"kubernetes.io/projected/9a285173-334c-409d-87a5-9c8e18c77f50-kube-api-access-bv84q\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.398756 master-0 kubenswrapper[29936]: I1205 13:06:16.398012 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-dns-swift-storage-0\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.399406 master-0 kubenswrapper[29936]: I1205 13:06:16.398968 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-ovsdbserver-nb\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.399785 master-0 kubenswrapper[29936]: I1205 13:06:16.399711 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-dns-svc\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.399785 master-0 kubenswrapper[29936]: I1205 13:06:16.399727 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-ovsdbserver-sb\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " 
pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.428574 master-0 kubenswrapper[29936]: I1205 13:06:16.428452 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv84q\" (UniqueName: \"kubernetes.io/projected/9a285173-334c-409d-87a5-9c8e18c77f50-kube-api-access-bv84q\") pod \"dnsmasq-dns-6dbf54b6fc-9q2hj\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:16.721504 master-0 kubenswrapper[29936]: I1205 13:06:16.717354 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:17.126422 master-0 kubenswrapper[29936]: I1205 13:06:17.126351 29936 generic.go:334] "Generic (PLEG): container finished" podID="9d76a2c1-3728-4104-b208-67b329e52d70" containerID="d1a6313a656db2f5677f30dc51d1c451284d4f2dcdeffbdb49fd9efd782328c9" exitCode=0 Dec 05 13:06:17.127122 master-0 kubenswrapper[29936]: I1205 13:06:17.126438 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-0279-account-create-update-wclm4" event={"ID":"9d76a2c1-3728-4104-b208-67b329e52d70","Type":"ContainerDied","Data":"d1a6313a656db2f5677f30dc51d1c451284d4f2dcdeffbdb49fd9efd782328c9"} Dec 05 13:06:17.182803 master-0 kubenswrapper[29936]: I1205 13:06:17.182637 29936 generic.go:334] "Generic (PLEG): container finished" podID="72d08438-77de-4a6e-81d7-a9f76078a1b6" containerID="32577dcea0dff0a1e68d827f23c3807f29c2ad394ed4eaa10c7f407c6fac643a" exitCode=0 Dec 05 13:06:17.182803 master-0 kubenswrapper[29936]: I1205 13:06:17.182771 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-9fz4t" event={"ID":"72d08438-77de-4a6e-81d7-a9f76078a1b6","Type":"ContainerDied","Data":"32577dcea0dff0a1e68d827f23c3807f29c2ad394ed4eaa10c7f407c6fac643a"} Dec 05 13:06:17.191132 master-0 kubenswrapper[29936]: I1205 13:06:17.191066 29936 generic.go:334] "Generic (PLEG): container finished" podID="e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b" containerID="8db18ab70376f9ce2a7f5dd54d7510d037dad0f6f2f50682f6c860ff0fe0dd56" exitCode=0 Dec 05 13:06:17.209536 master-0 kubenswrapper[29936]: I1205 13:06:17.209469 29936 generic.go:334] "Generic (PLEG): container finished" podID="3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc" containerID="4f7bc339c74b5068502ab76290c63fa3c1f792b2abe6a753ca468ee53ea5ab20" exitCode=0 Dec 05 13:06:17.248016 master-0 kubenswrapper[29936]: I1205 13:06:17.247869 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="988345f9-c29b-4d52-8caa-fcf2711f0eb0" path="/var/lib/kubelet/pods/988345f9-c29b-4d52-8caa-fcf2711f0eb0/volumes" Dec 05 13:06:17.248688 master-0 kubenswrapper[29936]: I1205 13:06:17.248637 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" event={"ID":"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b","Type":"ContainerDied","Data":"8db18ab70376f9ce2a7f5dd54d7510d037dad0f6f2f50682f6c860ff0fe0dd56"} Dec 05 13:06:17.248688 master-0 kubenswrapper[29936]: I1205 13:06:17.248678 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d6bnq" event={"ID":"549e5366-4e6e-4d97-aeb2-25f74ce81b4b","Type":"ContainerStarted","Data":"2ba41f0827deea62e1400136039f07960c0733677767da6118aec92285f52a1d"} Dec 05 13:06:17.249113 master-0 kubenswrapper[29936]: I1205 13:06:17.248696 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hxjkn" 
event={"ID":"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5","Type":"ContainerStarted","Data":"d37f4922a4c5f2d7041603a8e09aa08447bad4f98e7d1beff4eea2d643f97308"} Dec 05 13:06:17.249113 master-0 kubenswrapper[29936]: I1205 13:06:17.248708 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" event={"ID":"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc","Type":"ContainerDied","Data":"4f7bc339c74b5068502ab76290c63fa3c1f792b2abe6a753ca468ee53ea5ab20"} Dec 05 13:06:17.258973 master-0 kubenswrapper[29936]: I1205 13:06:17.258875 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-d6bnq" podStartSLOduration=4.258840544 podStartE2EDuration="4.258840544s" podCreationTimestamp="2025-12-05 13:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:06:17.232498644 +0000 UTC m=+974.364578345" watchObservedRunningTime="2025-12-05 13:06:17.258840544 +0000 UTC m=+974.390920225" Dec 05 13:06:17.364109 master-0 kubenswrapper[29936]: I1205 13:06:17.363400 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dbf54b6fc-9q2hj"] Dec 05 13:06:17.416682 master-0 kubenswrapper[29936]: I1205 13:06:17.414311 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hxjkn" podStartSLOduration=4.4142893579999996 podStartE2EDuration="4.414289358s" podCreationTimestamp="2025-12-05 13:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:06:17.360725005 +0000 UTC m=+974.492804696" watchObservedRunningTime="2025-12-05 13:06:17.414289358 +0000 UTC m=+974.546369039" Dec 05 13:06:17.761264 master-0 kubenswrapper[29936]: I1205 13:06:17.752290 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:06:17.761264 master-0 kubenswrapper[29936]: I1205 13:06:17.754795 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:17.761264 master-0 kubenswrapper[29936]: I1205 13:06:17.759647 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b46d8-default-external-config-data" Dec 05 13:06:17.761264 master-0 kubenswrapper[29936]: I1205 13:06:17.760114 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 13:06:17.788714 master-0 kubenswrapper[29936]: I1205 13:06:17.788210 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:06:17.841718 master-0 kubenswrapper[29936]: I1205 13:06:17.826915 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:17.921516 master-0 kubenswrapper[29936]: I1205 13:06:17.917842 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-config\") pod \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " Dec 05 13:06:17.921516 master-0 kubenswrapper[29936]: I1205 13:06:17.918035 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmpfh\" (UniqueName: \"kubernetes.io/projected/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-kube-api-access-pmpfh\") pod \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " Dec 05 13:06:17.921516 master-0 kubenswrapper[29936]: I1205 13:06:17.918277 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-dns-swift-storage-0\") pod \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " Dec 05 13:06:17.921516 master-0 kubenswrapper[29936]: I1205 13:06:17.918486 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-dns-svc\") pod \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " Dec 05 13:06:17.921516 master-0 kubenswrapper[29936]: I1205 13:06:17.918621 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-ovsdbserver-nb\") pod \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " Dec 05 13:06:17.921516 master-0 kubenswrapper[29936]: I1205 13:06:17.918688 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-ovsdbserver-sb\") pod \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\" (UID: \"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b\") " Dec 05 13:06:17.921516 master-0 kubenswrapper[29936]: I1205 13:06:17.919235 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-logs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:17.921516 master-0 kubenswrapper[29936]: I1205 13:06:17.919259 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-config-data\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:17.921516 master-0 kubenswrapper[29936]: I1205 13:06:17.919356 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " 
pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:17.921516 master-0 kubenswrapper[29936]: I1205 13:06:17.919385 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76z64\" (UniqueName: \"kubernetes.io/projected/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-kube-api-access-76z64\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:17.921516 master-0 kubenswrapper[29936]: I1205 13:06:17.919474 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-combined-ca-bundle\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:17.921516 master-0 kubenswrapper[29936]: I1205 13:06:17.919737 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-httpd-run\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:17.921516 master-0 kubenswrapper[29936]: I1205 13:06:17.919842 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-scripts\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:17.921516 master-0 kubenswrapper[29936]: I1205 13:06:17.921428 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:17.927839 master-0 kubenswrapper[29936]: I1205 13:06:17.927614 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-kube-api-access-pmpfh" (OuterVolumeSpecName: "kube-api-access-pmpfh") pod "e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b" (UID: "e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b"). InnerVolumeSpecName "kube-api-access-pmpfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:17.968032 master-0 kubenswrapper[29936]: I1205 13:06:17.967669 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b" (UID: "e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:17.968768 master-0 kubenswrapper[29936]: I1205 13:06:17.968633 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b" (UID: "e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:17.970227 master-0 kubenswrapper[29936]: I1205 13:06:17.970156 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b" (UID: "e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:17.974565 master-0 kubenswrapper[29936]: I1205 13:06:17.973339 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-config" (OuterVolumeSpecName: "config") pod "e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b" (UID: "e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:18.010617 master-0 kubenswrapper[29936]: I1205 13:06:18.009830 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b" (UID: "e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:18.021775 master-0 kubenswrapper[29936]: I1205 13:06:18.021708 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-config\") pod \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " Dec 05 13:06:18.021775 master-0 kubenswrapper[29936]: I1205 13:06:18.021768 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t627\" (UniqueName: \"kubernetes.io/projected/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-kube-api-access-2t627\") pod \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " Dec 05 13:06:18.022087 master-0 kubenswrapper[29936]: I1205 13:06:18.021951 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-ovsdbserver-sb\") pod \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " Dec 05 13:06:18.022087 master-0 kubenswrapper[29936]: I1205 13:06:18.021981 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-ovsdbserver-nb\") pod \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " Dec 05 13:06:18.031748 master-0 kubenswrapper[29936]: I1205 13:06:18.026987 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-dns-swift-storage-0\") pod \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " Dec 05 13:06:18.031748 master-0 kubenswrapper[29936]: I1205 13:06:18.027115 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-dns-svc\") pod 
\"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\" (UID: \"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc\") " Dec 05 13:06:18.031748 master-0 kubenswrapper[29936]: I1205 13:06:18.029643 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-kube-api-access-2t627" (OuterVolumeSpecName: "kube-api-access-2t627") pod "3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc" (UID: "3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc"). InnerVolumeSpecName "kube-api-access-2t627". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:18.040461 master-0 kubenswrapper[29936]: I1205 13:06:18.039094 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-httpd-run\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:18.040461 master-0 kubenswrapper[29936]: I1205 13:06:18.039201 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-scripts\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:18.040461 master-0 kubenswrapper[29936]: I1205 13:06:18.039547 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-logs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:18.040461 master-0 kubenswrapper[29936]: I1205 13:06:18.039630 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-config-data\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:18.040461 master-0 kubenswrapper[29936]: I1205 13:06:18.039689 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:18.040461 master-0 kubenswrapper[29936]: I1205 13:06:18.039727 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-httpd-run\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:18.040461 master-0 kubenswrapper[29936]: I1205 13:06:18.039758 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76z64\" (UniqueName: \"kubernetes.io/projected/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-kube-api-access-76z64\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:18.040461 master-0 kubenswrapper[29936]: I1205 13:06:18.040044 29936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-combined-ca-bundle\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:18.040461 master-0 kubenswrapper[29936]: I1205 13:06:18.040057 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-logs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:18.040461 master-0 kubenswrapper[29936]: I1205 13:06:18.040474 29936 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.041379 master-0 kubenswrapper[29936]: I1205 13:06:18.040492 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t627\" (UniqueName: \"kubernetes.io/projected/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-kube-api-access-2t627\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.041379 master-0 kubenswrapper[29936]: I1205 13:06:18.040505 29936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.041379 master-0 kubenswrapper[29936]: I1205 13:06:18.040514 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.041379 master-0 kubenswrapper[29936]: I1205 13:06:18.040524 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.041379 master-0 kubenswrapper[29936]: I1205 13:06:18.040535 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.041379 master-0 kubenswrapper[29936]: I1205 13:06:18.040546 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmpfh\" (UniqueName: \"kubernetes.io/projected/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b-kube-api-access-pmpfh\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.042172 master-0 kubenswrapper[29936]: I1205 13:06:18.042142 29936 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 13:06:18.042247 master-0 kubenswrapper[29936]: I1205 13:06:18.042205 29936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/49c247ccb75d82d7aa2d53a927c5c3fb8512a27de6927aa154d2e3366fa1652b/globalmount\"" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:18.045458 master-0 kubenswrapper[29936]: I1205 13:06:18.044937 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-config-data\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:18.051018 master-0 kubenswrapper[29936]: I1205 13:06:18.050909 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-scripts\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:18.052752 master-0 kubenswrapper[29936]: I1205 13:06:18.052694 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc" (UID: "3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:18.076808 master-0 kubenswrapper[29936]: I1205 13:06:18.072950 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-combined-ca-bundle\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:18.076808 master-0 kubenswrapper[29936]: I1205 13:06:18.076698 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76z64\" (UniqueName: \"kubernetes.io/projected/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-kube-api-access-76z64\") pod \"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:18.083519 master-0 kubenswrapper[29936]: I1205 13:06:18.083220 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc" (UID: "3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:18.095262 master-0 kubenswrapper[29936]: I1205 13:06:18.094162 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-config" (OuterVolumeSpecName: "config") pod "3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc" (UID: "3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:18.105141 master-0 kubenswrapper[29936]: I1205 13:06:18.097754 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc" (UID: "3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:18.112035 master-0 kubenswrapper[29936]: I1205 13:06:18.111591 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:06:18.112691 master-0 kubenswrapper[29936]: E1205 13:06:18.112659 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-b46d8-default-external-api-0" podUID="05bf92ec-c921-41b2-bf1d-ca8bd4d52531" Dec 05 13:06:18.124275 master-0 kubenswrapper[29936]: I1205 13:06:18.123849 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc" (UID: "3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:18.128095 master-0 kubenswrapper[29936]: I1205 13:06:18.128013 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:06:18.128918 master-0 kubenswrapper[29936]: E1205 13:06:18.128585 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b" containerName="init" Dec 05 13:06:18.128918 master-0 kubenswrapper[29936]: I1205 13:06:18.128601 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b" containerName="init" Dec 05 13:06:18.128918 master-0 kubenswrapper[29936]: E1205 13:06:18.128618 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc" containerName="init" Dec 05 13:06:18.128918 master-0 kubenswrapper[29936]: I1205 13:06:18.128625 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc" containerName="init" Dec 05 13:06:18.128918 master-0 kubenswrapper[29936]: I1205 13:06:18.128853 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc" containerName="init" Dec 05 13:06:18.128918 master-0 kubenswrapper[29936]: I1205 13:06:18.128908 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b" containerName="init" Dec 05 13:06:18.131497 master-0 kubenswrapper[29936]: I1205 13:06:18.130058 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.132494 master-0 kubenswrapper[29936]: I1205 13:06:18.132440 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b46d8-default-internal-config-data" Dec 05 13:06:18.142778 master-0 kubenswrapper[29936]: I1205 13:06:18.142685 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.142778 master-0 kubenswrapper[29936]: I1205 13:06:18.142734 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.142778 master-0 kubenswrapper[29936]: I1205 13:06:18.142746 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.143273 master-0 kubenswrapper[29936]: I1205 13:06:18.142758 29936 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.143273 master-0 kubenswrapper[29936]: I1205 13:06:18.142956 29936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.171965 master-0 kubenswrapper[29936]: I1205 13:06:18.171824 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:06:18.231028 master-0 kubenswrapper[29936]: I1205 13:06:18.230937 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:06:18.232774 master-0 kubenswrapper[29936]: E1205 13:06:18.232718 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run kube-api-access-svh9r logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-b46d8-default-internal-api-0" podUID="e2629c18-e59e-48de-bde3-41d5a7452a70" Dec 05 13:06:18.241800 master-0 kubenswrapper[29936]: I1205 13:06:18.241647 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" Dec 05 13:06:18.242017 master-0 kubenswrapper[29936]: I1205 13:06:18.241790 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b4ff44b9-8q6mg" event={"ID":"e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b","Type":"ContainerDied","Data":"eae51f214628225d348fda2c6be776cf432b3c41712e009df1ce3b64444f7034"} Dec 05 13:06:18.242017 master-0 kubenswrapper[29936]: I1205 13:06:18.241899 29936 scope.go:117] "RemoveContainer" containerID="8db18ab70376f9ce2a7f5dd54d7510d037dad0f6f2f50682f6c860ff0fe0dd56" Dec 05 13:06:18.246274 master-0 kubenswrapper[29936]: I1205 13:06:18.244978 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svh9r\" (UniqueName: \"kubernetes.io/projected/e2629c18-e59e-48de-bde3-41d5a7452a70-kube-api-access-svh9r\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.246274 master-0 kubenswrapper[29936]: I1205 13:06:18.245071 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-scripts\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.246274 master-0 kubenswrapper[29936]: I1205 13:06:18.245230 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2629c18-e59e-48de-bde3-41d5a7452a70-logs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.246274 master-0 kubenswrapper[29936]: I1205 13:06:18.245282 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.246274 master-0 kubenswrapper[29936]: I1205 13:06:18.245332 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2629c18-e59e-48de-bde3-41d5a7452a70-httpd-run\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.246274 master-0 kubenswrapper[29936]: I1205 13:06:18.245647 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-config-data\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.246274 master-0 kubenswrapper[29936]: I1205 13:06:18.245783 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-combined-ca-bundle\") pod \"glance-b46d8-default-internal-api-0\" (UID: 
\"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.253487 master-0 kubenswrapper[29936]: I1205 13:06:18.249844 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" event={"ID":"3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc","Type":"ContainerDied","Data":"ea134aeb20810384fa8ed61606e871ca2fd5e946461cfd99a390bed0c9a2b2a2"} Dec 05 13:06:18.253487 master-0 kubenswrapper[29936]: I1205 13:06:18.249884 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74c65c7fc-zhsb4" Dec 05 13:06:18.253969 master-0 kubenswrapper[29936]: I1205 13:06:18.253646 29936 generic.go:334] "Generic (PLEG): container finished" podID="9a285173-334c-409d-87a5-9c8e18c77f50" containerID="831e37b16587efc185886f30c74346daf9db35cf41f7b5b2159773c166b4cf09" exitCode=0 Dec 05 13:06:18.254420 master-0 kubenswrapper[29936]: I1205 13:06:18.254362 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" event={"ID":"9a285173-334c-409d-87a5-9c8e18c77f50","Type":"ContainerDied","Data":"831e37b16587efc185886f30c74346daf9db35cf41f7b5b2159773c166b4cf09"} Dec 05 13:06:18.254529 master-0 kubenswrapper[29936]: I1205 13:06:18.254435 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" event={"ID":"9a285173-334c-409d-87a5-9c8e18c77f50","Type":"ContainerStarted","Data":"2bcbfaf79ebe2dbd77659cb5bbd923a480043161f71b8d7ec94a9fe4b7ca35a7"} Dec 05 13:06:18.255538 master-0 kubenswrapper[29936]: I1205 13:06:18.255512 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:18.293488 master-0 kubenswrapper[29936]: I1205 13:06:18.292709 29936 scope.go:117] "RemoveContainer" containerID="4f7bc339c74b5068502ab76290c63fa3c1f792b2abe6a753ca468ee53ea5ab20" Dec 05 13:06:18.344542 master-0 kubenswrapper[29936]: I1205 13:06:18.338337 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:18.348961 master-0 kubenswrapper[29936]: I1205 13:06:18.348810 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-combined-ca-bundle\") pod \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " Dec 05 13:06:18.348961 master-0 kubenswrapper[29936]: I1205 13:06:18.348920 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-logs\") pod \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " Dec 05 13:06:18.349694 master-0 kubenswrapper[29936]: I1205 13:06:18.349030 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-scripts\") pod \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " Dec 05 13:06:18.349694 master-0 kubenswrapper[29936]: I1205 13:06:18.349089 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-config-data\") pod \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " Dec 05 13:06:18.349694 master-0 kubenswrapper[29936]: I1205 13:06:18.349128 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76z64\" (UniqueName: \"kubernetes.io/projected/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-kube-api-access-76z64\") pod \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " Dec 05 13:06:18.349694 master-0 kubenswrapper[29936]: I1205 13:06:18.349391 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-httpd-run\") pod \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " Dec 05 13:06:18.349694 master-0 kubenswrapper[29936]: I1205 13:06:18.349463 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-logs" (OuterVolumeSpecName: "logs") pod "05bf92ec-c921-41b2-bf1d-ca8bd4d52531" (UID: "05bf92ec-c921-41b2-bf1d-ca8bd4d52531"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:06:18.350341 master-0 kubenswrapper[29936]: I1205 13:06:18.350320 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2629c18-e59e-48de-bde3-41d5a7452a70-logs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.350406 master-0 kubenswrapper[29936]: I1205 13:06:18.350393 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.350521 master-0 kubenswrapper[29936]: I1205 13:06:18.350457 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2629c18-e59e-48de-bde3-41d5a7452a70-httpd-run\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.350521 master-0 kubenswrapper[29936]: I1205 13:06:18.350513 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-config-data\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.350600 master-0 kubenswrapper[29936]: I1205 13:06:18.350576 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-combined-ca-bundle\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.359234 master-0 kubenswrapper[29936]: I1205 13:06:18.350723 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svh9r\" (UniqueName: \"kubernetes.io/projected/e2629c18-e59e-48de-bde3-41d5a7452a70-kube-api-access-svh9r\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.359234 master-0 kubenswrapper[29936]: I1205 13:06:18.350759 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-scripts\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.359234 master-0 kubenswrapper[29936]: I1205 13:06:18.350974 29936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.359234 master-0 kubenswrapper[29936]: I1205 13:06:18.353173 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "05bf92ec-c921-41b2-bf1d-ca8bd4d52531" (UID: 
"05bf92ec-c921-41b2-bf1d-ca8bd4d52531"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:06:18.359234 master-0 kubenswrapper[29936]: I1205 13:06:18.353704 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2629c18-e59e-48de-bde3-41d5a7452a70-httpd-run\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.359234 master-0 kubenswrapper[29936]: I1205 13:06:18.353939 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2629c18-e59e-48de-bde3-41d5a7452a70-logs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.364223 master-0 kubenswrapper[29936]: I1205 13:06:18.361670 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-config-data" (OuterVolumeSpecName: "config-data") pod "05bf92ec-c921-41b2-bf1d-ca8bd4d52531" (UID: "05bf92ec-c921-41b2-bf1d-ca8bd4d52531"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:18.364223 master-0 kubenswrapper[29936]: I1205 13:06:18.361938 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-kube-api-access-76z64" (OuterVolumeSpecName: "kube-api-access-76z64") pod "05bf92ec-c921-41b2-bf1d-ca8bd4d52531" (UID: "05bf92ec-c921-41b2-bf1d-ca8bd4d52531"). InnerVolumeSpecName "kube-api-access-76z64". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:18.364962 master-0 kubenswrapper[29936]: I1205 13:06:18.364897 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-scripts\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.371435 master-0 kubenswrapper[29936]: I1205 13:06:18.368153 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-config-data\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.371435 master-0 kubenswrapper[29936]: I1205 13:06:18.370023 29936 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 13:06:18.371435 master-0 kubenswrapper[29936]: I1205 13:06:18.370095 29936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/c11f3d51f30df7daf7a2bb71b828158f184983aebcc191306ab2ed71e7a567d1/globalmount\"" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.371435 master-0 kubenswrapper[29936]: I1205 13:06:18.378157 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-combined-ca-bundle\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.371435 master-0 kubenswrapper[29936]: I1205 13:06:18.396735 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-scripts" (OuterVolumeSpecName: "scripts") pod "05bf92ec-c921-41b2-bf1d-ca8bd4d52531" (UID: "05bf92ec-c921-41b2-bf1d-ca8bd4d52531"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:18.405306 master-0 kubenswrapper[29936]: I1205 13:06:18.401247 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05bf92ec-c921-41b2-bf1d-ca8bd4d52531" (UID: "05bf92ec-c921-41b2-bf1d-ca8bd4d52531"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:18.419149 master-0 kubenswrapper[29936]: I1205 13:06:18.419082 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svh9r\" (UniqueName: \"kubernetes.io/projected/e2629c18-e59e-48de-bde3-41d5a7452a70-kube-api-access-svh9r\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:18.462194 master-0 kubenswrapper[29936]: I1205 13:06:18.455241 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.462194 master-0 kubenswrapper[29936]: I1205 13:06:18.455316 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.462194 master-0 kubenswrapper[29936]: I1205 13:06:18.455326 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.462194 master-0 kubenswrapper[29936]: I1205 13:06:18.455343 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76z64\" (UniqueName: \"kubernetes.io/projected/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-kube-api-access-76z64\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.462194 master-0 kubenswrapper[29936]: I1205 13:06:18.455357 29936 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05bf92ec-c921-41b2-bf1d-ca8bd4d52531-httpd-run\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:18.837656 master-0 kubenswrapper[29936]: I1205 13:06:18.817128 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55b4ff44b9-8q6mg"] Dec 05 13:06:18.895506 master-0 kubenswrapper[29936]: I1205 13:06:18.846097 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55b4ff44b9-8q6mg"] Dec 05 13:06:18.922838 master-0 kubenswrapper[29936]: I1205 13:06:18.920578 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74c65c7fc-zhsb4"] Dec 05 13:06:18.957223 master-0 kubenswrapper[29936]: I1205 13:06:18.957091 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74c65c7fc-zhsb4"] Dec 05 13:06:19.204830 master-0 kubenswrapper[29936]: I1205 13:06:19.204729 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc" path="/var/lib/kubelet/pods/3b3e7704-5fe0-4fde-81ba-e3ae1f3132fc/volumes" Dec 05 13:06:19.205561 master-0 kubenswrapper[29936]: I1205 13:06:19.205547 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b" path="/var/lib/kubelet/pods/e5eb502a-912b-4f3e-af55-c1cd9ab9fe9b/volumes" Dec 05 13:06:19.275312 master-0 kubenswrapper[29936]: I1205 13:06:19.275245 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" event={"ID":"9a285173-334c-409d-87a5-9c8e18c77f50","Type":"ContainerStarted","Data":"70524485e77b0f63ff22a28523a8a5fd365423c09029a91b199af308f5ff9cc9"} Dec 05 13:06:19.275571 master-0 kubenswrapper[29936]: I1205 13:06:19.275534 29936 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:19.281592 master-0 kubenswrapper[29936]: I1205 13:06:19.281529 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.281592 master-0 kubenswrapper[29936]: I1205 13:06:19.281577 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:19.295853 master-0 kubenswrapper[29936]: I1205 13:06:19.295736 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:19.351917 master-0 kubenswrapper[29936]: I1205 13:06:19.351436 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" podStartSLOduration=4.351407884 podStartE2EDuration="4.351407884s" podCreationTimestamp="2025-12-05 13:06:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:06:19.309037549 +0000 UTC m=+976.441117240" watchObservedRunningTime="2025-12-05 13:06:19.351407884 +0000 UTC m=+976.483487555" Dec 05 13:06:19.397448 master-0 kubenswrapper[29936]: I1205 13:06:19.392475 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:06:19.405468 master-0 kubenswrapper[29936]: I1205 13:06:19.405409 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:06:19.436066 master-0 kubenswrapper[29936]: I1205 13:06:19.435992 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:06:19.440686 master-0 kubenswrapper[29936]: I1205 13:06:19.440619 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.441417 master-0 kubenswrapper[29936]: I1205 13:06:19.441328 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:06:19.447614 master-0 kubenswrapper[29936]: I1205 13:06:19.447503 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b46d8-default-external-config-data" Dec 05 13:06:19.499929 master-0 kubenswrapper[29936]: I1205 13:06:19.499841 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svh9r\" (UniqueName: \"kubernetes.io/projected/e2629c18-e59e-48de-bde3-41d5a7452a70-kube-api-access-svh9r\") pod \"e2629c18-e59e-48de-bde3-41d5a7452a70\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " Dec 05 13:06:19.500453 master-0 kubenswrapper[29936]: I1205 13:06:19.500392 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2629c18-e59e-48de-bde3-41d5a7452a70-httpd-run\") pod \"e2629c18-e59e-48de-bde3-41d5a7452a70\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " Dec 05 13:06:19.500618 master-0 kubenswrapper[29936]: I1205 13:06:19.500553 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-combined-ca-bundle\") pod \"e2629c18-e59e-48de-bde3-41d5a7452a70\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " Dec 05 13:06:19.502148 master-0 kubenswrapper[29936]: I1205 13:06:19.501087 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2629c18-e59e-48de-bde3-41d5a7452a70-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e2629c18-e59e-48de-bde3-41d5a7452a70" (UID: "e2629c18-e59e-48de-bde3-41d5a7452a70"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:06:19.508356 master-0 kubenswrapper[29936]: I1205 13:06:19.501249 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-scripts\") pod \"e2629c18-e59e-48de-bde3-41d5a7452a70\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " Dec 05 13:06:19.508356 master-0 kubenswrapper[29936]: I1205 13:06:19.507018 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2629c18-e59e-48de-bde3-41d5a7452a70-kube-api-access-svh9r" (OuterVolumeSpecName: "kube-api-access-svh9r") pod "e2629c18-e59e-48de-bde3-41d5a7452a70" (UID: "e2629c18-e59e-48de-bde3-41d5a7452a70"). InnerVolumeSpecName "kube-api-access-svh9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:19.508356 master-0 kubenswrapper[29936]: I1205 13:06:19.508003 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2629c18-e59e-48de-bde3-41d5a7452a70-logs\") pod \"e2629c18-e59e-48de-bde3-41d5a7452a70\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " Dec 05 13:06:19.508356 master-0 kubenswrapper[29936]: I1205 13:06:19.508335 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-config-data\") pod \"e2629c18-e59e-48de-bde3-41d5a7452a70\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " Dec 05 13:06:19.510064 master-0 kubenswrapper[29936]: I1205 13:06:19.509780 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2629c18-e59e-48de-bde3-41d5a7452a70-logs" (OuterVolumeSpecName: "logs") pod "e2629c18-e59e-48de-bde3-41d5a7452a70" (UID: "e2629c18-e59e-48de-bde3-41d5a7452a70"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:06:19.510064 master-0 kubenswrapper[29936]: I1205 13:06:19.509972 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-scripts" (OuterVolumeSpecName: "scripts") pod "e2629c18-e59e-48de-bde3-41d5a7452a70" (UID: "e2629c18-e59e-48de-bde3-41d5a7452a70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:19.512900 master-0 kubenswrapper[29936]: I1205 13:06:19.512771 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:19.512900 master-0 kubenswrapper[29936]: I1205 13:06:19.512804 29936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2629c18-e59e-48de-bde3-41d5a7452a70-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:19.512900 master-0 kubenswrapper[29936]: I1205 13:06:19.512815 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svh9r\" (UniqueName: \"kubernetes.io/projected/e2629c18-e59e-48de-bde3-41d5a7452a70-kube-api-access-svh9r\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:19.512900 master-0 kubenswrapper[29936]: I1205 13:06:19.512827 29936 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e2629c18-e59e-48de-bde3-41d5a7452a70-httpd-run\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:19.513695 master-0 kubenswrapper[29936]: I1205 13:06:19.513661 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-config-data" (OuterVolumeSpecName: "config-data") pod "e2629c18-e59e-48de-bde3-41d5a7452a70" (UID: "e2629c18-e59e-48de-bde3-41d5a7452a70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:19.535262 master-0 kubenswrapper[29936]: I1205 13:06:19.535134 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2629c18-e59e-48de-bde3-41d5a7452a70" (UID: "e2629c18-e59e-48de-bde3-41d5a7452a70"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:19.618822 master-0 kubenswrapper[29936]: I1205 13:06:19.618605 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a2c9ca2-9c02-4227-bece-76f7c63b253e-httpd-run\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.619151 master-0 kubenswrapper[29936]: I1205 13:06:19.618888 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fjnk\" (UniqueName: \"kubernetes.io/projected/6a2c9ca2-9c02-4227-bece-76f7c63b253e-kube-api-access-7fjnk\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.619151 master-0 kubenswrapper[29936]: I1205 13:06:19.618989 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-scripts\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.619151 master-0 kubenswrapper[29936]: I1205 13:06:19.619066 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2c9ca2-9c02-4227-bece-76f7c63b253e-logs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.619366 master-0 kubenswrapper[29936]: I1205 13:06:19.619218 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-combined-ca-bundle\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.619366 master-0 kubenswrapper[29936]: I1205 13:06:19.619300 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-config-data\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.619499 master-0 kubenswrapper[29936]: I1205 13:06:19.619427 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:19.619499 master-0 kubenswrapper[29936]: I1205 13:06:19.619443 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2629c18-e59e-48de-bde3-41d5a7452a70-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:19.623526 master-0 kubenswrapper[29936]: I1205 13:06:19.623469 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod 
\"glance-b46d8-default-external-api-0\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.722979 master-0 kubenswrapper[29936]: I1205 13:06:19.721601 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\" (UID: \"05bf92ec-c921-41b2-bf1d-ca8bd4d52531\") " Dec 05 13:06:19.722979 master-0 kubenswrapper[29936]: I1205 13:06:19.722678 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fjnk\" (UniqueName: \"kubernetes.io/projected/6a2c9ca2-9c02-4227-bece-76f7c63b253e-kube-api-access-7fjnk\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.729879 master-0 kubenswrapper[29936]: I1205 13:06:19.728481 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-scripts\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.729879 master-0 kubenswrapper[29936]: I1205 13:06:19.728745 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2c9ca2-9c02-4227-bece-76f7c63b253e-logs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.729879 master-0 kubenswrapper[29936]: I1205 13:06:19.728950 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-combined-ca-bundle\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.729879 master-0 kubenswrapper[29936]: I1205 13:06:19.729047 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-config-data\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.729879 master-0 kubenswrapper[29936]: I1205 13:06:19.729328 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2c9ca2-9c02-4227-bece-76f7c63b253e-logs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.729879 master-0 kubenswrapper[29936]: I1205 13:06:19.729396 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a2c9ca2-9c02-4227-bece-76f7c63b253e-httpd-run\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.730434 master-0 kubenswrapper[29936]: I1205 13:06:19.730406 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6a2c9ca2-9c02-4227-bece-76f7c63b253e-httpd-run\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.737740 master-0 kubenswrapper[29936]: I1205 13:06:19.736857 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-config-data\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.737740 master-0 kubenswrapper[29936]: I1205 13:06:19.736877 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-scripts\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.742070 master-0 kubenswrapper[29936]: I1205 13:06:19.742023 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-combined-ca-bundle\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:19.755567 master-0 kubenswrapper[29936]: I1205 13:06:19.755494 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fjnk\" (UniqueName: \"kubernetes.io/projected/6a2c9ca2-9c02-4227-bece-76f7c63b253e-kube-api-access-7fjnk\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:20.299436 master-0 kubenswrapper[29936]: I1205 13:06:20.296792 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.369678 master-0 kubenswrapper[29936]: E1205 13:06:21.369547 29936 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc57f24f_5221_49a8_b6ae_03a6a4d0c5d5.slice/crio-d37f4922a4c5f2d7041603a8e09aa08447bad4f98e7d1beff4eea2d643f97308.scope\": RecentStats: unable to find data in memory cache]" Dec 05 13:06:21.431773 master-0 kubenswrapper[29936]: I1205 13:06:21.431703 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.521251 master-0 kubenswrapper[29936]: I1205 13:06:21.519688 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:06:21.565443 master-0 kubenswrapper[29936]: I1205 13:06:21.565337 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:06:21.589446 master-0 kubenswrapper[29936]: I1205 13:06:21.589371 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") pod \"e2629c18-e59e-48de-bde3-41d5a7452a70\" (UID: \"e2629c18-e59e-48de-bde3-41d5a7452a70\") " Dec 05 13:06:21.609804 master-0 kubenswrapper[29936]: I1205 13:06:21.607569 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:06:21.609804 master-0 kubenswrapper[29936]: I1205 13:06:21.609654 29936 trace.go:236] Trace[712138441]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (05-Dec-2025 13:06:19.221) (total time: 2388ms): Dec 05 13:06:21.609804 master-0 kubenswrapper[29936]: Trace[712138441]: [2.388263533s] [2.388263533s] END Dec 05 13:06:21.611743 master-0 kubenswrapper[29936]: I1205 13:06:21.611701 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.617331 master-0 kubenswrapper[29936]: I1205 13:06:21.617226 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-0279-account-create-update-wclm4" Dec 05 13:06:21.617683 master-0 kubenswrapper[29936]: I1205 13:06:21.617479 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b46d8-default-internal-config-data" Dec 05 13:06:21.622279 master-0 kubenswrapper[29936]: I1205 13:06:21.622219 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:06:21.632962 master-0 kubenswrapper[29936]: I1205 13:06:21.632890 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c" (OuterVolumeSpecName: "glance") pod "05bf92ec-c921-41b2-bf1d-ca8bd4d52531" (UID: "05bf92ec-c921-41b2-bf1d-ca8bd4d52531"). InnerVolumeSpecName "pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 13:06:21.653474 master-0 kubenswrapper[29936]: I1205 13:06:21.652611 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f" (OuterVolumeSpecName: "glance") pod "e2629c18-e59e-48de-bde3-41d5a7452a70" (UID: "e2629c18-e59e-48de-bde3-41d5a7452a70"). InnerVolumeSpecName "pvc-0d3529a7-2405-43a7-8986-74c66fb23772". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 13:06:21.692014 master-0 kubenswrapper[29936]: I1205 13:06:21.691932 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d76a2c1-3728-4104-b208-67b329e52d70-operator-scripts\") pod \"9d76a2c1-3728-4104-b208-67b329e52d70\" (UID: \"9d76a2c1-3728-4104-b208-67b329e52d70\") " Dec 05 13:06:21.692392 master-0 kubenswrapper[29936]: I1205 13:06:21.692124 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2w55\" (UniqueName: \"kubernetes.io/projected/9d76a2c1-3728-4104-b208-67b329e52d70-kube-api-access-r2w55\") pod \"9d76a2c1-3728-4104-b208-67b329e52d70\" (UID: \"9d76a2c1-3728-4104-b208-67b329e52d70\") " Dec 05 13:06:21.692392 master-0 kubenswrapper[29936]: I1205 13:06:21.692354 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-logs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.692392 master-0 kubenswrapper[29936]: I1205 13:06:21.692377 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-httpd-run\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.692524 master-0 kubenswrapper[29936]: I1205 13:06:21.692432 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-combined-ca-bundle\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.692524 master-0 kubenswrapper[29936]: I1205 13:06:21.692473 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.692609 master-0 kubenswrapper[29936]: I1205 13:06:21.692535 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:21.692609 master-0 kubenswrapper[29936]: I1205 13:06:21.692573 29936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-scripts\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.692683 master-0 kubenswrapper[29936]: I1205 13:06:21.692619 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-config-data\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.692683 master-0 kubenswrapper[29936]: I1205 13:06:21.692646 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhft2\" (UniqueName: \"kubernetes.io/projected/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-kube-api-access-lhft2\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.693623 master-0 kubenswrapper[29936]: I1205 13:06:21.693586 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d76a2c1-3728-4104-b208-67b329e52d70-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d76a2c1-3728-4104-b208-67b329e52d70" (UID: "9d76a2c1-3728-4104-b208-67b329e52d70"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:21.721261 master-0 kubenswrapper[29936]: I1205 13:06:21.720426 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d76a2c1-3728-4104-b208-67b329e52d70-kube-api-access-r2w55" (OuterVolumeSpecName: "kube-api-access-r2w55") pod "9d76a2c1-3728-4104-b208-67b329e52d70" (UID: "9d76a2c1-3728-4104-b208-67b329e52d70"). InnerVolumeSpecName "kube-api-access-r2w55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:21.798864 master-0 kubenswrapper[29936]: I1205 13:06:21.795400 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-combined-ca-bundle\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.798864 master-0 kubenswrapper[29936]: I1205 13:06:21.797230 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-scripts\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.798864 master-0 kubenswrapper[29936]: I1205 13:06:21.797376 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-config-data\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.798864 master-0 kubenswrapper[29936]: I1205 13:06:21.798264 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhft2\" (UniqueName: \"kubernetes.io/projected/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-kube-api-access-lhft2\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.798864 master-0 kubenswrapper[29936]: I1205 13:06:21.798477 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-logs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.798864 master-0 kubenswrapper[29936]: I1205 13:06:21.798509 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-httpd-run\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.799303 master-0 kubenswrapper[29936]: I1205 13:06:21.799084 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2w55\" (UniqueName: \"kubernetes.io/projected/9d76a2c1-3728-4104-b208-67b329e52d70-kube-api-access-r2w55\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:21.799303 master-0 kubenswrapper[29936]: I1205 13:06:21.799111 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d76a2c1-3728-4104-b208-67b329e52d70-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:21.803196 master-0 kubenswrapper[29936]: I1205 13:06:21.801625 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-logs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.804197 master-0 
kubenswrapper[29936]: I1205 13:06:21.803706 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-scripts\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.804197 master-0 kubenswrapper[29936]: I1205 13:06:21.803900 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-httpd-run\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.804300 master-0 kubenswrapper[29936]: I1205 13:06:21.803429 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-config-data\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.807646 master-0 kubenswrapper[29936]: I1205 13:06:21.807532 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-combined-ca-bundle\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.818162 master-0 kubenswrapper[29936]: I1205 13:06:21.818109 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhft2\" (UniqueName: \"kubernetes.io/projected/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-kube-api-access-lhft2\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:21.828355 master-0 kubenswrapper[29936]: I1205 13:06:21.828307 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-9fz4t" Dec 05 13:06:21.901968 master-0 kubenswrapper[29936]: I1205 13:06:21.901469 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wft9t\" (UniqueName: \"kubernetes.io/projected/72d08438-77de-4a6e-81d7-a9f76078a1b6-kube-api-access-wft9t\") pod \"72d08438-77de-4a6e-81d7-a9f76078a1b6\" (UID: \"72d08438-77de-4a6e-81d7-a9f76078a1b6\") " Dec 05 13:06:21.901968 master-0 kubenswrapper[29936]: I1205 13:06:21.901796 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d08438-77de-4a6e-81d7-a9f76078a1b6-operator-scripts\") pod \"72d08438-77de-4a6e-81d7-a9f76078a1b6\" (UID: \"72d08438-77de-4a6e-81d7-a9f76078a1b6\") " Dec 05 13:06:21.904644 master-0 kubenswrapper[29936]: I1205 13:06:21.904460 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72d08438-77de-4a6e-81d7-a9f76078a1b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72d08438-77de-4a6e-81d7-a9f76078a1b6" (UID: "72d08438-77de-4a6e-81d7-a9f76078a1b6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:21.908834 master-0 kubenswrapper[29936]: I1205 13:06:21.908775 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72d08438-77de-4a6e-81d7-a9f76078a1b6-kube-api-access-wft9t" (OuterVolumeSpecName: "kube-api-access-wft9t") pod "72d08438-77de-4a6e-81d7-a9f76078a1b6" (UID: "72d08438-77de-4a6e-81d7-a9f76078a1b6"). InnerVolumeSpecName "kube-api-access-wft9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:22.007794 master-0 kubenswrapper[29936]: I1205 13:06:22.007712 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d08438-77de-4a6e-81d7-a9f76078a1b6-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:22.007794 master-0 kubenswrapper[29936]: I1205 13:06:22.007774 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wft9t\" (UniqueName: \"kubernetes.io/projected/72d08438-77de-4a6e-81d7-a9f76078a1b6-kube-api-access-wft9t\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:22.334203 master-0 kubenswrapper[29936]: I1205 13:06:22.333742 29936 generic.go:334] "Generic (PLEG): container finished" podID="dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5" containerID="d37f4922a4c5f2d7041603a8e09aa08447bad4f98e7d1beff4eea2d643f97308" exitCode=0 Dec 05 13:06:22.334203 master-0 kubenswrapper[29936]: I1205 13:06:22.333822 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hxjkn" event={"ID":"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5","Type":"ContainerDied","Data":"d37f4922a4c5f2d7041603a8e09aa08447bad4f98e7d1beff4eea2d643f97308"} Dec 05 13:06:22.339454 master-0 kubenswrapper[29936]: I1205 13:06:22.337520 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fqqdf" event={"ID":"21fcc891-90c5-47e7-97e9-e852adfae2bb","Type":"ContainerStarted","Data":"159c7b8004572d99bb7e1d0ce96a95f4f155c990d66f4df41aef77d1abfb8361"} Dec 05 13:06:22.341222 master-0 kubenswrapper[29936]: I1205 13:06:22.340628 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-0279-account-create-update-wclm4" event={"ID":"9d76a2c1-3728-4104-b208-67b329e52d70","Type":"ContainerDied","Data":"15e696002b7a51652b0f28b591acf47ffee19a46df16874f81b9ed97ee862298"} Dec 05 13:06:22.341222 master-0 kubenswrapper[29936]: I1205 13:06:22.340681 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15e696002b7a51652b0f28b591acf47ffee19a46df16874f81b9ed97ee862298" Dec 05 13:06:22.341222 master-0 kubenswrapper[29936]: I1205 13:06:22.340755 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-0279-account-create-update-wclm4" Dec 05 13:06:22.349358 master-0 kubenswrapper[29936]: I1205 13:06:22.348422 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-9fz4t" event={"ID":"72d08438-77de-4a6e-81d7-a9f76078a1b6","Type":"ContainerDied","Data":"73dd0b14b07761d534bd3676cabd5b4513db249a94d040e1f8c984f917822c45"} Dec 05 13:06:22.349358 master-0 kubenswrapper[29936]: I1205 13:06:22.348477 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73dd0b14b07761d534bd3676cabd5b4513db249a94d040e1f8c984f917822c45" Dec 05 13:06:22.349358 master-0 kubenswrapper[29936]: I1205 13:06:22.348551 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-create-9fz4t" Dec 05 13:06:22.392034 master-0 kubenswrapper[29936]: I1205 13:06:22.391935 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-fqqdf" podStartSLOduration=3.465238704 podStartE2EDuration="9.391909567s" podCreationTimestamp="2025-12-05 13:06:13 +0000 UTC" firstStartedPulling="2025-12-05 13:06:15.680979095 +0000 UTC m=+972.813058766" lastFinishedPulling="2025-12-05 13:06:21.607649948 +0000 UTC m=+978.739729629" observedRunningTime="2025-12-05 13:06:22.390121641 +0000 UTC m=+979.522201322" watchObservedRunningTime="2025-12-05 13:06:22.391909567 +0000 UTC m=+979.523989248" Dec 05 13:06:23.059511 master-0 kubenswrapper[29936]: I1205 13:06:23.059429 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod \"glance-b46d8-default-external-api-0\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:23.072246 master-0 kubenswrapper[29936]: I1205 13:06:23.070562 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:23.208256 master-0 kubenswrapper[29936]: I1205 13:06:23.207541 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05bf92ec-c921-41b2-bf1d-ca8bd4d52531" path="/var/lib/kubelet/pods/05bf92ec-c921-41b2-bf1d-ca8bd4d52531/volumes" Dec 05 13:06:23.208256 master-0 kubenswrapper[29936]: I1205 13:06:23.208122 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2629c18-e59e-48de-bde3-41d5a7452a70" path="/var/lib/kubelet/pods/e2629c18-e59e-48de-bde3-41d5a7452a70/volumes" Dec 05 13:06:23.944101 master-0 kubenswrapper[29936]: I1205 13:06:23.944033 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:23.987051 master-0 kubenswrapper[29936]: I1205 13:06:23.986971 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-combined-ca-bundle\") pod \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " Dec 05 13:06:23.987448 master-0 kubenswrapper[29936]: I1205 13:06:23.987267 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tct9d\" (UniqueName: \"kubernetes.io/projected/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-kube-api-access-tct9d\") pod \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " Dec 05 13:06:23.987448 master-0 kubenswrapper[29936]: I1205 13:06:23.987329 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-fernet-keys\") pod \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " Dec 05 13:06:23.987448 master-0 kubenswrapper[29936]: I1205 13:06:23.987430 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-scripts\") pod \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " Dec 05 13:06:23.987448 master-0 kubenswrapper[29936]: I1205 13:06:23.987454 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-config-data\") pod \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " Dec 05 13:06:23.987448 master-0 kubenswrapper[29936]: I1205 13:06:23.987487 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-credential-keys\") pod \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\" (UID: \"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5\") " Dec 05 13:06:23.996361 master-0 kubenswrapper[29936]: I1205 13:06:23.996252 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-kube-api-access-tct9d" (OuterVolumeSpecName: "kube-api-access-tct9d") pod "dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5" (UID: "dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5"). InnerVolumeSpecName "kube-api-access-tct9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:23.996361 master-0 kubenswrapper[29936]: I1205 13:06:23.996338 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-scripts" (OuterVolumeSpecName: "scripts") pod "dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5" (UID: "dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:24.000810 master-0 kubenswrapper[29936]: I1205 13:06:24.000572 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5" (UID: "dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5"). 
InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:24.007577 master-0 kubenswrapper[29936]: I1205 13:06:24.007441 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5" (UID: "dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:24.035165 master-0 kubenswrapper[29936]: I1205 13:06:24.035091 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-config-data" (OuterVolumeSpecName: "config-data") pod "dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5" (UID: "dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:24.057377 master-0 kubenswrapper[29936]: I1205 13:06:24.057296 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5" (UID: "dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:24.091548 master-0 kubenswrapper[29936]: I1205 13:06:24.091472 29936 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-credential-keys\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:24.091548 master-0 kubenswrapper[29936]: I1205 13:06:24.091540 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:24.091548 master-0 kubenswrapper[29936]: I1205 13:06:24.091558 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tct9d\" (UniqueName: \"kubernetes.io/projected/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-kube-api-access-tct9d\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:24.091938 master-0 kubenswrapper[29936]: I1205 13:06:24.091574 29936 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-fernet-keys\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:24.091938 master-0 kubenswrapper[29936]: I1205 13:06:24.091588 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:24.091938 master-0 kubenswrapper[29936]: I1205 13:06:24.091600 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:24.099204 master-0 kubenswrapper[29936]: I1205 13:06:24.099122 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:06:24.300208 master-0 kubenswrapper[29936]: I1205 13:06:24.298869 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-sync-h9l4n"] Dec 05 13:06:24.300208 master-0 kubenswrapper[29936]: E1205 
13:06:24.299646 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d76a2c1-3728-4104-b208-67b329e52d70" containerName="mariadb-account-create-update" Dec 05 13:06:24.300208 master-0 kubenswrapper[29936]: I1205 13:06:24.299664 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d76a2c1-3728-4104-b208-67b329e52d70" containerName="mariadb-account-create-update" Dec 05 13:06:24.300208 master-0 kubenswrapper[29936]: E1205 13:06:24.299727 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72d08438-77de-4a6e-81d7-a9f76078a1b6" containerName="mariadb-database-create" Dec 05 13:06:24.300208 master-0 kubenswrapper[29936]: I1205 13:06:24.299738 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="72d08438-77de-4a6e-81d7-a9f76078a1b6" containerName="mariadb-database-create" Dec 05 13:06:24.300208 master-0 kubenswrapper[29936]: E1205 13:06:24.299754 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5" containerName="keystone-bootstrap" Dec 05 13:06:24.300208 master-0 kubenswrapper[29936]: I1205 13:06:24.299760 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5" containerName="keystone-bootstrap" Dec 05 13:06:24.300208 master-0 kubenswrapper[29936]: I1205 13:06:24.300077 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="72d08438-77de-4a6e-81d7-a9f76078a1b6" containerName="mariadb-database-create" Dec 05 13:06:24.300208 master-0 kubenswrapper[29936]: I1205 13:06:24.300110 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5" containerName="keystone-bootstrap" Dec 05 13:06:24.300208 master-0 kubenswrapper[29936]: I1205 13:06:24.300138 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d76a2c1-3728-4104-b208-67b329e52d70" containerName="mariadb-account-create-update" Dec 05 13:06:24.303726 master-0 kubenswrapper[29936]: I1205 13:06:24.301661 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.313226 master-0 kubenswrapper[29936]: I1205 13:06:24.308727 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Dec 05 13:06:24.313226 master-0 kubenswrapper[29936]: I1205 13:06:24.309059 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-scripts" Dec 05 13:06:24.407213 master-0 kubenswrapper[29936]: I1205 13:06:24.404419 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-scripts\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.407213 master-0 kubenswrapper[29936]: I1205 13:06:24.404506 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-config-data\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.407213 master-0 kubenswrapper[29936]: I1205 13:06:24.404551 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgbw5\" (UniqueName: \"kubernetes.io/projected/1690e553-8b77-483f-9f31-4f3968e6bd28-kube-api-access-cgbw5\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.407213 master-0 kubenswrapper[29936]: I1205 13:06:24.404585 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1690e553-8b77-483f-9f31-4f3968e6bd28-etc-podinfo\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.407213 master-0 kubenswrapper[29936]: I1205 13:06:24.404627 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1690e553-8b77-483f-9f31-4f3968e6bd28-config-data-merged\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.407213 master-0 kubenswrapper[29936]: I1205 13:06:24.404653 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-combined-ca-bundle\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.433213 master-0 kubenswrapper[29936]: I1205 13:06:24.429266 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-h9l4n"] Dec 05 13:06:24.483806 master-0 kubenswrapper[29936]: I1205 13:06:24.483730 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hxjkn" Dec 05 13:06:24.484399 master-0 kubenswrapper[29936]: I1205 13:06:24.484275 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hxjkn" event={"ID":"dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5","Type":"ContainerDied","Data":"4cb7fec689ec48daec3dd85a44e12ae50a8c95d05ed8e8dc2bf1131cd63d4788"} Dec 05 13:06:24.484480 master-0 kubenswrapper[29936]: I1205 13:06:24.484400 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cb7fec689ec48daec3dd85a44e12ae50a8c95d05ed8e8dc2bf1131cd63d4788" Dec 05 13:06:24.487877 master-0 kubenswrapper[29936]: I1205 13:06:24.487773 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-external-api-0" event={"ID":"6a2c9ca2-9c02-4227-bece-76f7c63b253e","Type":"ContainerStarted","Data":"64894524f5da148b9cf066b28c1d4bcf823f484545f979477dded27064876363"} Dec 05 13:06:24.495553 master-0 kubenswrapper[29936]: I1205 13:06:24.495469 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:24.509610 master-0 kubenswrapper[29936]: I1205 13:06:24.509426 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-scripts\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.509610 master-0 kubenswrapper[29936]: I1205 13:06:24.509534 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-config-data\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.509610 master-0 kubenswrapper[29936]: I1205 13:06:24.509614 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgbw5\" (UniqueName: \"kubernetes.io/projected/1690e553-8b77-483f-9f31-4f3968e6bd28-kube-api-access-cgbw5\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.509973 master-0 kubenswrapper[29936]: I1205 13:06:24.509662 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1690e553-8b77-483f-9f31-4f3968e6bd28-etc-podinfo\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.509973 master-0 kubenswrapper[29936]: I1205 13:06:24.509715 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1690e553-8b77-483f-9f31-4f3968e6bd28-config-data-merged\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.509973 master-0 kubenswrapper[29936]: I1205 13:06:24.509755 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-combined-ca-bundle\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.518026 master-0 kubenswrapper[29936]: I1205 13:06:24.515842 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1690e553-8b77-483f-9f31-4f3968e6bd28-config-data-merged\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.519495 master-0 kubenswrapper[29936]: I1205 13:06:24.519451 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-scripts\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.530734 master-0 kubenswrapper[29936]: I1205 13:06:24.530554 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1690e553-8b77-483f-9f31-4f3968e6bd28-etc-podinfo\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.531467 master-0 kubenswrapper[29936]: I1205 13:06:24.531305 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-combined-ca-bundle\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.543301 master-0 kubenswrapper[29936]: I1205 13:06:24.537374 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgbw5\" (UniqueName: \"kubernetes.io/projected/1690e553-8b77-483f-9f31-4f3968e6bd28-kube-api-access-cgbw5\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.543301 master-0 kubenswrapper[29936]: I1205 13:06:24.537764 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-config-data\") pod \"ironic-db-sync-h9l4n\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.584732 master-0 kubenswrapper[29936]: I1205 13:06:24.584513 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hxjkn"] Dec 05 13:06:24.599467 master-0 kubenswrapper[29936]: I1205 13:06:24.597871 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:24.606224 master-0 kubenswrapper[29936]: I1205 13:06:24.606139 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hxjkn"] Dec 05 13:06:24.623439 master-0 kubenswrapper[29936]: I1205 13:06:24.623338 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kcsv8"] Dec 05 13:06:24.625805 master-0 kubenswrapper[29936]: I1205 13:06:24.625768 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.633284 master-0 kubenswrapper[29936]: I1205 13:06:24.632650 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 13:06:24.633284 master-0 kubenswrapper[29936]: I1205 13:06:24.632921 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 13:06:24.633875 master-0 kubenswrapper[29936]: I1205 13:06:24.633844 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 13:06:24.649906 master-0 kubenswrapper[29936]: I1205 13:06:24.649834 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kcsv8"] Dec 05 13:06:24.707736 master-0 kubenswrapper[29936]: I1205 13:06:24.706724 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:06:24.819415 master-0 kubenswrapper[29936]: I1205 13:06:24.819311 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-config-data\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.819766 master-0 kubenswrapper[29936]: I1205 13:06:24.819466 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-combined-ca-bundle\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.819766 master-0 kubenswrapper[29936]: I1205 13:06:24.819528 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-credential-keys\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.819766 master-0 kubenswrapper[29936]: I1205 13:06:24.819560 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-fernet-keys\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.819766 master-0 kubenswrapper[29936]: I1205 13:06:24.819732 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcbkf\" (UniqueName: \"kubernetes.io/projected/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-kube-api-access-jcbkf\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.819967 master-0 kubenswrapper[29936]: I1205 13:06:24.819778 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-scripts\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.925446 master-0 kubenswrapper[29936]: I1205 13:06:24.925371 29936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-config-data\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.925769 master-0 kubenswrapper[29936]: I1205 13:06:24.925471 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-combined-ca-bundle\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.925769 master-0 kubenswrapper[29936]: I1205 13:06:24.925517 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-credential-keys\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.925769 master-0 kubenswrapper[29936]: I1205 13:06:24.925659 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-fernet-keys\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.925769 master-0 kubenswrapper[29936]: I1205 13:06:24.925741 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcbkf\" (UniqueName: \"kubernetes.io/projected/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-kube-api-access-jcbkf\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.925912 master-0 kubenswrapper[29936]: I1205 13:06:24.925771 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-scripts\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.931007 master-0 kubenswrapper[29936]: I1205 13:06:24.929924 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-scripts\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.931007 master-0 kubenswrapper[29936]: I1205 13:06:24.930308 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-config-data\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.937807 master-0 kubenswrapper[29936]: I1205 13:06:24.937744 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-credential-keys\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.938032 master-0 kubenswrapper[29936]: I1205 13:06:24.937973 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-combined-ca-bundle\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.943517 master-0 kubenswrapper[29936]: I1205 13:06:24.943348 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-fernet-keys\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:24.953642 master-0 kubenswrapper[29936]: I1205 13:06:24.953578 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcbkf\" (UniqueName: \"kubernetes.io/projected/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-kube-api-access-jcbkf\") pod \"keystone-bootstrap-kcsv8\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:25.211387 master-0 kubenswrapper[29936]: I1205 13:06:25.211263 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5" path="/var/lib/kubelet/pods/dc57f24f-5221-49a8-b6ae-03a6a4d0c5d5/volumes" Dec 05 13:06:25.250269 master-0 kubenswrapper[29936]: I1205 13:06:25.250082 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:06:25.253090 master-0 kubenswrapper[29936]: I1205 13:06:25.253002 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:25.477580 master-0 kubenswrapper[29936]: I1205 13:06:25.477395 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:06:25.518716 master-0 kubenswrapper[29936]: I1205 13:06:25.518417 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-external-api-0" event={"ID":"6a2c9ca2-9c02-4227-bece-76f7c63b253e","Type":"ContainerStarted","Data":"c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e"} Dec 05 13:06:25.609972 master-0 kubenswrapper[29936]: I1205 13:06:25.609837 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:06:26.533270 master-0 kubenswrapper[29936]: I1205 13:06:26.533209 29936 generic.go:334] "Generic (PLEG): container finished" podID="21fcc891-90c5-47e7-97e9-e852adfae2bb" containerID="159c7b8004572d99bb7e1d0ce96a95f4f155c990d66f4df41aef77d1abfb8361" exitCode=0 Dec 05 13:06:26.534151 master-0 kubenswrapper[29936]: I1205 13:06:26.533270 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fqqdf" event={"ID":"21fcc891-90c5-47e7-97e9-e852adfae2bb","Type":"ContainerDied","Data":"159c7b8004572d99bb7e1d0ce96a95f4f155c990d66f4df41aef77d1abfb8361"} Dec 05 13:06:26.719538 master-0 kubenswrapper[29936]: I1205 13:06:26.719305 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:06:26.816507 master-0 kubenswrapper[29936]: I1205 13:06:26.815646 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d4d74cb79-bp8lq"] Dec 05 13:06:26.816507 master-0 kubenswrapper[29936]: I1205 13:06:26.816101 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" podUID="259e2263-d738-441d-adc4-e0aab272b64a" 
containerName="dnsmasq-dns" containerID="cri-o://0547e2b8393ec7541feb08e2b2861a9fd2d114c8f44418cce551dbe6163a957a" gracePeriod=10 Dec 05 13:06:27.561203 master-0 kubenswrapper[29936]: I1205 13:06:27.561100 29936 generic.go:334] "Generic (PLEG): container finished" podID="259e2263-d738-441d-adc4-e0aab272b64a" containerID="0547e2b8393ec7541feb08e2b2861a9fd2d114c8f44418cce551dbe6163a957a" exitCode=0 Dec 05 13:06:27.562113 master-0 kubenswrapper[29936]: I1205 13:06:27.561193 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" event={"ID":"259e2263-d738-441d-adc4-e0aab272b64a","Type":"ContainerDied","Data":"0547e2b8393ec7541feb08e2b2861a9fd2d114c8f44418cce551dbe6163a957a"} Dec 05 13:06:30.002435 master-0 kubenswrapper[29936]: I1205 13:06:30.002312 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" podUID="259e2263-d738-441d-adc4-e0aab272b64a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.186:5353: connect: connection refused" Dec 05 13:06:35.002590 master-0 kubenswrapper[29936]: I1205 13:06:35.002489 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" podUID="259e2263-d738-441d-adc4-e0aab272b64a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.186:5353: connect: connection refused" Dec 05 13:06:36.128674 master-0 kubenswrapper[29936]: W1205 13:06:36.126387 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2456ca5_ce7b_4eb5_a500_48c27677c9d8.slice/crio-6b34de3b40a1c4afe72eed353f7c25187850b884bdd16b2200a8524371fc99a8 WatchSource:0}: Error finding container 6b34de3b40a1c4afe72eed353f7c25187850b884bdd16b2200a8524371fc99a8: Status 404 returned error can't find the container with id 6b34de3b40a1c4afe72eed353f7c25187850b884bdd16b2200a8524371fc99a8 Dec 05 13:06:36.333066 master-0 kubenswrapper[29936]: I1205 13:06:36.332924 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:36.364043 master-0 kubenswrapper[29936]: I1205 13:06:36.361914 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkwhv\" (UniqueName: \"kubernetes.io/projected/21fcc891-90c5-47e7-97e9-e852adfae2bb-kube-api-access-gkwhv\") pod \"21fcc891-90c5-47e7-97e9-e852adfae2bb\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " Dec 05 13:06:36.364043 master-0 kubenswrapper[29936]: I1205 13:06:36.362251 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-config-data\") pod \"21fcc891-90c5-47e7-97e9-e852adfae2bb\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " Dec 05 13:06:36.365539 master-0 kubenswrapper[29936]: I1205 13:06:36.365498 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-combined-ca-bundle\") pod \"21fcc891-90c5-47e7-97e9-e852adfae2bb\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " Dec 05 13:06:36.365693 master-0 kubenswrapper[29936]: I1205 13:06:36.365672 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-scripts\") pod \"21fcc891-90c5-47e7-97e9-e852adfae2bb\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " Dec 05 13:06:36.365787 master-0 kubenswrapper[29936]: I1205 13:06:36.365756 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21fcc891-90c5-47e7-97e9-e852adfae2bb-logs\") pod \"21fcc891-90c5-47e7-97e9-e852adfae2bb\" (UID: \"21fcc891-90c5-47e7-97e9-e852adfae2bb\") " Dec 05 13:06:36.369278 master-0 kubenswrapper[29936]: I1205 13:06:36.369234 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21fcc891-90c5-47e7-97e9-e852adfae2bb-logs" (OuterVolumeSpecName: "logs") pod "21fcc891-90c5-47e7-97e9-e852adfae2bb" (UID: "21fcc891-90c5-47e7-97e9-e852adfae2bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:06:36.378543 master-0 kubenswrapper[29936]: I1205 13:06:36.378404 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21fcc891-90c5-47e7-97e9-e852adfae2bb-kube-api-access-gkwhv" (OuterVolumeSpecName: "kube-api-access-gkwhv") pod "21fcc891-90c5-47e7-97e9-e852adfae2bb" (UID: "21fcc891-90c5-47e7-97e9-e852adfae2bb"). InnerVolumeSpecName "kube-api-access-gkwhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:36.378886 master-0 kubenswrapper[29936]: I1205 13:06:36.378767 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-scripts" (OuterVolumeSpecName: "scripts") pod "21fcc891-90c5-47e7-97e9-e852adfae2bb" (UID: "21fcc891-90c5-47e7-97e9-e852adfae2bb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:36.444541 master-0 kubenswrapper[29936]: I1205 13:06:36.441984 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-config-data" (OuterVolumeSpecName: "config-data") pod "21fcc891-90c5-47e7-97e9-e852adfae2bb" (UID: "21fcc891-90c5-47e7-97e9-e852adfae2bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:36.456351 master-0 kubenswrapper[29936]: I1205 13:06:36.449672 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21fcc891-90c5-47e7-97e9-e852adfae2bb" (UID: "21fcc891-90c5-47e7-97e9-e852adfae2bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:36.473280 master-0 kubenswrapper[29936]: I1205 13:06:36.472034 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkwhv\" (UniqueName: \"kubernetes.io/projected/21fcc891-90c5-47e7-97e9-e852adfae2bb-kube-api-access-gkwhv\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:36.473280 master-0 kubenswrapper[29936]: I1205 13:06:36.472079 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:36.473280 master-0 kubenswrapper[29936]: I1205 13:06:36.472092 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:36.473280 master-0 kubenswrapper[29936]: I1205 13:06:36.472101 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21fcc891-90c5-47e7-97e9-e852adfae2bb-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:36.473280 master-0 kubenswrapper[29936]: I1205 13:06:36.472110 29936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21fcc891-90c5-47e7-97e9-e852adfae2bb-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:36.659753 master-0 kubenswrapper[29936]: I1205 13:06:36.659644 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:06:36.693610 master-0 kubenswrapper[29936]: I1205 13:06:36.692231 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-ovsdbserver-nb\") pod \"259e2263-d738-441d-adc4-e0aab272b64a\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " Dec 05 13:06:36.693610 master-0 kubenswrapper[29936]: I1205 13:06:36.692366 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbn7q\" (UniqueName: \"kubernetes.io/projected/259e2263-d738-441d-adc4-e0aab272b64a-kube-api-access-pbn7q\") pod \"259e2263-d738-441d-adc4-e0aab272b64a\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " Dec 05 13:06:36.693610 master-0 kubenswrapper[29936]: I1205 13:06:36.692487 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-dns-svc\") pod \"259e2263-d738-441d-adc4-e0aab272b64a\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " Dec 05 13:06:36.693610 master-0 kubenswrapper[29936]: I1205 13:06:36.692514 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-config\") pod \"259e2263-d738-441d-adc4-e0aab272b64a\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " Dec 05 13:06:36.693610 master-0 kubenswrapper[29936]: I1205 13:06:36.692574 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-ovsdbserver-sb\") pod \"259e2263-d738-441d-adc4-e0aab272b64a\" (UID: \"259e2263-d738-441d-adc4-e0aab272b64a\") " Dec 05 13:06:36.715829 master-0 kubenswrapper[29936]: I1205 13:06:36.715750 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/259e2263-d738-441d-adc4-e0aab272b64a-kube-api-access-pbn7q" (OuterVolumeSpecName: "kube-api-access-pbn7q") pod "259e2263-d738-441d-adc4-e0aab272b64a" (UID: "259e2263-d738-441d-adc4-e0aab272b64a"). InnerVolumeSpecName "kube-api-access-pbn7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:36.760112 master-0 kubenswrapper[29936]: I1205 13:06:36.760024 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-config" (OuterVolumeSpecName: "config") pod "259e2263-d738-441d-adc4-e0aab272b64a" (UID: "259e2263-d738-441d-adc4-e0aab272b64a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:36.761486 master-0 kubenswrapper[29936]: I1205 13:06:36.761392 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "259e2263-d738-441d-adc4-e0aab272b64a" (UID: "259e2263-d738-441d-adc4-e0aab272b64a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:36.769480 master-0 kubenswrapper[29936]: I1205 13:06:36.768570 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" event={"ID":"259e2263-d738-441d-adc4-e0aab272b64a","Type":"ContainerDied","Data":"f6490271d9f5a822ef63775fc6a84b405fea934aa9587bcc2623bf8b75549502"} Dec 05 13:06:36.769480 master-0 kubenswrapper[29936]: I1205 13:06:36.768649 29936 scope.go:117] "RemoveContainer" containerID="0547e2b8393ec7541feb08e2b2861a9fd2d114c8f44418cce551dbe6163a957a" Dec 05 13:06:36.769480 master-0 kubenswrapper[29936]: I1205 13:06:36.768803 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d4d74cb79-bp8lq" Dec 05 13:06:36.773030 master-0 kubenswrapper[29936]: I1205 13:06:36.772971 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "259e2263-d738-441d-adc4-e0aab272b64a" (UID: "259e2263-d738-441d-adc4-e0aab272b64a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:36.773109 master-0 kubenswrapper[29936]: I1205 13:06:36.773068 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-internal-api-0" event={"ID":"b2456ca5-ce7b-4eb5-a500-48c27677c9d8","Type":"ContainerStarted","Data":"6b34de3b40a1c4afe72eed353f7c25187850b884bdd16b2200a8524371fc99a8"} Dec 05 13:06:36.777007 master-0 kubenswrapper[29936]: I1205 13:06:36.776934 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fqqdf" event={"ID":"21fcc891-90c5-47e7-97e9-e852adfae2bb","Type":"ContainerDied","Data":"a26842610f9b59d6310b47d52ca35deb002ea4ccae97fdf45e5df9c05ffd9dc7"} Dec 05 13:06:36.777121 master-0 kubenswrapper[29936]: I1205 13:06:36.777015 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a26842610f9b59d6310b47d52ca35deb002ea4ccae97fdf45e5df9c05ffd9dc7" Dec 05 13:06:36.777121 master-0 kubenswrapper[29936]: I1205 13:06:36.777048 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fqqdf" Dec 05 13:06:36.788351 master-0 kubenswrapper[29936]: I1205 13:06:36.788207 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "259e2263-d738-441d-adc4-e0aab272b64a" (UID: "259e2263-d738-441d-adc4-e0aab272b64a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:36.812273 master-0 kubenswrapper[29936]: I1205 13:06:36.809109 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:36.812273 master-0 kubenswrapper[29936]: I1205 13:06:36.809218 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbn7q\" (UniqueName: \"kubernetes.io/projected/259e2263-d738-441d-adc4-e0aab272b64a-kube-api-access-pbn7q\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:36.812273 master-0 kubenswrapper[29936]: I1205 13:06:36.809238 29936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:36.812273 master-0 kubenswrapper[29936]: I1205 13:06:36.809248 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:36.812273 master-0 kubenswrapper[29936]: I1205 13:06:36.809256 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/259e2263-d738-441d-adc4-e0aab272b64a-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:36.825560 master-0 kubenswrapper[29936]: I1205 13:06:36.825472 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kcsv8"] Dec 05 13:06:36.858046 master-0 kubenswrapper[29936]: I1205 13:06:36.857782 29936 scope.go:117] "RemoveContainer" containerID="5b1fcfc50f531fdf41b481ccab7eff00ec8b2cb88e14202d7095dd1327a2f472" Dec 05 13:06:36.865619 master-0 kubenswrapper[29936]: W1205 13:06:36.865495 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4d7d168_a010_44ef_b2cb_1ec979fb38c6.slice/crio-2ee0420543936810b2ff2a6db8f57352f5013a4d719780c95894771a269d9f84 WatchSource:0}: Error finding container 2ee0420543936810b2ff2a6db8f57352f5013a4d719780c95894771a269d9f84: Status 404 returned error can't find the container with id 2ee0420543936810b2ff2a6db8f57352f5013a4d719780c95894771a269d9f84 Dec 05 13:06:36.868724 master-0 kubenswrapper[29936]: W1205 13:06:36.868686 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1690e553_8b77_483f_9f31_4f3968e6bd28.slice/crio-95cf6cde879a112c5eb38a3b0a63d3e6be680c28099c3aae0b686865a39447df WatchSource:0}: Error finding container 95cf6cde879a112c5eb38a3b0a63d3e6be680c28099c3aae0b686865a39447df: Status 404 returned error can't find the container with id 95cf6cde879a112c5eb38a3b0a63d3e6be680c28099c3aae0b686865a39447df Dec 05 13:06:36.908023 master-0 kubenswrapper[29936]: I1205 13:06:36.907933 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-h9l4n"] Dec 05 13:06:37.159465 master-0 kubenswrapper[29936]: I1205 13:06:37.159263 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d4d74cb79-bp8lq"] Dec 05 13:06:37.170172 master-0 kubenswrapper[29936]: I1205 13:06:37.170078 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d4d74cb79-bp8lq"] Dec 05 13:06:37.202036 master-0 kubenswrapper[29936]: I1205 13:06:37.201985 
29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="259e2263-d738-441d-adc4-e0aab272b64a" path="/var/lib/kubelet/pods/259e2263-d738-441d-adc4-e0aab272b64a/volumes" Dec 05 13:06:37.576520 master-0 kubenswrapper[29936]: I1205 13:06:37.576449 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-687c55df6d-h9cdt"] Dec 05 13:06:37.577593 master-0 kubenswrapper[29936]: E1205 13:06:37.577550 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259e2263-d738-441d-adc4-e0aab272b64a" containerName="init" Dec 05 13:06:37.577593 master-0 kubenswrapper[29936]: I1205 13:06:37.577585 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="259e2263-d738-441d-adc4-e0aab272b64a" containerName="init" Dec 05 13:06:37.577698 master-0 kubenswrapper[29936]: E1205 13:06:37.577631 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21fcc891-90c5-47e7-97e9-e852adfae2bb" containerName="placement-db-sync" Dec 05 13:06:37.577698 master-0 kubenswrapper[29936]: I1205 13:06:37.577655 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="21fcc891-90c5-47e7-97e9-e852adfae2bb" containerName="placement-db-sync" Dec 05 13:06:37.577698 master-0 kubenswrapper[29936]: E1205 13:06:37.577682 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259e2263-d738-441d-adc4-e0aab272b64a" containerName="dnsmasq-dns" Dec 05 13:06:37.577698 master-0 kubenswrapper[29936]: I1205 13:06:37.577692 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="259e2263-d738-441d-adc4-e0aab272b64a" containerName="dnsmasq-dns" Dec 05 13:06:37.578160 master-0 kubenswrapper[29936]: I1205 13:06:37.578122 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="259e2263-d738-441d-adc4-e0aab272b64a" containerName="dnsmasq-dns" Dec 05 13:06:37.578226 master-0 kubenswrapper[29936]: I1205 13:06:37.578170 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="21fcc891-90c5-47e7-97e9-e852adfae2bb" containerName="placement-db-sync" Dec 05 13:06:37.580704 master-0 kubenswrapper[29936]: I1205 13:06:37.580658 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.592395 master-0 kubenswrapper[29936]: I1205 13:06:37.592341 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 13:06:37.592650 master-0 kubenswrapper[29936]: I1205 13:06:37.592493 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 05 13:06:37.592650 master-0 kubenswrapper[29936]: I1205 13:06:37.592593 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 13:06:37.595573 master-0 kubenswrapper[29936]: I1205 13:06:37.595496 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 05 13:06:37.628646 master-0 kubenswrapper[29936]: I1205 13:06:37.628577 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-687c55df6d-h9cdt"] Dec 05 13:06:37.637505 master-0 kubenswrapper[29936]: I1205 13:06:37.637428 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phzzd\" (UniqueName: \"kubernetes.io/projected/d81510ca-2c9d-4582-8ce7-e101673e0397-kube-api-access-phzzd\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.638011 master-0 kubenswrapper[29936]: I1205 13:06:37.637925 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d81510ca-2c9d-4582-8ce7-e101673e0397-public-tls-certs\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.638011 master-0 kubenswrapper[29936]: I1205 13:06:37.637975 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d81510ca-2c9d-4582-8ce7-e101673e0397-combined-ca-bundle\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.638332 master-0 kubenswrapper[29936]: I1205 13:06:37.638027 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d81510ca-2c9d-4582-8ce7-e101673e0397-config-data\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.638332 master-0 kubenswrapper[29936]: I1205 13:06:37.638121 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d81510ca-2c9d-4582-8ce7-e101673e0397-internal-tls-certs\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.638554 master-0 kubenswrapper[29936]: I1205 13:06:37.638523 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d81510ca-2c9d-4582-8ce7-e101673e0397-logs\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.638642 master-0 kubenswrapper[29936]: I1205 13:06:37.638579 
29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d81510ca-2c9d-4582-8ce7-e101673e0397-scripts\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.741910 master-0 kubenswrapper[29936]: I1205 13:06:37.741819 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d81510ca-2c9d-4582-8ce7-e101673e0397-logs\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.742154 master-0 kubenswrapper[29936]: I1205 13:06:37.742034 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d81510ca-2c9d-4582-8ce7-e101673e0397-scripts\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.742234 master-0 kubenswrapper[29936]: I1205 13:06:37.742155 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phzzd\" (UniqueName: \"kubernetes.io/projected/d81510ca-2c9d-4582-8ce7-e101673e0397-kube-api-access-phzzd\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.742278 master-0 kubenswrapper[29936]: I1205 13:06:37.742236 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d81510ca-2c9d-4582-8ce7-e101673e0397-public-tls-certs\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.742315 master-0 kubenswrapper[29936]: I1205 13:06:37.742264 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d81510ca-2c9d-4582-8ce7-e101673e0397-combined-ca-bundle\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.742347 master-0 kubenswrapper[29936]: I1205 13:06:37.742334 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d81510ca-2c9d-4582-8ce7-e101673e0397-config-data\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.742426 master-0 kubenswrapper[29936]: I1205 13:06:37.742400 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d81510ca-2c9d-4582-8ce7-e101673e0397-internal-tls-certs\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.742668 master-0 kubenswrapper[29936]: I1205 13:06:37.742458 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d81510ca-2c9d-4582-8ce7-e101673e0397-logs\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.747541 master-0 kubenswrapper[29936]: I1205 13:06:37.747494 29936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d81510ca-2c9d-4582-8ce7-e101673e0397-config-data\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.747739 master-0 kubenswrapper[29936]: I1205 13:06:37.747703 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d81510ca-2c9d-4582-8ce7-e101673e0397-internal-tls-certs\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.755163 master-0 kubenswrapper[29936]: I1205 13:06:37.754985 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d81510ca-2c9d-4582-8ce7-e101673e0397-scripts\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.762868 master-0 kubenswrapper[29936]: I1205 13:06:37.762781 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d81510ca-2c9d-4582-8ce7-e101673e0397-public-tls-certs\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.763216 master-0 kubenswrapper[29936]: I1205 13:06:37.762942 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d81510ca-2c9d-4582-8ce7-e101673e0397-combined-ca-bundle\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.767233 master-0 kubenswrapper[29936]: I1205 13:06:37.767142 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phzzd\" (UniqueName: \"kubernetes.io/projected/d81510ca-2c9d-4582-8ce7-e101673e0397-kube-api-access-phzzd\") pod \"placement-687c55df6d-h9cdt\" (UID: \"d81510ca-2c9d-4582-8ce7-e101673e0397\") " pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.829905 master-0 kubenswrapper[29936]: I1205 13:06:37.829830 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-db-sync-6scb5" event={"ID":"b82bd984-5f64-433b-a41a-f5186287a0f7","Type":"ContainerStarted","Data":"8c9fdd2f2167aecbf9110e725bed0a387f56041962376d469528fcc3bbcc45a3"} Dec 05 13:06:37.866950 master-0 kubenswrapper[29936]: I1205 13:06:37.866873 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-internal-api-0" event={"ID":"b2456ca5-ce7b-4eb5-a500-48c27677c9d8","Type":"ContainerStarted","Data":"9ded3aeca2213c6c58323ea1046091985fedc44fbba7630e7378acdb7c606b5b"} Dec 05 13:06:37.880390 master-0 kubenswrapper[29936]: I1205 13:06:37.880271 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-b46d8-db-sync-6scb5" podStartSLOduration=4.2174491849999995 podStartE2EDuration="24.880242327s" podCreationTimestamp="2025-12-05 13:06:13 +0000 UTC" firstStartedPulling="2025-12-05 13:06:15.682624457 +0000 UTC m=+972.814704138" lastFinishedPulling="2025-12-05 13:06:36.345417599 +0000 UTC m=+993.477497280" observedRunningTime="2025-12-05 13:06:37.867393046 +0000 UTC m=+994.999472747" watchObservedRunningTime="2025-12-05 13:06:37.880242327 +0000 UTC 
m=+995.012322028" Dec 05 13:06:37.886331 master-0 kubenswrapper[29936]: I1205 13:06:37.886238 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kcsv8" event={"ID":"d4d7d168-a010-44ef-b2cb-1ec979fb38c6","Type":"ContainerStarted","Data":"141ab7a29ac7f33ed8f5ff26052076988c5f1c4aeaf98072d1684f5c36ae252d"} Dec 05 13:06:37.886331 master-0 kubenswrapper[29936]: I1205 13:06:37.886327 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kcsv8" event={"ID":"d4d7d168-a010-44ef-b2cb-1ec979fb38c6","Type":"ContainerStarted","Data":"2ee0420543936810b2ff2a6db8f57352f5013a4d719780c95894771a269d9f84"} Dec 05 13:06:37.888136 master-0 kubenswrapper[29936]: I1205 13:06:37.888065 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-h9l4n" event={"ID":"1690e553-8b77-483f-9f31-4f3968e6bd28","Type":"ContainerStarted","Data":"95cf6cde879a112c5eb38a3b0a63d3e6be680c28099c3aae0b686865a39447df"} Dec 05 13:06:37.894931 master-0 kubenswrapper[29936]: I1205 13:06:37.894854 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-external-api-0" event={"ID":"6a2c9ca2-9c02-4227-bece-76f7c63b253e","Type":"ContainerStarted","Data":"c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71"} Dec 05 13:06:37.895619 master-0 kubenswrapper[29936]: I1205 13:06:37.895560 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b46d8-default-external-api-0" podUID="6a2c9ca2-9c02-4227-bece-76f7c63b253e" containerName="glance-log" containerID="cri-o://c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e" gracePeriod=30 Dec 05 13:06:37.895738 master-0 kubenswrapper[29936]: I1205 13:06:37.895724 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b46d8-default-external-api-0" podUID="6a2c9ca2-9c02-4227-bece-76f7c63b253e" containerName="glance-httpd" containerID="cri-o://c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71" gracePeriod=30 Dec 05 13:06:37.918589 master-0 kubenswrapper[29936]: I1205 13:06:37.918429 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kcsv8" podStartSLOduration=13.918399242 podStartE2EDuration="13.918399242s" podCreationTimestamp="2025-12-05 13:06:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:06:37.910877788 +0000 UTC m=+995.042957489" watchObservedRunningTime="2025-12-05 13:06:37.918399242 +0000 UTC m=+995.050478923" Dec 05 13:06:37.939435 master-0 kubenswrapper[29936]: E1205 13:06:37.939104 29936 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a2c9ca2_9c02_4227_bece_76f7c63b253e.slice/crio-c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e.scope\": RecentStats: unable to find data in memory cache]" Dec 05 13:06:37.955144 master-0 kubenswrapper[29936]: I1205 13:06:37.955037 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:37.975057 master-0 kubenswrapper[29936]: I1205 13:06:37.974852 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b46d8-default-external-api-0" podStartSLOduration=18.97482131 podStartE2EDuration="18.97482131s" podCreationTimestamp="2025-12-05 13:06:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:06:37.954774641 +0000 UTC m=+995.086854342" watchObservedRunningTime="2025-12-05 13:06:37.97482131 +0000 UTC m=+995.106900991" Dec 05 13:06:38.511464 master-0 kubenswrapper[29936]: I1205 13:06:38.511331 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-687c55df6d-h9cdt"] Dec 05 13:06:38.528027 master-0 kubenswrapper[29936]: W1205 13:06:38.527943 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd81510ca_2c9d_4582_8ce7_e101673e0397.slice/crio-bd54f7cc9bed2ba2cd738b5a68023bafeed30dd387f525ff43d8173aa45c0352 WatchSource:0}: Error finding container bd54f7cc9bed2ba2cd738b5a68023bafeed30dd387f525ff43d8173aa45c0352: Status 404 returned error can't find the container with id bd54f7cc9bed2ba2cd738b5a68023bafeed30dd387f525ff43d8173aa45c0352 Dec 05 13:06:38.793564 master-0 kubenswrapper[29936]: I1205 13:06:38.793510 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:38.896660 master-0 kubenswrapper[29936]: I1205 13:06:38.895032 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a2c9ca2-9c02-4227-bece-76f7c63b253e-httpd-run\") pod \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " Dec 05 13:06:38.896660 master-0 kubenswrapper[29936]: I1205 13:06:38.895677 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " Dec 05 13:06:38.896660 master-0 kubenswrapper[29936]: I1205 13:06:38.895732 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fjnk\" (UniqueName: \"kubernetes.io/projected/6a2c9ca2-9c02-4227-bece-76f7c63b253e-kube-api-access-7fjnk\") pod \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " Dec 05 13:06:38.896660 master-0 kubenswrapper[29936]: I1205 13:06:38.895798 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2c9ca2-9c02-4227-bece-76f7c63b253e-logs\") pod \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " Dec 05 13:06:38.896660 master-0 kubenswrapper[29936]: I1205 13:06:38.896277 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-config-data\") pod \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " Dec 05 13:06:38.896660 master-0 kubenswrapper[29936]: I1205 13:06:38.896388 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-scripts\") pod \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " Dec 05 13:06:38.896660 master-0 kubenswrapper[29936]: I1205 13:06:38.896480 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-combined-ca-bundle\") pod \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\" (UID: \"6a2c9ca2-9c02-4227-bece-76f7c63b253e\") " Dec 05 13:06:38.897022 master-0 kubenswrapper[29936]: I1205 13:06:38.896671 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a2c9ca2-9c02-4227-bece-76f7c63b253e-logs" (OuterVolumeSpecName: "logs") pod "6a2c9ca2-9c02-4227-bece-76f7c63b253e" (UID: "6a2c9ca2-9c02-4227-bece-76f7c63b253e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:06:38.897530 master-0 kubenswrapper[29936]: I1205 13:06:38.897483 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a2c9ca2-9c02-4227-bece-76f7c63b253e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6a2c9ca2-9c02-4227-bece-76f7c63b253e" (UID: "6a2c9ca2-9c02-4227-bece-76f7c63b253e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:06:38.911939 master-0 kubenswrapper[29936]: I1205 13:06:38.898221 29936 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a2c9ca2-9c02-4227-bece-76f7c63b253e-httpd-run\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:38.911939 master-0 kubenswrapper[29936]: I1205 13:06:38.898264 29936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2c9ca2-9c02-4227-bece-76f7c63b253e-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:38.911939 master-0 kubenswrapper[29936]: I1205 13:06:38.903688 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a2c9ca2-9c02-4227-bece-76f7c63b253e-kube-api-access-7fjnk" (OuterVolumeSpecName: "kube-api-access-7fjnk") pod "6a2c9ca2-9c02-4227-bece-76f7c63b253e" (UID: "6a2c9ca2-9c02-4227-bece-76f7c63b253e"). InnerVolumeSpecName "kube-api-access-7fjnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:38.928742 master-0 kubenswrapper[29936]: I1205 13:06:38.912703 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-scripts" (OuterVolumeSpecName: "scripts") pod "6a2c9ca2-9c02-4227-bece-76f7c63b253e" (UID: "6a2c9ca2-9c02-4227-bece-76f7c63b253e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:38.938973 master-0 kubenswrapper[29936]: I1205 13:06:38.938879 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-internal-api-0" event={"ID":"b2456ca5-ce7b-4eb5-a500-48c27677c9d8","Type":"ContainerStarted","Data":"3907560de483e0b5746488074ef7c48cd4801668951115628434b7aa76ec315b"} Dec 05 13:06:38.939316 master-0 kubenswrapper[29936]: I1205 13:06:38.939111 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b46d8-default-internal-api-0" podUID="b2456ca5-ce7b-4eb5-a500-48c27677c9d8" containerName="glance-log" containerID="cri-o://9ded3aeca2213c6c58323ea1046091985fedc44fbba7630e7378acdb7c606b5b" gracePeriod=30 Dec 05 13:06:38.939397 master-0 kubenswrapper[29936]: I1205 13:06:38.939365 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b46d8-default-internal-api-0" podUID="b2456ca5-ce7b-4eb5-a500-48c27677c9d8" containerName="glance-httpd" containerID="cri-o://3907560de483e0b5746488074ef7c48cd4801668951115628434b7aa76ec315b" gracePeriod=30 Dec 05 13:06:38.945127 master-0 kubenswrapper[29936]: I1205 13:06:38.945046 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c" (OuterVolumeSpecName: "glance") pod "6a2c9ca2-9c02-4227-bece-76f7c63b253e" (UID: "6a2c9ca2-9c02-4227-bece-76f7c63b253e"). InnerVolumeSpecName "pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 13:06:38.945542 master-0 kubenswrapper[29936]: I1205 13:06:38.945511 29936 generic.go:334] "Generic (PLEG): container finished" podID="6a2c9ca2-9c02-4227-bece-76f7c63b253e" containerID="c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71" exitCode=0 Dec 05 13:06:38.945542 master-0 kubenswrapper[29936]: I1205 13:06:38.945539 29936 generic.go:334] "Generic (PLEG): container finished" podID="6a2c9ca2-9c02-4227-bece-76f7c63b253e" containerID="c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e" exitCode=143 Dec 05 13:06:38.945733 master-0 kubenswrapper[29936]: I1205 13:06:38.945593 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-external-api-0" event={"ID":"6a2c9ca2-9c02-4227-bece-76f7c63b253e","Type":"ContainerDied","Data":"c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71"} Dec 05 13:06:38.945733 master-0 kubenswrapper[29936]: I1205 13:06:38.945645 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-external-api-0" event={"ID":"6a2c9ca2-9c02-4227-bece-76f7c63b253e","Type":"ContainerDied","Data":"c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e"} Dec 05 13:06:38.945733 master-0 kubenswrapper[29936]: I1205 13:06:38.945658 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-external-api-0" event={"ID":"6a2c9ca2-9c02-4227-bece-76f7c63b253e","Type":"ContainerDied","Data":"64894524f5da148b9cf066b28c1d4bcf823f484545f979477dded27064876363"} Dec 05 13:06:38.945733 master-0 kubenswrapper[29936]: I1205 13:06:38.945655 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:38.946929 master-0 kubenswrapper[29936]: I1205 13:06:38.945680 29936 scope.go:117] "RemoveContainer" containerID="c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71" Dec 05 13:06:38.951037 master-0 kubenswrapper[29936]: I1205 13:06:38.950588 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687c55df6d-h9cdt" event={"ID":"d81510ca-2c9d-4582-8ce7-e101673e0397","Type":"ContainerStarted","Data":"ac8ea8507964893787d6d79e4e4ff331cd5ebe4c0840c73ad34eb98395ebb584"} Dec 05 13:06:38.951037 master-0 kubenswrapper[29936]: I1205 13:06:38.950689 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687c55df6d-h9cdt" event={"ID":"d81510ca-2c9d-4582-8ce7-e101673e0397","Type":"ContainerStarted","Data":"bd54f7cc9bed2ba2cd738b5a68023bafeed30dd387f525ff43d8173aa45c0352"} Dec 05 13:06:38.956761 master-0 kubenswrapper[29936]: I1205 13:06:38.952077 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a2c9ca2-9c02-4227-bece-76f7c63b253e" (UID: "6a2c9ca2-9c02-4227-bece-76f7c63b253e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:38.983068 master-0 kubenswrapper[29936]: I1205 13:06:38.983009 29936 scope.go:117] "RemoveContainer" containerID="c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e" Dec 05 13:06:38.994576 master-0 kubenswrapper[29936]: I1205 13:06:38.994278 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b46d8-default-internal-api-0" podStartSLOduration=17.994259441 podStartE2EDuration="17.994259441s" podCreationTimestamp="2025-12-05 13:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:06:38.988339037 +0000 UTC m=+996.120418738" watchObservedRunningTime="2025-12-05 13:06:38.994259441 +0000 UTC m=+996.126339122" Dec 05 13:06:39.007761 master-0 kubenswrapper[29936]: I1205 13:06:39.007672 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:39.007761 master-0 kubenswrapper[29936]: I1205 13:06:39.007739 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:39.007761 master-0 kubenswrapper[29936]: I1205 13:06:39.007770 29936 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") on node \"master-0\" " Dec 05 13:06:39.008054 master-0 kubenswrapper[29936]: I1205 13:06:39.007802 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fjnk\" (UniqueName: \"kubernetes.io/projected/6a2c9ca2-9c02-4227-bece-76f7c63b253e-kube-api-access-7fjnk\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:39.016150 master-0 kubenswrapper[29936]: I1205 13:06:39.016079 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-config-data" (OuterVolumeSpecName: "config-data") pod "6a2c9ca2-9c02-4227-bece-76f7c63b253e" (UID: "6a2c9ca2-9c02-4227-bece-76f7c63b253e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:39.048292 master-0 kubenswrapper[29936]: I1205 13:06:39.048239 29936 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 05 13:06:39.048574 master-0 kubenswrapper[29936]: I1205 13:06:39.048474 29936 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe" (UniqueName: "kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c") on node "master-0" Dec 05 13:06:39.098455 master-0 kubenswrapper[29936]: I1205 13:06:39.097958 29936 scope.go:117] "RemoveContainer" containerID="c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71" Dec 05 13:06:39.099515 master-0 kubenswrapper[29936]: E1205 13:06:39.099196 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71\": container with ID starting with c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71 not found: ID does not exist" containerID="c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71" Dec 05 13:06:39.100613 master-0 kubenswrapper[29936]: I1205 13:06:39.099278 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71"} err="failed to get container status \"c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71\": rpc error: code = NotFound desc = could not find container \"c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71\": container with ID starting with c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71 not found: ID does not exist" Dec 05 13:06:39.100613 master-0 kubenswrapper[29936]: I1205 13:06:39.100378 29936 scope.go:117] "RemoveContainer" containerID="c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e" Dec 05 13:06:39.102738 master-0 kubenswrapper[29936]: E1205 13:06:39.101594 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e\": container with ID starting with c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e not found: ID does not exist" containerID="c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e" Dec 05 13:06:39.102738 master-0 kubenswrapper[29936]: I1205 13:06:39.101656 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e"} err="failed to get container status \"c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e\": rpc error: code = NotFound desc = could not find container \"c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e\": container with ID starting with c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e not found: ID does not exist" Dec 05 13:06:39.102738 master-0 kubenswrapper[29936]: I1205 13:06:39.101692 29936 scope.go:117] "RemoveContainer" containerID="c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71" Dec 05 
13:06:39.102738 master-0 kubenswrapper[29936]: I1205 13:06:39.102638 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71"} err="failed to get container status \"c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71\": rpc error: code = NotFound desc = could not find container \"c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71\": container with ID starting with c6db90935c15012e0d4390e14c95534358ec6471794f35bfd7a03ecc2fb2cb71 not found: ID does not exist" Dec 05 13:06:39.102738 master-0 kubenswrapper[29936]: I1205 13:06:39.102708 29936 scope.go:117] "RemoveContainer" containerID="c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e" Dec 05 13:06:39.103293 master-0 kubenswrapper[29936]: I1205 13:06:39.103248 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e"} err="failed to get container status \"c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e\": rpc error: code = NotFound desc = could not find container \"c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e\": container with ID starting with c4a70624eb0ce1a50723a2ed35a3295b4ab4bef02b543f74214adead5ff0841e not found: ID does not exist" Dec 05 13:06:39.114589 master-0 kubenswrapper[29936]: I1205 13:06:39.113211 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2c9ca2-9c02-4227-bece-76f7c63b253e-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:39.114589 master-0 kubenswrapper[29936]: I1205 13:06:39.113268 29936 reconciler_common.go:293] "Volume detached for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:39.315112 master-0 kubenswrapper[29936]: I1205 13:06:39.315006 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:06:39.336832 master-0 kubenswrapper[29936]: I1205 13:06:39.335879 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:06:39.355690 master-0 kubenswrapper[29936]: I1205 13:06:39.353734 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:06:39.358552 master-0 kubenswrapper[29936]: E1205 13:06:39.358425 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2c9ca2-9c02-4227-bece-76f7c63b253e" containerName="glance-log" Dec 05 13:06:39.366153 master-0 kubenswrapper[29936]: I1205 13:06:39.366079 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2c9ca2-9c02-4227-bece-76f7c63b253e" containerName="glance-log" Dec 05 13:06:39.366395 master-0 kubenswrapper[29936]: E1205 13:06:39.366222 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2c9ca2-9c02-4227-bece-76f7c63b253e" containerName="glance-httpd" Dec 05 13:06:39.366395 master-0 kubenswrapper[29936]: I1205 13:06:39.366237 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2c9ca2-9c02-4227-bece-76f7c63b253e" containerName="glance-httpd" Dec 05 13:06:39.367009 master-0 kubenswrapper[29936]: I1205 13:06:39.366958 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a2c9ca2-9c02-4227-bece-76f7c63b253e" 
containerName="glance-log" Dec 05 13:06:39.367009 master-0 kubenswrapper[29936]: I1205 13:06:39.366996 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a2c9ca2-9c02-4227-bece-76f7c63b253e" containerName="glance-httpd" Dec 05 13:06:39.391877 master-0 kubenswrapper[29936]: I1205 13:06:39.369502 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.391877 master-0 kubenswrapper[29936]: I1205 13:06:39.373273 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:06:39.391877 master-0 kubenswrapper[29936]: I1205 13:06:39.377963 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b46d8-default-external-config-data" Dec 05 13:06:39.391877 master-0 kubenswrapper[29936]: I1205 13:06:39.379294 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 13:06:39.425071 master-0 kubenswrapper[29936]: I1205 13:06:39.419792 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.425071 master-0 kubenswrapper[29936]: I1205 13:06:39.419884 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fw2z\" (UniqueName: \"kubernetes.io/projected/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-kube-api-access-5fw2z\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.425071 master-0 kubenswrapper[29936]: I1205 13:06:39.419923 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-public-tls-certs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.425071 master-0 kubenswrapper[29936]: I1205 13:06:39.419976 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-httpd-run\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.425071 master-0 kubenswrapper[29936]: I1205 13:06:39.420006 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-scripts\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.425071 master-0 kubenswrapper[29936]: I1205 13:06:39.420063 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-combined-ca-bundle\") pod \"glance-b46d8-default-external-api-0\" (UID: 
\"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.425071 master-0 kubenswrapper[29936]: I1205 13:06:39.420102 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-logs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.425071 master-0 kubenswrapper[29936]: I1205 13:06:39.420369 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-config-data\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.522411 master-0 kubenswrapper[29936]: I1205 13:06:39.522350 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fw2z\" (UniqueName: \"kubernetes.io/projected/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-kube-api-access-5fw2z\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.523204 master-0 kubenswrapper[29936]: I1205 13:06:39.523162 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-public-tls-certs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.523358 master-0 kubenswrapper[29936]: I1205 13:06:39.523338 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-httpd-run\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.523482 master-0 kubenswrapper[29936]: I1205 13:06:39.523465 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-scripts\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.523624 master-0 kubenswrapper[29936]: I1205 13:06:39.523606 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-combined-ca-bundle\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.523938 master-0 kubenswrapper[29936]: I1205 13:06:39.523922 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-logs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.524137 master-0 kubenswrapper[29936]: I1205 13:06:39.524120 29936 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-config-data\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.524272 master-0 kubenswrapper[29936]: I1205 13:06:39.524257 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.524728 master-0 kubenswrapper[29936]: I1205 13:06:39.524676 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-httpd-run\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.526074 master-0 kubenswrapper[29936]: I1205 13:06:39.525510 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-logs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.530398 master-0 kubenswrapper[29936]: I1205 13:06:39.530320 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-public-tls-certs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.548308 master-0 kubenswrapper[29936]: I1205 13:06:39.531424 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-combined-ca-bundle\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.548308 master-0 kubenswrapper[29936]: I1205 13:06:39.534310 29936 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 13:06:39.548308 master-0 kubenswrapper[29936]: I1205 13:06:39.534366 29936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/49c247ccb75d82d7aa2d53a927c5c3fb8512a27de6927aa154d2e3366fa1652b/globalmount\"" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.548308 master-0 kubenswrapper[29936]: I1205 13:06:39.543126 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-config-data\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.568904 master-0 kubenswrapper[29936]: I1205 13:06:39.568818 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-scripts\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.581213 master-0 kubenswrapper[29936]: I1205 13:06:39.580771 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fw2z\" (UniqueName: \"kubernetes.io/projected/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-kube-api-access-5fw2z\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:39.969029 master-0 kubenswrapper[29936]: I1205 13:06:39.968947 29936 generic.go:334] "Generic (PLEG): container finished" podID="b2456ca5-ce7b-4eb5-a500-48c27677c9d8" containerID="3907560de483e0b5746488074ef7c48cd4801668951115628434b7aa76ec315b" exitCode=0 Dec 05 13:06:39.969029 master-0 kubenswrapper[29936]: I1205 13:06:39.968998 29936 generic.go:334] "Generic (PLEG): container finished" podID="b2456ca5-ce7b-4eb5-a500-48c27677c9d8" containerID="9ded3aeca2213c6c58323ea1046091985fedc44fbba7630e7378acdb7c606b5b" exitCode=143 Dec 05 13:06:39.969029 master-0 kubenswrapper[29936]: I1205 13:06:39.969050 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-internal-api-0" event={"ID":"b2456ca5-ce7b-4eb5-a500-48c27677c9d8","Type":"ContainerDied","Data":"3907560de483e0b5746488074ef7c48cd4801668951115628434b7aa76ec315b"} Dec 05 13:06:39.969721 master-0 kubenswrapper[29936]: I1205 13:06:39.969089 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-internal-api-0" event={"ID":"b2456ca5-ce7b-4eb5-a500-48c27677c9d8","Type":"ContainerDied","Data":"9ded3aeca2213c6c58323ea1046091985fedc44fbba7630e7378acdb7c606b5b"} Dec 05 13:06:39.975607 master-0 kubenswrapper[29936]: I1205 13:06:39.974828 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687c55df6d-h9cdt" event={"ID":"d81510ca-2c9d-4582-8ce7-e101673e0397","Type":"ContainerStarted","Data":"88ab0f938b0c4b1d83a21b7ed835e613f10e4b1b2153c3db80a1624b7e0cae6c"} Dec 05 13:06:39.975958 master-0 kubenswrapper[29936]: I1205 13:06:39.975735 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:39.975958 master-0 
kubenswrapper[29936]: I1205 13:06:39.975801 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:06:40.043053 master-0 kubenswrapper[29936]: I1205 13:06:40.041588 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-687c55df6d-h9cdt" podStartSLOduration=3.041557091 podStartE2EDuration="3.041557091s" podCreationTimestamp="2025-12-05 13:06:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:06:40.030303971 +0000 UTC m=+997.162383672" watchObservedRunningTime="2025-12-05 13:06:40.041557091 +0000 UTC m=+997.173636772" Dec 05 13:06:41.033339 master-0 kubenswrapper[29936]: I1205 13:06:41.027634 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod \"glance-b46d8-default-external-api-0\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:41.205006 master-0 kubenswrapper[29936]: I1205 13:06:41.204900 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a2c9ca2-9c02-4227-bece-76f7c63b253e" path="/var/lib/kubelet/pods/6a2c9ca2-9c02-4227-bece-76f7c63b253e/volumes" Dec 05 13:06:41.219669 master-0 kubenswrapper[29936]: I1205 13:06:41.217866 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:42.011543 master-0 kubenswrapper[29936]: I1205 13:06:42.011446 29936 generic.go:334] "Generic (PLEG): container finished" podID="d4d7d168-a010-44ef-b2cb-1ec979fb38c6" containerID="141ab7a29ac7f33ed8f5ff26052076988c5f1c4aeaf98072d1684f5c36ae252d" exitCode=0 Dec 05 13:06:42.011543 master-0 kubenswrapper[29936]: I1205 13:06:42.011511 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kcsv8" event={"ID":"d4d7d168-a010-44ef-b2cb-1ec979fb38c6","Type":"ContainerDied","Data":"141ab7a29ac7f33ed8f5ff26052076988c5f1c4aeaf98072d1684f5c36ae252d"} Dec 05 13:06:42.208444 master-0 kubenswrapper[29936]: I1205 13:06:42.207758 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:42.306641 master-0 kubenswrapper[29936]: I1205 13:06:42.306459 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-httpd-run\") pod \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " Dec 05 13:06:42.306641 master-0 kubenswrapper[29936]: I1205 13:06:42.306537 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-scripts\") pod \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " Dec 05 13:06:42.306641 master-0 kubenswrapper[29936]: I1205 13:06:42.306561 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-logs\") pod \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " Dec 05 13:06:42.307020 master-0 kubenswrapper[29936]: I1205 13:06:42.306734 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") pod \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " Dec 05 13:06:42.307020 master-0 kubenswrapper[29936]: I1205 13:06:42.306810 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-config-data\") pod \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " Dec 05 13:06:42.307020 master-0 kubenswrapper[29936]: I1205 13:06:42.306847 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhft2\" (UniqueName: \"kubernetes.io/projected/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-kube-api-access-lhft2\") pod \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " Dec 05 13:06:42.307020 master-0 kubenswrapper[29936]: I1205 13:06:42.306984 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-combined-ca-bundle\") pod \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\" (UID: \"b2456ca5-ce7b-4eb5-a500-48c27677c9d8\") " Dec 05 13:06:42.307394 master-0 kubenswrapper[29936]: I1205 13:06:42.307316 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b2456ca5-ce7b-4eb5-a500-48c27677c9d8" (UID: "b2456ca5-ce7b-4eb5-a500-48c27677c9d8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:06:42.307454 master-0 kubenswrapper[29936]: I1205 13:06:42.307343 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-logs" (OuterVolumeSpecName: "logs") pod "b2456ca5-ce7b-4eb5-a500-48c27677c9d8" (UID: "b2456ca5-ce7b-4eb5-a500-48c27677c9d8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:06:42.311605 master-0 kubenswrapper[29936]: I1205 13:06:42.311523 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-scripts" (OuterVolumeSpecName: "scripts") pod "b2456ca5-ce7b-4eb5-a500-48c27677c9d8" (UID: "b2456ca5-ce7b-4eb5-a500-48c27677c9d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:42.313390 master-0 kubenswrapper[29936]: I1205 13:06:42.313331 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-kube-api-access-lhft2" (OuterVolumeSpecName: "kube-api-access-lhft2") pod "b2456ca5-ce7b-4eb5-a500-48c27677c9d8" (UID: "b2456ca5-ce7b-4eb5-a500-48c27677c9d8"). InnerVolumeSpecName "kube-api-access-lhft2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:42.338596 master-0 kubenswrapper[29936]: I1205 13:06:42.338502 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f" (OuterVolumeSpecName: "glance") pod "b2456ca5-ce7b-4eb5-a500-48c27677c9d8" (UID: "b2456ca5-ce7b-4eb5-a500-48c27677c9d8"). InnerVolumeSpecName "pvc-0d3529a7-2405-43a7-8986-74c66fb23772". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 13:06:42.347902 master-0 kubenswrapper[29936]: I1205 13:06:42.347793 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2456ca5-ce7b-4eb5-a500-48c27677c9d8" (UID: "b2456ca5-ce7b-4eb5-a500-48c27677c9d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:42.374219 master-0 kubenswrapper[29936]: I1205 13:06:42.374086 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-config-data" (OuterVolumeSpecName: "config-data") pod "b2456ca5-ce7b-4eb5-a500-48c27677c9d8" (UID: "b2456ca5-ce7b-4eb5-a500-48c27677c9d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:42.410077 master-0 kubenswrapper[29936]: I1205 13:06:42.409966 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:42.410238 master-0 kubenswrapper[29936]: I1205 13:06:42.410076 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhft2\" (UniqueName: \"kubernetes.io/projected/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-kube-api-access-lhft2\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:42.410238 master-0 kubenswrapper[29936]: I1205 13:06:42.410162 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:42.410238 master-0 kubenswrapper[29936]: I1205 13:06:42.410174 29936 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-httpd-run\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:42.410238 master-0 kubenswrapper[29936]: I1205 13:06:42.410212 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:42.410238 master-0 kubenswrapper[29936]: I1205 13:06:42.410221 29936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2456ca5-ce7b-4eb5-a500-48c27677c9d8-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:42.410425 master-0 kubenswrapper[29936]: I1205 13:06:42.410292 29936 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") on node \"master-0\" " Dec 05 13:06:42.447516 master-0 kubenswrapper[29936]: I1205 13:06:42.447436 29936 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 05 13:06:42.447871 master-0 kubenswrapper[29936]: I1205 13:06:42.447680 29936 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0d3529a7-2405-43a7-8986-74c66fb23772" (UniqueName: "kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f") on node "master-0" Dec 05 13:06:42.517130 master-0 kubenswrapper[29936]: I1205 13:06:42.516997 29936 reconciler_common.go:293] "Volume detached for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:43.033844 master-0 kubenswrapper[29936]: I1205 13:06:43.033735 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-internal-api-0" event={"ID":"b2456ca5-ce7b-4eb5-a500-48c27677c9d8","Type":"ContainerDied","Data":"6b34de3b40a1c4afe72eed353f7c25187850b884bdd16b2200a8524371fc99a8"} Dec 05 13:06:43.033844 master-0 kubenswrapper[29936]: I1205 13:06:43.033842 29936 scope.go:117] "RemoveContainer" containerID="3907560de483e0b5746488074ef7c48cd4801668951115628434b7aa76ec315b" Dec 05 13:06:43.034447 master-0 kubenswrapper[29936]: I1205 13:06:43.033893 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.118533 master-0 kubenswrapper[29936]: I1205 13:06:43.118327 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:06:43.154220 master-0 kubenswrapper[29936]: I1205 13:06:43.145287 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:06:43.166236 master-0 kubenswrapper[29936]: I1205 13:06:43.164075 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:06:43.166236 master-0 kubenswrapper[29936]: E1205 13:06:43.164839 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2456ca5-ce7b-4eb5-a500-48c27677c9d8" containerName="glance-log" Dec 05 13:06:43.166236 master-0 kubenswrapper[29936]: I1205 13:06:43.164863 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2456ca5-ce7b-4eb5-a500-48c27677c9d8" containerName="glance-log" Dec 05 13:06:43.166236 master-0 kubenswrapper[29936]: E1205 13:06:43.164894 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2456ca5-ce7b-4eb5-a500-48c27677c9d8" containerName="glance-httpd" Dec 05 13:06:43.166236 master-0 kubenswrapper[29936]: I1205 13:06:43.164901 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2456ca5-ce7b-4eb5-a500-48c27677c9d8" containerName="glance-httpd" Dec 05 13:06:43.166236 master-0 kubenswrapper[29936]: I1205 13:06:43.165210 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2456ca5-ce7b-4eb5-a500-48c27677c9d8" containerName="glance-log" Dec 05 13:06:43.166236 master-0 kubenswrapper[29936]: I1205 13:06:43.165259 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2456ca5-ce7b-4eb5-a500-48c27677c9d8" containerName="glance-httpd" Dec 05 13:06:43.167913 master-0 kubenswrapper[29936]: I1205 13:06:43.166965 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.171836 master-0 kubenswrapper[29936]: I1205 13:06:43.170745 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b46d8-default-internal-config-data" Dec 05 13:06:43.174426 master-0 kubenswrapper[29936]: I1205 13:06:43.173857 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 13:06:43.180524 master-0 kubenswrapper[29936]: I1205 13:06:43.180436 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:06:43.289121 master-0 kubenswrapper[29936]: I1205 13:06:43.288893 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2456ca5-ce7b-4eb5-a500-48c27677c9d8" path="/var/lib/kubelet/pods/b2456ca5-ce7b-4eb5-a500-48c27677c9d8/volumes" Dec 05 13:06:43.348560 master-0 kubenswrapper[29936]: I1205 13:06:43.346738 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-combined-ca-bundle\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.348560 master-0 kubenswrapper[29936]: I1205 13:06:43.347989 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-internal-tls-certs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.348560 master-0 kubenswrapper[29936]: I1205 13:06:43.348350 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-logs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.348560 master-0 kubenswrapper[29936]: I1205 13:06:43.348435 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-scripts\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.349035 master-0 kubenswrapper[29936]: I1205 13:06:43.348738 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.349035 master-0 kubenswrapper[29936]: I1205 13:06:43.348883 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwvk2\" (UniqueName: \"kubernetes.io/projected/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-kube-api-access-bwvk2\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 
13:06:43.349035 master-0 kubenswrapper[29936]: I1205 13:06:43.348940 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-httpd-run\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.349140 master-0 kubenswrapper[29936]: I1205 13:06:43.349047 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-config-data\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.452563 master-0 kubenswrapper[29936]: I1205 13:06:43.451361 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-internal-tls-certs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.452563 master-0 kubenswrapper[29936]: I1205 13:06:43.451496 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-logs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.452563 master-0 kubenswrapper[29936]: I1205 13:06:43.451522 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-scripts\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.452563 master-0 kubenswrapper[29936]: I1205 13:06:43.451641 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.452563 master-0 kubenswrapper[29936]: I1205 13:06:43.451687 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwvk2\" (UniqueName: \"kubernetes.io/projected/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-kube-api-access-bwvk2\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.452563 master-0 kubenswrapper[29936]: I1205 13:06:43.451711 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-httpd-run\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.452563 master-0 kubenswrapper[29936]: I1205 13:06:43.451748 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-config-data\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.452563 master-0 kubenswrapper[29936]: I1205 13:06:43.451793 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-combined-ca-bundle\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.454903 master-0 kubenswrapper[29936]: I1205 13:06:43.454867 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-httpd-run\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.455554 master-0 kubenswrapper[29936]: I1205 13:06:43.455379 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-logs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.456620 master-0 kubenswrapper[29936]: I1205 13:06:43.456379 29936 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 13:06:43.456620 master-0 kubenswrapper[29936]: I1205 13:06:43.456440 29936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/c11f3d51f30df7daf7a2bb71b828158f184983aebcc191306ab2ed71e7a567d1/globalmount\"" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.456918 master-0 kubenswrapper[29936]: I1205 13:06:43.456887 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-internal-tls-certs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.457127 master-0 kubenswrapper[29936]: I1205 13:06:43.457090 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-combined-ca-bundle\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.462241 master-0 kubenswrapper[29936]: I1205 13:06:43.461752 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-scripts\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.462841 master-0 kubenswrapper[29936]: I1205 13:06:43.462784 29936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-config-data\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:43.476384 master-0 kubenswrapper[29936]: I1205 13:06:43.476328 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwvk2\" (UniqueName: \"kubernetes.io/projected/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-kube-api-access-bwvk2\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:44.070362 master-0 kubenswrapper[29936]: I1205 13:06:44.070266 29936 scope.go:117] "RemoveContainer" containerID="9ded3aeca2213c6c58323ea1046091985fedc44fbba7630e7378acdb7c606b5b" Dec 05 13:06:44.158403 master-0 kubenswrapper[29936]: I1205 13:06:44.158325 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:44.282735 master-0 kubenswrapper[29936]: I1205 13:06:44.282357 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-combined-ca-bundle\") pod \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " Dec 05 13:06:44.282735 master-0 kubenswrapper[29936]: I1205 13:06:44.282507 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-fernet-keys\") pod \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " Dec 05 13:06:44.283571 master-0 kubenswrapper[29936]: I1205 13:06:44.283383 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcbkf\" (UniqueName: \"kubernetes.io/projected/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-kube-api-access-jcbkf\") pod \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " Dec 05 13:06:44.283571 master-0 kubenswrapper[29936]: I1205 13:06:44.283525 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-config-data\") pod \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " Dec 05 13:06:44.284022 master-0 kubenswrapper[29936]: I1205 13:06:44.283951 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-credential-keys\") pod \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " Dec 05 13:06:44.284112 master-0 kubenswrapper[29936]: I1205 13:06:44.284058 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-scripts\") pod \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\" (UID: \"d4d7d168-a010-44ef-b2cb-1ec979fb38c6\") " Dec 05 13:06:44.287739 master-0 kubenswrapper[29936]: I1205 13:06:44.287691 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "d4d7d168-a010-44ef-b2cb-1ec979fb38c6" (UID: "d4d7d168-a010-44ef-b2cb-1ec979fb38c6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:44.290851 master-0 kubenswrapper[29936]: I1205 13:06:44.290788 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-kube-api-access-jcbkf" (OuterVolumeSpecName: "kube-api-access-jcbkf") pod "d4d7d168-a010-44ef-b2cb-1ec979fb38c6" (UID: "d4d7d168-a010-44ef-b2cb-1ec979fb38c6"). InnerVolumeSpecName "kube-api-access-jcbkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:44.293462 master-0 kubenswrapper[29936]: I1205 13:06:44.293423 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-scripts" (OuterVolumeSpecName: "scripts") pod "d4d7d168-a010-44ef-b2cb-1ec979fb38c6" (UID: "d4d7d168-a010-44ef-b2cb-1ec979fb38c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:44.295307 master-0 kubenswrapper[29936]: I1205 13:06:44.295216 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d4d7d168-a010-44ef-b2cb-1ec979fb38c6" (UID: "d4d7d168-a010-44ef-b2cb-1ec979fb38c6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:44.335840 master-0 kubenswrapper[29936]: I1205 13:06:44.327405 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4d7d168-a010-44ef-b2cb-1ec979fb38c6" (UID: "d4d7d168-a010-44ef-b2cb-1ec979fb38c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:44.335840 master-0 kubenswrapper[29936]: I1205 13:06:44.333282 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-config-data" (OuterVolumeSpecName: "config-data") pod "d4d7d168-a010-44ef-b2cb-1ec979fb38c6" (UID: "d4d7d168-a010-44ef-b2cb-1ec979fb38c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:44.390717 master-0 kubenswrapper[29936]: I1205 13:06:44.390618 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:44.390717 master-0 kubenswrapper[29936]: I1205 13:06:44.390679 29936 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-fernet-keys\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:44.390717 master-0 kubenswrapper[29936]: I1205 13:06:44.390695 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcbkf\" (UniqueName: \"kubernetes.io/projected/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-kube-api-access-jcbkf\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:44.390717 master-0 kubenswrapper[29936]: I1205 13:06:44.390710 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:44.390717 master-0 kubenswrapper[29936]: I1205 13:06:44.390721 29936 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-credential-keys\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:44.390717 master-0 kubenswrapper[29936]: I1205 13:06:44.390734 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d7d168-a010-44ef-b2cb-1ec979fb38c6-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:44.839415 master-0 kubenswrapper[29936]: W1205 13:06:44.839343 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fce82a4_d12a_4773_bcc6_37cfc2a46b3f.slice/crio-c385119d00ef28c6515cf4da5d767b06553bd05d74e71885802d9f10e26714e3 WatchSource:0}: Error finding container c385119d00ef28c6515cf4da5d767b06553bd05d74e71885802d9f10e26714e3: Status 404 returned error can't find the container with id c385119d00ef28c6515cf4da5d767b06553bd05d74e71885802d9f10e26714e3 Dec 05 13:06:44.847820 master-0 kubenswrapper[29936]: I1205 13:06:44.847266 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:06:44.891818 master-0 kubenswrapper[29936]: I1205 13:06:44.891733 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:45.041777 master-0 kubenswrapper[29936]: I1205 13:06:45.041694 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:45.069267 master-0 kubenswrapper[29936]: I1205 13:06:45.069167 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kcsv8" Dec 05 13:06:45.069662 master-0 kubenswrapper[29936]: I1205 13:06:45.069198 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kcsv8" event={"ID":"d4d7d168-a010-44ef-b2cb-1ec979fb38c6","Type":"ContainerDied","Data":"2ee0420543936810b2ff2a6db8f57352f5013a4d719780c95894771a269d9f84"} Dec 05 13:06:45.069662 master-0 kubenswrapper[29936]: I1205 13:06:45.069485 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ee0420543936810b2ff2a6db8f57352f5013a4d719780c95894771a269d9f84" Dec 05 13:06:45.071309 master-0 kubenswrapper[29936]: I1205 13:06:45.071244 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-external-api-0" event={"ID":"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f","Type":"ContainerStarted","Data":"c385119d00ef28c6515cf4da5d767b06553bd05d74e71885802d9f10e26714e3"} Dec 05 13:06:45.074669 master-0 kubenswrapper[29936]: I1205 13:06:45.074328 29936 generic.go:334] "Generic (PLEG): container finished" podID="1690e553-8b77-483f-9f31-4f3968e6bd28" containerID="254d7dc18016a8d8ae26387e0351914c2065365423cc1ce28ac2831343ee5f6c" exitCode=0 Dec 05 13:06:45.074669 master-0 kubenswrapper[29936]: I1205 13:06:45.074411 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-h9l4n" event={"ID":"1690e553-8b77-483f-9f31-4f3968e6bd28","Type":"ContainerDied","Data":"254d7dc18016a8d8ae26387e0351914c2065365423cc1ce28ac2831343ee5f6c"} Dec 05 13:06:45.443741 master-0 kubenswrapper[29936]: I1205 13:06:45.443653 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6584d6f967-pjksk"] Dec 05 13:06:45.445559 master-0 kubenswrapper[29936]: E1205 13:06:45.444575 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d7d168-a010-44ef-b2cb-1ec979fb38c6" containerName="keystone-bootstrap" Dec 05 13:06:45.445559 master-0 kubenswrapper[29936]: I1205 13:06:45.444600 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d7d168-a010-44ef-b2cb-1ec979fb38c6" containerName="keystone-bootstrap" Dec 05 13:06:45.445559 master-0 kubenswrapper[29936]: I1205 13:06:45.444950 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d7d168-a010-44ef-b2cb-1ec979fb38c6" containerName="keystone-bootstrap" Dec 05 13:06:45.446161 master-0 kubenswrapper[29936]: I1205 13:06:45.446130 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.454812 master-0 kubenswrapper[29936]: I1205 13:06:45.454675 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 13:06:45.455224 master-0 kubenswrapper[29936]: I1205 13:06:45.454880 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 13:06:45.455416 master-0 kubenswrapper[29936]: I1205 13:06:45.455368 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 05 13:06:45.455638 master-0 kubenswrapper[29936]: I1205 13:06:45.455546 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 05 13:06:45.456546 master-0 kubenswrapper[29936]: I1205 13:06:45.455786 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 13:06:45.496803 master-0 kubenswrapper[29936]: I1205 13:06:45.481597 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6584d6f967-pjksk"] Dec 05 13:06:45.647219 master-0 kubenswrapper[29936]: I1205 13:06:45.638513 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-internal-tls-certs\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.647219 master-0 kubenswrapper[29936]: I1205 13:06:45.638614 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdct8\" (UniqueName: \"kubernetes.io/projected/3ea488c2-a874-45d8-9e61-16aba9ce44e6-kube-api-access-vdct8\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.647219 master-0 kubenswrapper[29936]: I1205 13:06:45.638750 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-config-data\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.647219 master-0 kubenswrapper[29936]: I1205 13:06:45.638917 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-public-tls-certs\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.647219 master-0 kubenswrapper[29936]: I1205 13:06:45.639015 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-scripts\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.647219 master-0 kubenswrapper[29936]: I1205 13:06:45.639102 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-credential-keys\") pod \"keystone-6584d6f967-pjksk\" (UID: 
\"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.647219 master-0 kubenswrapper[29936]: I1205 13:06:45.639239 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-fernet-keys\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.647219 master-0 kubenswrapper[29936]: I1205 13:06:45.639331 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-combined-ca-bundle\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.742637 master-0 kubenswrapper[29936]: I1205 13:06:45.741274 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdct8\" (UniqueName: \"kubernetes.io/projected/3ea488c2-a874-45d8-9e61-16aba9ce44e6-kube-api-access-vdct8\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.742637 master-0 kubenswrapper[29936]: I1205 13:06:45.741388 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-config-data\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.746277 master-0 kubenswrapper[29936]: I1205 13:06:45.744905 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-public-tls-certs\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.746277 master-0 kubenswrapper[29936]: I1205 13:06:45.745025 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-scripts\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.746277 master-0 kubenswrapper[29936]: I1205 13:06:45.745205 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-credential-keys\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.746277 master-0 kubenswrapper[29936]: I1205 13:06:45.745260 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-fernet-keys\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.746277 master-0 kubenswrapper[29936]: I1205 13:06:45.745378 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-combined-ca-bundle\") pod 
\"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.746277 master-0 kubenswrapper[29936]: I1205 13:06:45.745469 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-internal-tls-certs\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.750213 master-0 kubenswrapper[29936]: I1205 13:06:45.746933 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-config-data\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.750213 master-0 kubenswrapper[29936]: I1205 13:06:45.749741 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-public-tls-certs\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.750397 master-0 kubenswrapper[29936]: W1205 13:06:45.750294 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ef1ef52_6a31_4620_bda8_d76e5eaa97f3.slice/crio-971f893e7f84c3e087cd861b6d7f2c0ffab437d179d0b089f41cb191a87ecc89 WatchSource:0}: Error finding container 971f893e7f84c3e087cd861b6d7f2c0ffab437d179d0b089f41cb191a87ecc89: Status 404 returned error can't find the container with id 971f893e7f84c3e087cd861b6d7f2c0ffab437d179d0b089f41cb191a87ecc89 Dec 05 13:06:45.754199 master-0 kubenswrapper[29936]: I1205 13:06:45.750577 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-scripts\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.754199 master-0 kubenswrapper[29936]: I1205 13:06:45.750607 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-combined-ca-bundle\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.754199 master-0 kubenswrapper[29936]: I1205 13:06:45.752615 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-fernet-keys\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.760412 master-0 kubenswrapper[29936]: I1205 13:06:45.759592 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-internal-tls-certs\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.764211 master-0 kubenswrapper[29936]: I1205 13:06:45.760951 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:06:45.774211 master-0 kubenswrapper[29936]: I1205 13:06:45.770863 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ea488c2-a874-45d8-9e61-16aba9ce44e6-credential-keys\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.779221 master-0 kubenswrapper[29936]: I1205 13:06:45.776070 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdct8\" (UniqueName: \"kubernetes.io/projected/3ea488c2-a874-45d8-9e61-16aba9ce44e6-kube-api-access-vdct8\") pod \"keystone-6584d6f967-pjksk\" (UID: \"3ea488c2-a874-45d8-9e61-16aba9ce44e6\") " pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:45.946839 master-0 kubenswrapper[29936]: I1205 13:06:45.946754 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:46.091807 master-0 kubenswrapper[29936]: I1205 13:06:46.091743 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-internal-api-0" event={"ID":"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3","Type":"ContainerStarted","Data":"971f893e7f84c3e087cd861b6d7f2c0ffab437d179d0b089f41cb191a87ecc89"} Dec 05 13:06:46.098806 master-0 kubenswrapper[29936]: I1205 13:06:46.098722 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-external-api-0" event={"ID":"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f","Type":"ContainerStarted","Data":"32763bbc7419a7e1ea06531624d0b3477a23d9b540fc3fbe12ca7af9e3c4701b"} Dec 05 13:06:46.104849 master-0 kubenswrapper[29936]: I1205 13:06:46.103435 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-h9l4n" event={"ID":"1690e553-8b77-483f-9f31-4f3968e6bd28","Type":"ContainerStarted","Data":"87c7ae63b5c1f83dc43365b898e62fdbd922a603605747e0cef5ff0cc54f165a"} Dec 05 13:06:46.706680 master-0 kubenswrapper[29936]: I1205 13:06:46.706609 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-sync-h9l4n" podStartSLOduration=15.341681301 podStartE2EDuration="22.706583738s" podCreationTimestamp="2025-12-05 13:06:24 +0000 UTC" firstStartedPulling="2025-12-05 13:06:36.872867758 +0000 UTC m=+994.004947439" lastFinishedPulling="2025-12-05 13:06:44.237770195 +0000 UTC m=+1001.369849876" observedRunningTime="2025-12-05 13:06:46.150099149 +0000 UTC m=+1003.282178840" watchObservedRunningTime="2025-12-05 13:06:46.706583738 +0000 UTC m=+1003.838663419" Dec 05 13:06:46.734210 master-0 kubenswrapper[29936]: I1205 13:06:46.731064 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6584d6f967-pjksk"] Dec 05 13:06:47.125210 master-0 kubenswrapper[29936]: I1205 13:06:47.124516 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-internal-api-0" event={"ID":"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3","Type":"ContainerStarted","Data":"91537211166e171704ba5738cc011df13401c26aa877dfe384da4a0d9bc16e8f"} Dec 05 13:06:47.129202 master-0 kubenswrapper[29936]: I1205 13:06:47.128770 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-external-api-0" event={"ID":"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f","Type":"ContainerStarted","Data":"3d7fd03fb9af5943d7ca0daceaacc49e1f67c092ce546e5fe5caa7835b1d52b4"} Dec 05 13:06:47.134274 master-0 
kubenswrapper[29936]: I1205 13:06:47.133852 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6584d6f967-pjksk" event={"ID":"3ea488c2-a874-45d8-9e61-16aba9ce44e6","Type":"ContainerStarted","Data":"983e1c1943210f9d01a412c4411f1996cf81a483b432dcbcbbb7fcd58269d991"} Dec 05 13:06:47.134274 master-0 kubenswrapper[29936]: I1205 13:06:47.133894 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:06:47.134274 master-0 kubenswrapper[29936]: I1205 13:06:47.133906 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6584d6f967-pjksk" event={"ID":"3ea488c2-a874-45d8-9e61-16aba9ce44e6","Type":"ContainerStarted","Data":"d8ae2bd3117e93381c59c10573c571574fef7ad96e03cd3a3b34291b238f5c2a"} Dec 05 13:06:47.181727 master-0 kubenswrapper[29936]: I1205 13:06:47.181621 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b46d8-default-external-api-0" podStartSLOduration=8.181593782 podStartE2EDuration="8.181593782s" podCreationTimestamp="2025-12-05 13:06:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:06:47.160477457 +0000 UTC m=+1004.292557148" watchObservedRunningTime="2025-12-05 13:06:47.181593782 +0000 UTC m=+1004.313673453" Dec 05 13:06:47.221347 master-0 kubenswrapper[29936]: I1205 13:06:47.219609 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6584d6f967-pjksk" podStartSLOduration=2.219578333 podStartE2EDuration="2.219578333s" podCreationTimestamp="2025-12-05 13:06:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:06:47.189885896 +0000 UTC m=+1004.321965597" watchObservedRunningTime="2025-12-05 13:06:47.219578333 +0000 UTC m=+1004.351658014" Dec 05 13:06:48.158105 master-0 kubenswrapper[29936]: I1205 13:06:48.158007 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-internal-api-0" event={"ID":"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3","Type":"ContainerStarted","Data":"583c77d7821dfe955594ede88ab11361e9b1c678ebf2612190f5915df17f6217"} Dec 05 13:06:48.162728 master-0 kubenswrapper[29936]: I1205 13:06:48.162649 29936 generic.go:334] "Generic (PLEG): container finished" podID="b82bd984-5f64-433b-a41a-f5186287a0f7" containerID="8c9fdd2f2167aecbf9110e725bed0a387f56041962376d469528fcc3bbcc45a3" exitCode=0 Dec 05 13:06:48.163964 master-0 kubenswrapper[29936]: I1205 13:06:48.163906 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-db-sync-6scb5" event={"ID":"b82bd984-5f64-433b-a41a-f5186287a0f7","Type":"ContainerDied","Data":"8c9fdd2f2167aecbf9110e725bed0a387f56041962376d469528fcc3bbcc45a3"} Dec 05 13:06:48.366384 master-0 kubenswrapper[29936]: I1205 13:06:48.366092 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b46d8-default-internal-api-0" podStartSLOduration=5.366060465 podStartE2EDuration="5.366060465s" podCreationTimestamp="2025-12-05 13:06:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:06:48.355165653 +0000 UTC m=+1005.487245364" watchObservedRunningTime="2025-12-05 13:06:48.366060465 +0000 UTC m=+1005.498140146" Dec 05 13:06:49.177235 master-0 kubenswrapper[29936]: 
I1205 13:06:49.177040 29936 generic.go:334] "Generic (PLEG): container finished" podID="549e5366-4e6e-4d97-aeb2-25f74ce81b4b" containerID="2ba41f0827deea62e1400136039f07960c0733677767da6118aec92285f52a1d" exitCode=0 Dec 05 13:06:49.177235 master-0 kubenswrapper[29936]: I1205 13:06:49.177143 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d6bnq" event={"ID":"549e5366-4e6e-4d97-aeb2-25f74ce81b4b","Type":"ContainerDied","Data":"2ba41f0827deea62e1400136039f07960c0733677767da6118aec92285f52a1d"} Dec 05 13:06:49.620407 master-0 kubenswrapper[29936]: I1205 13:06:49.620343 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:49.802725 master-0 kubenswrapper[29936]: I1205 13:06:49.802576 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-config-data\") pod \"b82bd984-5f64-433b-a41a-f5186287a0f7\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " Dec 05 13:06:49.802725 master-0 kubenswrapper[29936]: I1205 13:06:49.802695 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b82bd984-5f64-433b-a41a-f5186287a0f7-etc-machine-id\") pod \"b82bd984-5f64-433b-a41a-f5186287a0f7\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " Dec 05 13:06:49.803042 master-0 kubenswrapper[29936]: I1205 13:06:49.802736 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-combined-ca-bundle\") pod \"b82bd984-5f64-433b-a41a-f5186287a0f7\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " Dec 05 13:06:49.803042 master-0 kubenswrapper[29936]: I1205 13:06:49.802845 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-db-sync-config-data\") pod \"b82bd984-5f64-433b-a41a-f5186287a0f7\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " Dec 05 13:06:49.803042 master-0 kubenswrapper[29936]: I1205 13:06:49.802841 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b82bd984-5f64-433b-a41a-f5186287a0f7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b82bd984-5f64-433b-a41a-f5186287a0f7" (UID: "b82bd984-5f64-433b-a41a-f5186287a0f7"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:06:49.803042 master-0 kubenswrapper[29936]: I1205 13:06:49.802877 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxgsx\" (UniqueName: \"kubernetes.io/projected/b82bd984-5f64-433b-a41a-f5186287a0f7-kube-api-access-sxgsx\") pod \"b82bd984-5f64-433b-a41a-f5186287a0f7\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " Dec 05 13:06:49.803196 master-0 kubenswrapper[29936]: I1205 13:06:49.803096 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-scripts\") pod \"b82bd984-5f64-433b-a41a-f5186287a0f7\" (UID: \"b82bd984-5f64-433b-a41a-f5186287a0f7\") " Dec 05 13:06:49.803653 master-0 kubenswrapper[29936]: I1205 13:06:49.803624 29936 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b82bd984-5f64-433b-a41a-f5186287a0f7-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:49.807018 master-0 kubenswrapper[29936]: I1205 13:06:49.806923 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b82bd984-5f64-433b-a41a-f5186287a0f7" (UID: "b82bd984-5f64-433b-a41a-f5186287a0f7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:49.807994 master-0 kubenswrapper[29936]: I1205 13:06:49.807915 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-scripts" (OuterVolumeSpecName: "scripts") pod "b82bd984-5f64-433b-a41a-f5186287a0f7" (UID: "b82bd984-5f64-433b-a41a-f5186287a0f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:49.808234 master-0 kubenswrapper[29936]: I1205 13:06:49.808152 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b82bd984-5f64-433b-a41a-f5186287a0f7-kube-api-access-sxgsx" (OuterVolumeSpecName: "kube-api-access-sxgsx") pod "b82bd984-5f64-433b-a41a-f5186287a0f7" (UID: "b82bd984-5f64-433b-a41a-f5186287a0f7"). InnerVolumeSpecName "kube-api-access-sxgsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:49.837941 master-0 kubenswrapper[29936]: I1205 13:06:49.837850 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b82bd984-5f64-433b-a41a-f5186287a0f7" (UID: "b82bd984-5f64-433b-a41a-f5186287a0f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:49.871732 master-0 kubenswrapper[29936]: I1205 13:06:49.871573 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-config-data" (OuterVolumeSpecName: "config-data") pod "b82bd984-5f64-433b-a41a-f5186287a0f7" (UID: "b82bd984-5f64-433b-a41a-f5186287a0f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:49.906419 master-0 kubenswrapper[29936]: I1205 13:06:49.906353 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:49.906982 master-0 kubenswrapper[29936]: I1205 13:06:49.906968 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:49.907130 master-0 kubenswrapper[29936]: I1205 13:06:49.907073 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:49.907267 master-0 kubenswrapper[29936]: I1205 13:06:49.907245 29936 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b82bd984-5f64-433b-a41a-f5186287a0f7-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:49.907355 master-0 kubenswrapper[29936]: I1205 13:06:49.907340 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxgsx\" (UniqueName: \"kubernetes.io/projected/b82bd984-5f64-433b-a41a-f5186287a0f7-kube-api-access-sxgsx\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:50.225600 master-0 kubenswrapper[29936]: I1205 13:06:50.225533 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-db-sync-6scb5" Dec 05 13:06:50.225600 master-0 kubenswrapper[29936]: I1205 13:06:50.225604 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-db-sync-6scb5" event={"ID":"b82bd984-5f64-433b-a41a-f5186287a0f7","Type":"ContainerDied","Data":"79b83ccd8e3bf4e69a70ccd6f3ac4837419a37e1205366adf6336ac214099e09"} Dec 05 13:06:50.226390 master-0 kubenswrapper[29936]: I1205 13:06:50.225636 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79b83ccd8e3bf4e69a70ccd6f3ac4837419a37e1205366adf6336ac214099e09" Dec 05 13:06:50.487078 master-0 kubenswrapper[29936]: I1205 13:06:50.486910 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b46d8-scheduler-0"] Dec 05 13:06:50.487804 master-0 kubenswrapper[29936]: E1205 13:06:50.487769 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b82bd984-5f64-433b-a41a-f5186287a0f7" containerName="cinder-b46d8-db-sync" Dec 05 13:06:50.487804 master-0 kubenswrapper[29936]: I1205 13:06:50.487802 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b82bd984-5f64-433b-a41a-f5186287a0f7" containerName="cinder-b46d8-db-sync" Dec 05 13:06:50.488259 master-0 kubenswrapper[29936]: I1205 13:06:50.488228 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b82bd984-5f64-433b-a41a-f5186287a0f7" containerName="cinder-b46d8-db-sync" Dec 05 13:06:50.490420 master-0 kubenswrapper[29936]: I1205 13:06:50.490351 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.499898 master-0 kubenswrapper[29936]: I1205 13:06:50.499797 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b46d8-config-data" Dec 05 13:06:50.502483 master-0 kubenswrapper[29936]: I1205 13:06:50.501506 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b46d8-scripts" Dec 05 13:06:50.502483 master-0 kubenswrapper[29936]: I1205 13:06:50.501815 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b46d8-scheduler-config-data" Dec 05 13:06:50.530904 master-0 kubenswrapper[29936]: I1205 13:06:50.527371 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b46d8-volume-lvm-iscsi-0"] Dec 05 13:06:50.530904 master-0 kubenswrapper[29936]: I1205 13:06:50.530124 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.542206 master-0 kubenswrapper[29936]: I1205 13:06:50.535787 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b46d8-volume-lvm-iscsi-config-data" Dec 05 13:06:50.610295 master-0 kubenswrapper[29936]: I1205 13:06:50.610074 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-scheduler-0"] Dec 05 13:06:50.638491 master-0 kubenswrapper[29936]: I1205 13:06:50.638422 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-dev\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.638824 master-0 kubenswrapper[29936]: I1205 13:06:50.638803 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-sys\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.638996 master-0 kubenswrapper[29936]: I1205 13:06:50.638975 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-config-data\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.639388 master-0 kubenswrapper[29936]: I1205 13:06:50.639261 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-scripts\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.639827 master-0 kubenswrapper[29936]: I1205 13:06:50.639412 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twkxl\" (UniqueName: \"kubernetes.io/projected/c2913bf6-55a0-42ff-b4ca-e8e39335d588-kube-api-access-twkxl\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.639827 master-0 kubenswrapper[29936]: I1205 13:06:50.639581 29936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-config-data-custom\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.639827 master-0 kubenswrapper[29936]: I1205 13:06:50.639805 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-locks-brick\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.639995 master-0 kubenswrapper[29936]: I1205 13:06:50.639834 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-nvme\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.640148 master-0 kubenswrapper[29936]: I1205 13:06:50.640118 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9wjg\" (UniqueName: \"kubernetes.io/projected/062a3b87-3828-49fb-8b3b-099ef02fe5a3-kube-api-access-c9wjg\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.640370 master-0 kubenswrapper[29936]: I1205 13:06:50.640334 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/062a3b87-3828-49fb-8b3b-099ef02fe5a3-etc-machine-id\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.640438 master-0 kubenswrapper[29936]: I1205 13:06:50.640401 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-iscsi\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.640490 master-0 kubenswrapper[29936]: I1205 13:06:50.640443 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-locks-cinder\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.640490 master-0 kubenswrapper[29936]: I1205 13:06:50.640481 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-machine-id\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.640580 master-0 kubenswrapper[29936]: I1205 13:06:50.640541 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-lib-cinder\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.641427 master-0 kubenswrapper[29936]: I1205 13:06:50.640676 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-config-data\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.641427 master-0 kubenswrapper[29936]: I1205 13:06:50.640739 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-config-data-custom\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.641427 master-0 kubenswrapper[29936]: I1205 13:06:50.640830 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-combined-ca-bundle\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.641427 master-0 kubenswrapper[29936]: I1205 13:06:50.640926 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-scripts\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.641427 master-0 kubenswrapper[29936]: I1205 13:06:50.640950 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-lib-modules\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.641427 master-0 kubenswrapper[29936]: I1205 13:06:50.640979 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-combined-ca-bundle\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.641427 master-0 kubenswrapper[29936]: I1205 13:06:50.641024 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-run\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.662412 master-0 kubenswrapper[29936]: I1205 13:06:50.662321 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b46d8-backup-0"] Dec 05 13:06:50.667413 master-0 kubenswrapper[29936]: I1205 13:06:50.667123 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.671207 master-0 kubenswrapper[29936]: I1205 13:06:50.670891 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b46d8-backup-config-data" Dec 05 13:06:50.709550 master-0 kubenswrapper[29936]: I1205 13:06:50.709477 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-volume-lvm-iscsi-0"] Dec 05 13:06:50.741377 master-0 kubenswrapper[29936]: I1205 13:06:50.739524 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-backup-0"] Dec 05 13:06:50.749352 master-0 kubenswrapper[29936]: I1205 13:06:50.748951 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/062a3b87-3828-49fb-8b3b-099ef02fe5a3-etc-machine-id\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.749352 master-0 kubenswrapper[29936]: I1205 13:06:50.749041 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-iscsi\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.749352 master-0 kubenswrapper[29936]: I1205 13:06:50.749132 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-iscsi\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.749352 master-0 kubenswrapper[29936]: I1205 13:06:50.749170 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/062a3b87-3828-49fb-8b3b-099ef02fe5a3-etc-machine-id\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.749352 master-0 kubenswrapper[29936]: I1205 13:06:50.749287 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-locks-cinder\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.749352 master-0 kubenswrapper[29936]: I1205 13:06:50.749329 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-machine-id\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.749352 master-0 kubenswrapper[29936]: I1205 13:06:50.749378 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-lib-cinder\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.749883 master-0 kubenswrapper[29936]: I1205 13:06:50.749432 29936 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-config-data\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.749883 master-0 kubenswrapper[29936]: I1205 13:06:50.749469 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-config-data-custom\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.749883 master-0 kubenswrapper[29936]: I1205 13:06:50.749525 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-combined-ca-bundle\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.749883 master-0 kubenswrapper[29936]: I1205 13:06:50.749578 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-lib-modules\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.749883 master-0 kubenswrapper[29936]: I1205 13:06:50.749600 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-combined-ca-bundle\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.749883 master-0 kubenswrapper[29936]: I1205 13:06:50.749625 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-scripts\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.749883 master-0 kubenswrapper[29936]: I1205 13:06:50.749651 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-run\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.749883 master-0 kubenswrapper[29936]: I1205 13:06:50.749743 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-dev\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.749883 master-0 kubenswrapper[29936]: I1205 13:06:50.749766 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-sys\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.749883 master-0 kubenswrapper[29936]: I1205 13:06:50.749839 29936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-config-data\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.749883 master-0 kubenswrapper[29936]: I1205 13:06:50.749873 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-scripts\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.750311 master-0 kubenswrapper[29936]: I1205 13:06:50.749902 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twkxl\" (UniqueName: \"kubernetes.io/projected/c2913bf6-55a0-42ff-b4ca-e8e39335d588-kube-api-access-twkxl\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.750311 master-0 kubenswrapper[29936]: I1205 13:06:50.749948 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-config-data-custom\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.750311 master-0 kubenswrapper[29936]: I1205 13:06:50.750039 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-locks-brick\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.750311 master-0 kubenswrapper[29936]: I1205 13:06:50.750070 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-nvme\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.750311 master-0 kubenswrapper[29936]: I1205 13:06:50.750153 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9wjg\" (UniqueName: \"kubernetes.io/projected/062a3b87-3828-49fb-8b3b-099ef02fe5a3-kube-api-access-c9wjg\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.753125 master-0 kubenswrapper[29936]: I1205 13:06:50.751711 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-run\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.753125 master-0 kubenswrapper[29936]: I1205 13:06:50.751981 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-nvme\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.753125 master-0 kubenswrapper[29936]: I1205 13:06:50.752080 
29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-dev\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.753125 master-0 kubenswrapper[29936]: I1205 13:06:50.752369 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-lib-cinder\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.753125 master-0 kubenswrapper[29936]: I1205 13:06:50.752712 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-locks-cinder\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.753417 master-0 kubenswrapper[29936]: I1205 13:06:50.753137 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-machine-id\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.753417 master-0 kubenswrapper[29936]: I1205 13:06:50.753317 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-lib-modules\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.760301 master-0 kubenswrapper[29936]: I1205 13:06:50.757896 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-locks-brick\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.762262 master-0 kubenswrapper[29936]: I1205 13:06:50.762206 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-combined-ca-bundle\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.762385 master-0 kubenswrapper[29936]: I1205 13:06:50.762304 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-sys\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.763759 master-0 kubenswrapper[29936]: I1205 13:06:50.763597 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-config-data\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.765150 master-0 kubenswrapper[29936]: I1205 13:06:50.765016 29936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-config-data-custom\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.772703 master-0 kubenswrapper[29936]: I1205 13:06:50.767091 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-scripts\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.772703 master-0 kubenswrapper[29936]: I1205 13:06:50.767341 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-config-data-custom\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.772703 master-0 kubenswrapper[29936]: I1205 13:06:50.767795 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-config-data\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.772703 master-0 kubenswrapper[29936]: I1205 13:06:50.769982 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-scripts\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.772703 master-0 kubenswrapper[29936]: I1205 13:06:50.770133 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-combined-ca-bundle\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.774435 master-0 kubenswrapper[29936]: I1205 13:06:50.774372 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6566b9889c-g47n7"] Dec 05 13:06:50.776748 master-0 kubenswrapper[29936]: I1205 13:06:50.776715 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:50.797158 master-0 kubenswrapper[29936]: I1205 13:06:50.796063 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twkxl\" (UniqueName: \"kubernetes.io/projected/c2913bf6-55a0-42ff-b4ca-e8e39335d588-kube-api-access-twkxl\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.803067 master-0 kubenswrapper[29936]: I1205 13:06:50.802979 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9wjg\" (UniqueName: \"kubernetes.io/projected/062a3b87-3828-49fb-8b3b-099ef02fe5a3-kube-api-access-c9wjg\") pod \"cinder-b46d8-scheduler-0\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.840261 master-0 kubenswrapper[29936]: I1205 13:06:50.840111 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6566b9889c-g47n7"] Dec 05 13:06:50.855943 master-0 kubenswrapper[29936]: I1205 13:06:50.855828 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-combined-ca-bundle\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.856235 master-0 kubenswrapper[29936]: I1205 13:06:50.856000 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-machine-id\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.856235 master-0 kubenswrapper[29936]: I1205 13:06:50.856048 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-dev\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.856235 master-0 kubenswrapper[29936]: I1205 13:06:50.856112 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-scripts\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.856235 master-0 kubenswrapper[29936]: I1205 13:06:50.856151 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-config-data-custom\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.856235 master-0 kubenswrapper[29936]: I1205 13:06:50.856224 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-lib-cinder\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.856408 master-0 kubenswrapper[29936]: I1205 
13:06:50.856284 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-locks-cinder\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.856408 master-0 kubenswrapper[29936]: I1205 13:06:50.856380 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-run\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.856482 master-0 kubenswrapper[29936]: I1205 13:06:50.856460 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-sys\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.856517 master-0 kubenswrapper[29936]: I1205 13:06:50.856481 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtgzd\" (UniqueName: \"kubernetes.io/projected/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-kube-api-access-wtgzd\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.856629 master-0 kubenswrapper[29936]: I1205 13:06:50.856583 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-config-data\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.856694 master-0 kubenswrapper[29936]: I1205 13:06:50.856670 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-locks-brick\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.857073 master-0 kubenswrapper[29936]: I1205 13:06:50.856788 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-lib-modules\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.857073 master-0 kubenswrapper[29936]: I1205 13:06:50.856879 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-iscsi\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.860968 master-0 kubenswrapper[29936]: I1205 13:06:50.859650 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-nvme\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 
05 13:06:50.899146 master-0 kubenswrapper[29936]: I1205 13:06:50.898473 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:06:50.899146 master-0 kubenswrapper[29936]: I1205 13:06:50.898613 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:06:50.906989 master-0 kubenswrapper[29936]: I1205 13:06:50.906365 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b46d8-api-0"] Dec 05 13:06:50.909455 master-0 kubenswrapper[29936]: I1205 13:06:50.909114 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:50.912813 master-0 kubenswrapper[29936]: I1205 13:06:50.912775 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b46d8-api-config-data" Dec 05 13:06:50.949794 master-0 kubenswrapper[29936]: I1205 13:06:50.943745 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-d6bnq" Dec 05 13:06:50.962337 master-0 kubenswrapper[29936]: I1205 13:06:50.961851 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-api-0"] Dec 05 13:06:50.962924 master-0 kubenswrapper[29936]: I1205 13:06:50.962877 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-dev\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.963003 master-0 kubenswrapper[29936]: I1205 13:06:50.962957 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-scripts\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.963059 master-0 kubenswrapper[29936]: I1205 13:06:50.962999 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-config-data-custom\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.963059 master-0 kubenswrapper[29936]: I1205 13:06:50.963027 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-lib-cinder\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.963162 master-0 kubenswrapper[29936]: I1205 13:06:50.963074 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-locks-cinder\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.963162 master-0 kubenswrapper[29936]: I1205 13:06:50.963125 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-ovsdbserver-sb\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: 
\"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:50.964408 master-0 kubenswrapper[29936]: I1205 13:06:50.963159 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-run\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.964408 master-0 kubenswrapper[29936]: I1205 13:06:50.963286 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtgzd\" (UniqueName: \"kubernetes.io/projected/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-kube-api-access-wtgzd\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.964408 master-0 kubenswrapper[29936]: I1205 13:06:50.963318 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-sys\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.964408 master-0 kubenswrapper[29936]: I1205 13:06:50.963351 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-config\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:50.964408 master-0 kubenswrapper[29936]: I1205 13:06:50.964293 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-config-data\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.964408 master-0 kubenswrapper[29936]: I1205 13:06:50.964325 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-locks-brick\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.964703 master-0 kubenswrapper[29936]: I1205 13:06:50.964423 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-dns-swift-storage-0\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:50.964703 master-0 kubenswrapper[29936]: I1205 13:06:50.964458 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-lib-modules\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.964703 master-0 kubenswrapper[29936]: I1205 13:06:50.964534 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-iscsi\") pod \"cinder-b46d8-backup-0\" (UID: 
\"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.964703 master-0 kubenswrapper[29936]: I1205 13:06:50.964580 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-nvme\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.964703 master-0 kubenswrapper[29936]: I1205 13:06:50.964635 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5hgf\" (UniqueName: \"kubernetes.io/projected/757653e2-456c-426e-b7d5-bc1ad57fda3f-kube-api-access-n5hgf\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:50.964923 master-0 kubenswrapper[29936]: I1205 13:06:50.964722 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-dns-svc\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:50.964923 master-0 kubenswrapper[29936]: I1205 13:06:50.964774 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-combined-ca-bundle\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.965213 master-0 kubenswrapper[29936]: I1205 13:06:50.964881 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-ovsdbserver-nb\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:50.965516 master-0 kubenswrapper[29936]: I1205 13:06:50.965442 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-run\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.966629 master-0 kubenswrapper[29936]: I1205 13:06:50.966304 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-sys\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.966629 master-0 kubenswrapper[29936]: I1205 13:06:50.966472 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-locks-cinder\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.966629 master-0 kubenswrapper[29936]: I1205 13:06:50.966530 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-lib-cinder\") pod \"cinder-b46d8-backup-0\" (UID: 
\"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.968026 master-0 kubenswrapper[29936]: I1205 13:06:50.967993 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-iscsi\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.968154 master-0 kubenswrapper[29936]: I1205 13:06:50.968045 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-machine-id\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.969155 master-0 kubenswrapper[29936]: I1205 13:06:50.969116 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-machine-id\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.969257 master-0 kubenswrapper[29936]: I1205 13:06:50.969202 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-locks-brick\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.969257 master-0 kubenswrapper[29936]: I1205 13:06:50.969254 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-nvme\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.969355 master-0 kubenswrapper[29936]: I1205 13:06:50.969272 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-lib-modules\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.969915 master-0 kubenswrapper[29936]: I1205 13:06:50.969879 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-dev\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.972831 master-0 kubenswrapper[29936]: I1205 13:06:50.972781 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-scripts\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.974883 master-0 kubenswrapper[29936]: I1205 13:06:50.974819 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-combined-ca-bundle\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.980050 master-0 kubenswrapper[29936]: I1205 13:06:50.977210 29936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-config-data\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.980050 master-0 kubenswrapper[29936]: I1205 13:06:50.978762 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-config-data-custom\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.988772 master-0 kubenswrapper[29936]: I1205 13:06:50.988695 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtgzd\" (UniqueName: \"kubernetes.io/projected/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-kube-api-access-wtgzd\") pod \"cinder-b46d8-backup-0\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:50.993999 master-0 kubenswrapper[29936]: I1205 13:06:50.993882 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-backup-0" Dec 05 13:06:51.072752 master-0 kubenswrapper[29936]: I1205 13:06:51.072683 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-config\") pod \"549e5366-4e6e-4d97-aeb2-25f74ce81b4b\" (UID: \"549e5366-4e6e-4d97-aeb2-25f74ce81b4b\") " Dec 05 13:06:51.073086 master-0 kubenswrapper[29936]: I1205 13:06:51.073061 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8wwf\" (UniqueName: \"kubernetes.io/projected/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-kube-api-access-q8wwf\") pod \"549e5366-4e6e-4d97-aeb2-25f74ce81b4b\" (UID: \"549e5366-4e6e-4d97-aeb2-25f74ce81b4b\") " Dec 05 13:06:51.073160 master-0 kubenswrapper[29936]: I1205 13:06:51.073098 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-combined-ca-bundle\") pod \"549e5366-4e6e-4d97-aeb2-25f74ce81b4b\" (UID: \"549e5366-4e6e-4d97-aeb2-25f74ce81b4b\") " Dec 05 13:06:51.073567 master-0 kubenswrapper[29936]: I1205 13:06:51.073539 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9875035-8030-4010-9fdf-e44664936896-etc-machine-id\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.073673 master-0 kubenswrapper[29936]: I1205 13:06:51.073577 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-config-data\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.073673 master-0 kubenswrapper[29936]: I1205 13:06:51.073613 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-scripts\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 
13:06:51.073673 master-0 kubenswrapper[29936]: I1205 13:06:51.073640 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5hgf\" (UniqueName: \"kubernetes.io/projected/757653e2-456c-426e-b7d5-bc1ad57fda3f-kube-api-access-n5hgf\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:51.073673 master-0 kubenswrapper[29936]: I1205 13:06:51.073658 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-config-data-custom\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.073815 master-0 kubenswrapper[29936]: I1205 13:06:51.073703 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-dns-svc\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:51.073815 master-0 kubenswrapper[29936]: I1205 13:06:51.073754 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-ovsdbserver-nb\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:51.073880 master-0 kubenswrapper[29936]: I1205 13:06:51.073817 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-ovsdbserver-sb\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:51.073880 master-0 kubenswrapper[29936]: I1205 13:06:51.073863 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-combined-ca-bundle\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.073943 master-0 kubenswrapper[29936]: I1205 13:06:51.073886 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mtdn\" (UniqueName: \"kubernetes.io/projected/f9875035-8030-4010-9fdf-e44664936896-kube-api-access-5mtdn\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.073943 master-0 kubenswrapper[29936]: I1205 13:06:51.073918 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-config\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:51.073943 master-0 kubenswrapper[29936]: I1205 13:06:51.073935 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9875035-8030-4010-9fdf-e44664936896-logs\") pod \"cinder-b46d8-api-0\" (UID: 
\"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.074034 master-0 kubenswrapper[29936]: I1205 13:06:51.073982 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-dns-swift-storage-0\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:51.075393 master-0 kubenswrapper[29936]: I1205 13:06:51.075349 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-dns-swift-storage-0\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:51.077503 master-0 kubenswrapper[29936]: I1205 13:06:51.076862 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-ovsdbserver-nb\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:51.077503 master-0 kubenswrapper[29936]: I1205 13:06:51.076959 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-ovsdbserver-sb\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:51.077675 master-0 kubenswrapper[29936]: I1205 13:06:51.077624 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-config\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:51.078328 master-0 kubenswrapper[29936]: I1205 13:06:51.077926 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-dns-svc\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:51.079486 master-0 kubenswrapper[29936]: I1205 13:06:51.079427 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-kube-api-access-q8wwf" (OuterVolumeSpecName: "kube-api-access-q8wwf") pod "549e5366-4e6e-4d97-aeb2-25f74ce81b4b" (UID: "549e5366-4e6e-4d97-aeb2-25f74ce81b4b"). InnerVolumeSpecName "kube-api-access-q8wwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:51.100250 master-0 kubenswrapper[29936]: I1205 13:06:51.100163 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5hgf\" (UniqueName: \"kubernetes.io/projected/757653e2-456c-426e-b7d5-bc1ad57fda3f-kube-api-access-n5hgf\") pod \"dnsmasq-dns-6566b9889c-g47n7\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:51.128115 master-0 kubenswrapper[29936]: I1205 13:06:51.128009 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-config" (OuterVolumeSpecName: "config") pod "549e5366-4e6e-4d97-aeb2-25f74ce81b4b" (UID: "549e5366-4e6e-4d97-aeb2-25f74ce81b4b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:51.166142 master-0 kubenswrapper[29936]: I1205 13:06:51.166039 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "549e5366-4e6e-4d97-aeb2-25f74ce81b4b" (UID: "549e5366-4e6e-4d97-aeb2-25f74ce81b4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:51.177683 master-0 kubenswrapper[29936]: I1205 13:06:51.177595 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9875035-8030-4010-9fdf-e44664936896-etc-machine-id\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.177931 master-0 kubenswrapper[29936]: I1205 13:06:51.177734 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9875035-8030-4010-9fdf-e44664936896-etc-machine-id\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.178188 master-0 kubenswrapper[29936]: I1205 13:06:51.178141 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-config-data\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.178923 master-0 kubenswrapper[29936]: I1205 13:06:51.178883 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-scripts\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.179011 master-0 kubenswrapper[29936]: I1205 13:06:51.178971 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-config-data-custom\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.181593 master-0 kubenswrapper[29936]: I1205 13:06:51.180614 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-combined-ca-bundle\") pod \"cinder-b46d8-api-0\" (UID: 
\"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.181593 master-0 kubenswrapper[29936]: I1205 13:06:51.180681 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mtdn\" (UniqueName: \"kubernetes.io/projected/f9875035-8030-4010-9fdf-e44664936896-kube-api-access-5mtdn\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.181593 master-0 kubenswrapper[29936]: I1205 13:06:51.180788 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9875035-8030-4010-9fdf-e44664936896-logs\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.181593 master-0 kubenswrapper[29936]: I1205 13:06:51.181146 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:51.181593 master-0 kubenswrapper[29936]: I1205 13:06:51.181219 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8wwf\" (UniqueName: \"kubernetes.io/projected/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-kube-api-access-q8wwf\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:51.181593 master-0 kubenswrapper[29936]: I1205 13:06:51.181233 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/549e5366-4e6e-4d97-aeb2-25f74ce81b4b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:51.181593 master-0 kubenswrapper[29936]: I1205 13:06:51.181343 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9875035-8030-4010-9fdf-e44664936896-logs\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.183066 master-0 kubenswrapper[29936]: I1205 13:06:51.182941 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-config-data-custom\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.186950 master-0 kubenswrapper[29936]: I1205 13:06:51.186893 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-scripts\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.187858 master-0 kubenswrapper[29936]: I1205 13:06:51.187802 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-config-data\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.202136 master-0 kubenswrapper[29936]: I1205 13:06:51.202044 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mtdn\" (UniqueName: \"kubernetes.io/projected/f9875035-8030-4010-9fdf-e44664936896-kube-api-access-5mtdn\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 
13:06:51.219654 master-0 kubenswrapper[29936]: I1205 13:06:51.218818 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:51.220052 master-0 kubenswrapper[29936]: I1205 13:06:51.219978 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-combined-ca-bundle\") pod \"cinder-b46d8-api-0\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.221273 master-0 kubenswrapper[29936]: I1205 13:06:51.221245 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:51.250620 master-0 kubenswrapper[29936]: I1205 13:06:51.250298 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-d6bnq" Dec 05 13:06:51.251242 master-0 kubenswrapper[29936]: I1205 13:06:51.250851 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-d6bnq" event={"ID":"549e5366-4e6e-4d97-aeb2-25f74ce81b4b","Type":"ContainerDied","Data":"ca835c4a093dce5df433a43aa8a98644fb19627d451c3f4f3614978ab61beec4"} Dec 05 13:06:51.251242 master-0 kubenswrapper[29936]: I1205 13:06:51.250925 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca835c4a093dce5df433a43aa8a98644fb19627d451c3f4f3614978ab61beec4" Dec 05 13:06:51.284333 master-0 kubenswrapper[29936]: I1205 13:06:51.280428 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:51.289300 master-0 kubenswrapper[29936]: I1205 13:06:51.289241 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:51.381875 master-0 kubenswrapper[29936]: I1205 13:06:51.381758 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:51.397453 master-0 kubenswrapper[29936]: I1205 13:06:51.397365 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:51.569471 master-0 kubenswrapper[29936]: I1205 13:06:51.569405 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-volume-lvm-iscsi-0"] Dec 05 13:06:51.774796 master-0 kubenswrapper[29936]: I1205 13:06:51.772991 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6566b9889c-g47n7"] Dec 05 13:06:51.798370 master-0 kubenswrapper[29936]: I1205 13:06:51.798301 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f9d98978f-74tz7"] Dec 05 13:06:51.803278 master-0 kubenswrapper[29936]: E1205 13:06:51.798956 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549e5366-4e6e-4d97-aeb2-25f74ce81b4b" containerName="neutron-db-sync" Dec 05 13:06:51.803278 master-0 kubenswrapper[29936]: I1205 13:06:51.798982 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="549e5366-4e6e-4d97-aeb2-25f74ce81b4b" containerName="neutron-db-sync" Dec 05 13:06:51.803278 master-0 kubenswrapper[29936]: I1205 13:06:51.799281 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="549e5366-4e6e-4d97-aeb2-25f74ce81b4b" containerName="neutron-db-sync" Dec 05 13:06:51.803278 master-0 kubenswrapper[29936]: I1205 13:06:51.801473 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.835489 master-0 kubenswrapper[29936]: I1205 13:06:51.830018 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-ovsdbserver-sb\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.835489 master-0 kubenswrapper[29936]: I1205 13:06:51.830107 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-dns-swift-storage-0\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.835489 master-0 kubenswrapper[29936]: I1205 13:06:51.830160 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-scheduler-0"] Dec 05 13:06:51.835489 master-0 kubenswrapper[29936]: I1205 13:06:51.830526 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-ovsdbserver-nb\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.835489 master-0 kubenswrapper[29936]: I1205 13:06:51.830592 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-dns-svc\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.835489 master-0 kubenswrapper[29936]: I1205 13:06:51.830725 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-config\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.835489 master-0 kubenswrapper[29936]: I1205 13:06:51.833278 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndmt4\" (UniqueName: \"kubernetes.io/projected/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-kube-api-access-ndmt4\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.848974 master-0 kubenswrapper[29936]: I1205 13:06:51.848867 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5f5d648448-8m7zn"] Dec 05 13:06:51.852053 master-0 kubenswrapper[29936]: I1205 13:06:51.851991 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:51.885426 master-0 kubenswrapper[29936]: I1205 13:06:51.856024 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 13:06:51.885426 master-0 kubenswrapper[29936]: I1205 13:06:51.856711 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 13:06:51.885426 master-0 kubenswrapper[29936]: I1205 13:06:51.857044 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 05 13:06:51.895543 master-0 kubenswrapper[29936]: I1205 13:06:51.895419 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f9d98978f-74tz7"] Dec 05 13:06:51.937478 master-0 kubenswrapper[29936]: I1205 13:06:51.937349 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-config\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.937478 master-0 kubenswrapper[29936]: I1205 13:06:51.937486 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-combined-ca-bundle\") pod \"neutron-5f5d648448-8m7zn\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:51.937845 master-0 kubenswrapper[29936]: I1205 13:06:51.937554 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndmt4\" (UniqueName: \"kubernetes.io/projected/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-kube-api-access-ndmt4\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.937845 master-0 kubenswrapper[29936]: I1205 13:06:51.937616 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-config\") pod \"neutron-5f5d648448-8m7zn\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:51.938968 master-0 kubenswrapper[29936]: I1205 13:06:51.938542 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-ovndb-tls-certs\") pod \"neutron-5f5d648448-8m7zn\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:51.938968 master-0 kubenswrapper[29936]: I1205 13:06:51.938657 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-ovsdbserver-sb\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.938968 master-0 kubenswrapper[29936]: I1205 13:06:51.938717 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-config\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.938968 master-0 kubenswrapper[29936]: I1205 13:06:51.938729 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-dns-swift-storage-0\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.940494 master-0 kubenswrapper[29936]: I1205 13:06:51.940338 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-dns-swift-storage-0\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.943460 master-0 kubenswrapper[29936]: I1205 13:06:51.943343 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztlch\" (UniqueName: \"kubernetes.io/projected/8ee01710-7ad7-47e9-8268-09a33572ab6a-kube-api-access-ztlch\") pod \"neutron-5f5d648448-8m7zn\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:51.945140 master-0 kubenswrapper[29936]: I1205 13:06:51.945074 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-ovsdbserver-nb\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.947046 master-0 kubenswrapper[29936]: I1205 13:06:51.946983 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-dns-svc\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.947240 master-0 kubenswrapper[29936]: I1205 13:06:51.947208 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-httpd-config\") pod \"neutron-5f5d648448-8m7zn\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:51.947769 master-0 kubenswrapper[29936]: I1205 13:06:51.947723 29936 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-ovsdbserver-nb\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.948932 master-0 kubenswrapper[29936]: I1205 13:06:51.948885 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-ovsdbserver-sb\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.950819 master-0 kubenswrapper[29936]: I1205 13:06:51.950755 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-dns-svc\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.967377 master-0 kubenswrapper[29936]: I1205 13:06:51.964248 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndmt4\" (UniqueName: \"kubernetes.io/projected/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-kube-api-access-ndmt4\") pod \"dnsmasq-dns-5f9d98978f-74tz7\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:51.971423 master-0 kubenswrapper[29936]: I1205 13:06:51.971357 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5f5d648448-8m7zn"] Dec 05 13:06:51.999207 master-0 kubenswrapper[29936]: I1205 13:06:51.999086 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-backup-0"] Dec 05 13:06:52.059933 master-0 kubenswrapper[29936]: I1205 13:06:52.050350 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztlch\" (UniqueName: \"kubernetes.io/projected/8ee01710-7ad7-47e9-8268-09a33572ab6a-kube-api-access-ztlch\") pod \"neutron-5f5d648448-8m7zn\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:52.059933 master-0 kubenswrapper[29936]: I1205 13:06:52.050554 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-httpd-config\") pod \"neutron-5f5d648448-8m7zn\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:52.059933 master-0 kubenswrapper[29936]: I1205 13:06:52.050624 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-combined-ca-bundle\") pod \"neutron-5f5d648448-8m7zn\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:52.059933 master-0 kubenswrapper[29936]: I1205 13:06:52.050671 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-config\") pod \"neutron-5f5d648448-8m7zn\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:52.059933 master-0 kubenswrapper[29936]: I1205 13:06:52.050738 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-ovndb-tls-certs\") pod \"neutron-5f5d648448-8m7zn\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:52.084962 master-0 kubenswrapper[29936]: I1205 13:06:52.084875 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-combined-ca-bundle\") pod \"neutron-5f5d648448-8m7zn\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:52.085341 master-0 kubenswrapper[29936]: I1205 13:06:52.085009 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-ovndb-tls-certs\") pod \"neutron-5f5d648448-8m7zn\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:52.095747 master-0 kubenswrapper[29936]: I1205 13:06:52.095005 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-httpd-config\") pod \"neutron-5f5d648448-8m7zn\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:52.095747 master-0 kubenswrapper[29936]: I1205 13:06:52.095238 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztlch\" (UniqueName: \"kubernetes.io/projected/8ee01710-7ad7-47e9-8268-09a33572ab6a-kube-api-access-ztlch\") pod \"neutron-5f5d648448-8m7zn\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:52.097746 master-0 kubenswrapper[29936]: I1205 13:06:52.095784 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-config\") pod \"neutron-5f5d648448-8m7zn\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:52.155711 master-0 kubenswrapper[29936]: I1205 13:06:52.154523 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:52.185036 master-0 kubenswrapper[29936]: I1205 13:06:52.184951 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:52.201405 master-0 kubenswrapper[29936]: I1205 13:06:52.198833 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6566b9889c-g47n7"] Dec 05 13:06:52.269012 master-0 kubenswrapper[29936]: I1205 13:06:52.268924 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-api-0"] Dec 05 13:06:52.286425 master-0 kubenswrapper[29936]: I1205 13:06:52.286326 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" event={"ID":"c2913bf6-55a0-42ff-b4ca-e8e39335d588","Type":"ContainerStarted","Data":"cb2a279acfaf030181eefd5e4aeffcb61214a2999f7482505bcbc582dcfacffc"} Dec 05 13:06:52.292649 master-0 kubenswrapper[29936]: I1205 13:06:52.292230 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-scheduler-0" event={"ID":"062a3b87-3828-49fb-8b3b-099ef02fe5a3","Type":"ContainerStarted","Data":"a37989d2f420716ea334bf7c018b26bffd325dfc42ef9329af3d2ccd5ca04567"} Dec 05 13:06:52.295536 master-0 kubenswrapper[29936]: I1205 13:06:52.295241 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-backup-0" event={"ID":"46decc62-ea7e-4ec1-ad45-7ff4812f77a5","Type":"ContainerStarted","Data":"141c9bba7eac5148ac0d908d62b93b2b23daf83214ba40941d20ad2ec6d276d5"} Dec 05 13:06:52.298759 master-0 kubenswrapper[29936]: I1205 13:06:52.298598 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6566b9889c-g47n7" event={"ID":"757653e2-456c-426e-b7d5-bc1ad57fda3f","Type":"ContainerStarted","Data":"51dbb6c6e23a80ba8b041d11a7cadf5de4d562af2cbb8a9b20456d383150990d"} Dec 05 13:06:52.299422 master-0 kubenswrapper[29936]: I1205 13:06:52.299284 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:52.299422 master-0 kubenswrapper[29936]: I1205 13:06:52.299386 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:52.831192 master-0 kubenswrapper[29936]: I1205 13:06:52.831061 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f9d98978f-74tz7"] Dec 05 13:06:52.902917 master-0 kubenswrapper[29936]: I1205 13:06:52.901990 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5f5d648448-8m7zn"] Dec 05 13:06:52.921122 master-0 kubenswrapper[29936]: W1205 13:06:52.921039 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ee01710_7ad7_47e9_8268_09a33572ab6a.slice/crio-fd0a6ba0af535b40632b7f9ecd4f3c5083e72c0c4d90447a211980846ab4ad55 WatchSource:0}: Error finding container fd0a6ba0af535b40632b7f9ecd4f3c5083e72c0c4d90447a211980846ab4ad55: Status 404 returned error can't find the container with id fd0a6ba0af535b40632b7f9ecd4f3c5083e72c0c4d90447a211980846ab4ad55 Dec 05 13:06:53.314968 master-0 kubenswrapper[29936]: I1205 13:06:53.314895 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-api-0" event={"ID":"f9875035-8030-4010-9fdf-e44664936896","Type":"ContainerStarted","Data":"5fd22cc25300662ac79cec0b48a4c454d23fa48b62f7e79f87e85fa7b9500f4b"} Dec 05 13:06:53.317341 master-0 kubenswrapper[29936]: I1205 13:06:53.317276 29936 generic.go:334] "Generic (PLEG): container finished" podID="757653e2-456c-426e-b7d5-bc1ad57fda3f" 
containerID="41169359ec3c597beeee9a90dcb50a3caab0d83e783731a75c2549bc3df5ee9d" exitCode=0 Dec 05 13:06:53.317442 master-0 kubenswrapper[29936]: I1205 13:06:53.317377 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6566b9889c-g47n7" event={"ID":"757653e2-456c-426e-b7d5-bc1ad57fda3f","Type":"ContainerDied","Data":"41169359ec3c597beeee9a90dcb50a3caab0d83e783731a75c2549bc3df5ee9d"} Dec 05 13:06:53.321468 master-0 kubenswrapper[29936]: I1205 13:06:53.320846 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" event={"ID":"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a","Type":"ContainerStarted","Data":"7e21017d8ec106b9b45de27fb2cc4416d064d4c60a9f6da2964a998629894b86"} Dec 05 13:06:53.323306 master-0 kubenswrapper[29936]: I1205 13:06:53.323238 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f5d648448-8m7zn" event={"ID":"8ee01710-7ad7-47e9-8268-09a33572ab6a","Type":"ContainerStarted","Data":"fd0a6ba0af535b40632b7f9ecd4f3c5083e72c0c4d90447a211980846ab4ad55"} Dec 05 13:06:54.384595 master-0 kubenswrapper[29936]: I1205 13:06:54.382240 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6566b9889c-g47n7" event={"ID":"757653e2-456c-426e-b7d5-bc1ad57fda3f","Type":"ContainerDied","Data":"51dbb6c6e23a80ba8b041d11a7cadf5de4d562af2cbb8a9b20456d383150990d"} Dec 05 13:06:54.391866 master-0 kubenswrapper[29936]: I1205 13:06:54.391085 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51dbb6c6e23a80ba8b041d11a7cadf5de4d562af2cbb8a9b20456d383150990d" Dec 05 13:06:54.404864 master-0 kubenswrapper[29936]: I1205 13:06:54.404775 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-api-0" event={"ID":"f9875035-8030-4010-9fdf-e44664936896","Type":"ContainerStarted","Data":"f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9"} Dec 05 13:06:54.524711 master-0 kubenswrapper[29936]: I1205 13:06:54.524244 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:54.628214 master-0 kubenswrapper[29936]: I1205 13:06:54.619897 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-ovsdbserver-sb\") pod \"757653e2-456c-426e-b7d5-bc1ad57fda3f\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " Dec 05 13:06:54.628214 master-0 kubenswrapper[29936]: I1205 13:06:54.620119 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-config\") pod \"757653e2-456c-426e-b7d5-bc1ad57fda3f\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " Dec 05 13:06:54.628214 master-0 kubenswrapper[29936]: I1205 13:06:54.620137 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-dns-svc\") pod \"757653e2-456c-426e-b7d5-bc1ad57fda3f\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " Dec 05 13:06:54.628214 master-0 kubenswrapper[29936]: I1205 13:06:54.620194 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-ovsdbserver-nb\") pod \"757653e2-456c-426e-b7d5-bc1ad57fda3f\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " Dec 05 13:06:54.628214 master-0 kubenswrapper[29936]: I1205 13:06:54.620326 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5hgf\" (UniqueName: \"kubernetes.io/projected/757653e2-456c-426e-b7d5-bc1ad57fda3f-kube-api-access-n5hgf\") pod \"757653e2-456c-426e-b7d5-bc1ad57fda3f\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " Dec 05 13:06:54.628214 master-0 kubenswrapper[29936]: I1205 13:06:54.620481 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-dns-swift-storage-0\") pod \"757653e2-456c-426e-b7d5-bc1ad57fda3f\" (UID: \"757653e2-456c-426e-b7d5-bc1ad57fda3f\") " Dec 05 13:06:54.662209 master-0 kubenswrapper[29936]: I1205 13:06:54.657504 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/757653e2-456c-426e-b7d5-bc1ad57fda3f-kube-api-access-n5hgf" (OuterVolumeSpecName: "kube-api-access-n5hgf") pod "757653e2-456c-426e-b7d5-bc1ad57fda3f" (UID: "757653e2-456c-426e-b7d5-bc1ad57fda3f"). InnerVolumeSpecName "kube-api-access-n5hgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:54.675209 master-0 kubenswrapper[29936]: I1205 13:06:54.670501 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5hgf\" (UniqueName: \"kubernetes.io/projected/757653e2-456c-426e-b7d5-bc1ad57fda3f-kube-api-access-n5hgf\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:54.709219 master-0 kubenswrapper[29936]: I1205 13:06:54.708776 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "757653e2-456c-426e-b7d5-bc1ad57fda3f" (UID: "757653e2-456c-426e-b7d5-bc1ad57fda3f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:54.724382 master-0 kubenswrapper[29936]: I1205 13:06:54.724312 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "757653e2-456c-426e-b7d5-bc1ad57fda3f" (UID: "757653e2-456c-426e-b7d5-bc1ad57fda3f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:54.742286 master-0 kubenswrapper[29936]: I1205 13:06:54.740799 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-config" (OuterVolumeSpecName: "config") pod "757653e2-456c-426e-b7d5-bc1ad57fda3f" (UID: "757653e2-456c-426e-b7d5-bc1ad57fda3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:54.778991 master-0 kubenswrapper[29936]: I1205 13:06:54.778910 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:54.778991 master-0 kubenswrapper[29936]: I1205 13:06:54.778971 29936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:54.778991 master-0 kubenswrapper[29936]: I1205 13:06:54.778984 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:54.781203 master-0 kubenswrapper[29936]: I1205 13:06:54.780158 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:54.781203 master-0 kubenswrapper[29936]: I1205 13:06:54.780328 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 13:06:54.861041 master-0 kubenswrapper[29936]: I1205 13:06:54.860956 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "757653e2-456c-426e-b7d5-bc1ad57fda3f" (UID: "757653e2-456c-426e-b7d5-bc1ad57fda3f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:54.861992 master-0 kubenswrapper[29936]: I1205 13:06:54.861907 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "757653e2-456c-426e-b7d5-bc1ad57fda3f" (UID: "757653e2-456c-426e-b7d5-bc1ad57fda3f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:06:54.883500 master-0 kubenswrapper[29936]: I1205 13:06:54.883427 29936 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:54.883500 master-0 kubenswrapper[29936]: I1205 13:06:54.883500 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/757653e2-456c-426e-b7d5-bc1ad57fda3f-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:54.982279 master-0 kubenswrapper[29936]: I1205 13:06:54.982125 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:06:55.054586 master-0 kubenswrapper[29936]: I1205 13:06:55.043601 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:55.054586 master-0 kubenswrapper[29936]: I1205 13:06:55.044595 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:55.226529 master-0 kubenswrapper[29936]: I1205 13:06:55.226482 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:55.296621 master-0 kubenswrapper[29936]: I1205 13:06:55.296567 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:55.435806 master-0 kubenswrapper[29936]: I1205 13:06:55.433339 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f5d648448-8m7zn" event={"ID":"8ee01710-7ad7-47e9-8268-09a33572ab6a","Type":"ContainerStarted","Data":"9a3b05f6ff88b9c96448238538c8a34f6caf50d50918bd415887609dfb3fc194"} Dec 05 13:06:55.458211 master-0 kubenswrapper[29936]: I1205 13:06:55.455056 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" event={"ID":"c2913bf6-55a0-42ff-b4ca-e8e39335d588","Type":"ContainerStarted","Data":"c585694c0ac0650448f8079c818538fb475b6349292cd835a880cf8f876333a8"} Dec 05 13:06:55.475830 master-0 kubenswrapper[29936]: I1205 13:06:55.468217 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-backup-0" event={"ID":"46decc62-ea7e-4ec1-ad45-7ff4812f77a5","Type":"ContainerStarted","Data":"a34264df621ab7c0263683464a9f81f32b4cbf3c98c6c3b16fccf927093379c6"} Dec 05 13:06:55.489357 master-0 kubenswrapper[29936]: I1205 13:06:55.488938 29936 generic.go:334] "Generic (PLEG): container finished" podID="efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a" containerID="8d6745c198e1faacc64cd7c72cd055b69995d71ba4d083e85a6a78e7ba3d9345" exitCode=0 Dec 05 13:06:55.489357 master-0 kubenswrapper[29936]: I1205 13:06:55.489015 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" event={"ID":"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a","Type":"ContainerDied","Data":"8d6745c198e1faacc64cd7c72cd055b69995d71ba4d083e85a6a78e7ba3d9345"} Dec 05 13:06:55.493139 master-0 kubenswrapper[29936]: I1205 13:06:55.489585 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6566b9889c-g47n7" Dec 05 13:06:55.493139 master-0 kubenswrapper[29936]: I1205 13:06:55.490213 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:55.493139 master-0 kubenswrapper[29936]: I1205 13:06:55.490245 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:55.845764 master-0 kubenswrapper[29936]: I1205 13:06:55.845695 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6566b9889c-g47n7"] Dec 05 13:06:55.902436 master-0 kubenswrapper[29936]: I1205 13:06:55.902228 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6566b9889c-g47n7"] Dec 05 13:06:55.925632 master-0 kubenswrapper[29936]: I1205 13:06:55.925515 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b46d8-api-0"] Dec 05 13:06:56.524213 master-0 kubenswrapper[29936]: I1205 13:06:56.523283 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-backup-0" event={"ID":"46decc62-ea7e-4ec1-ad45-7ff4812f77a5","Type":"ContainerStarted","Data":"dff7bda528eab88b606bbabb6adcb24512060bf1ae567119c564f3cea491ccf0"} Dec 05 13:06:56.534239 master-0 kubenswrapper[29936]: I1205 13:06:56.532006 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" event={"ID":"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a","Type":"ContainerStarted","Data":"ce098b005a8238a7192b09f7d5329140ebf27902854cc732266b80cc51a552c1"} Dec 05 13:06:56.534239 master-0 kubenswrapper[29936]: I1205 13:06:56.533362 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:06:56.549228 master-0 kubenswrapper[29936]: I1205 13:06:56.544424 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f5d648448-8m7zn" event={"ID":"8ee01710-7ad7-47e9-8268-09a33572ab6a","Type":"ContainerStarted","Data":"629a2e1c62ee9742cb533842e08303c5cee1a87130f73b0f0f822f74d7518e8d"} Dec 05 13:06:56.549228 master-0 kubenswrapper[29936]: I1205 13:06:56.544628 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:06:56.567219 master-0 kubenswrapper[29936]: I1205 13:06:56.551057 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" event={"ID":"c2913bf6-55a0-42ff-b4ca-e8e39335d588","Type":"ContainerStarted","Data":"1047b1b600f624e4e0b35df0d24baa1d48d897dc07d5f766de4f8f648de136e8"} Dec 05 13:06:56.567219 master-0 kubenswrapper[29936]: I1205 13:06:56.561599 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-scheduler-0" event={"ID":"062a3b87-3828-49fb-8b3b-099ef02fe5a3","Type":"ContainerStarted","Data":"70707515162ff0f31d64d73855efdd346ed6d3a1dd42e12a3b6d12f42fa16bf3"} Dec 05 13:06:56.567219 master-0 kubenswrapper[29936]: I1205 13:06:56.562332 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-b46d8-backup-0" podStartSLOduration=4.023402024 podStartE2EDuration="6.562304327s" podCreationTimestamp="2025-12-05 13:06:50 +0000 UTC" firstStartedPulling="2025-12-05 13:06:51.955785649 +0000 UTC m=+1009.087865330" lastFinishedPulling="2025-12-05 13:06:54.494687952 +0000 UTC m=+1011.626767633" observedRunningTime="2025-12-05 13:06:56.561707472 +0000 UTC m=+1013.693787173" 
watchObservedRunningTime="2025-12-05 13:06:56.562304327 +0000 UTC m=+1013.694384008" Dec 05 13:06:56.588215 master-0 kubenswrapper[29936]: I1205 13:06:56.583368 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-b46d8-api-0" podUID="f9875035-8030-4010-9fdf-e44664936896" containerName="cinder-b46d8-api-log" containerID="cri-o://f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9" gracePeriod=30 Dec 05 13:06:56.588215 master-0 kubenswrapper[29936]: I1205 13:06:56.583735 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-api-0" event={"ID":"f9875035-8030-4010-9fdf-e44664936896","Type":"ContainerStarted","Data":"906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a"} Dec 05 13:06:56.588215 master-0 kubenswrapper[29936]: I1205 13:06:56.583783 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:56.588215 master-0 kubenswrapper[29936]: I1205 13:06:56.583815 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-b46d8-api-0" podUID="f9875035-8030-4010-9fdf-e44664936896" containerName="cinder-api" containerID="cri-o://906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a" gracePeriod=30 Dec 05 13:06:56.620224 master-0 kubenswrapper[29936]: I1205 13:06:56.600964 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5f5d648448-8m7zn" podStartSLOduration=5.6009400750000005 podStartE2EDuration="5.600940075s" podCreationTimestamp="2025-12-05 13:06:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:06:56.596776918 +0000 UTC m=+1013.728856599" watchObservedRunningTime="2025-12-05 13:06:56.600940075 +0000 UTC m=+1013.733019756" Dec 05 13:06:56.656221 master-0 kubenswrapper[29936]: I1205 13:06:56.653112 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" podStartSLOduration=5.31762025 podStartE2EDuration="6.653065751s" podCreationTimestamp="2025-12-05 13:06:50 +0000 UTC" firstStartedPulling="2025-12-05 13:06:51.585841738 +0000 UTC m=+1008.717921419" lastFinishedPulling="2025-12-05 13:06:52.921287239 +0000 UTC m=+1010.053366920" observedRunningTime="2025-12-05 13:06:56.650686909 +0000 UTC m=+1013.782766590" watchObservedRunningTime="2025-12-05 13:06:56.653065751 +0000 UTC m=+1013.785145432" Dec 05 13:06:56.740240 master-0 kubenswrapper[29936]: I1205 13:06:56.735709 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" podStartSLOduration=5.735684434 podStartE2EDuration="5.735684434s" podCreationTimestamp="2025-12-05 13:06:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:06:56.692273933 +0000 UTC m=+1013.824353624" watchObservedRunningTime="2025-12-05 13:06:56.735684434 +0000 UTC m=+1013.867764115" Dec 05 13:06:56.749232 master-0 kubenswrapper[29936]: I1205 13:06:56.745148 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-b46d8-api-0" podStartSLOduration=6.745123808 podStartE2EDuration="6.745123808s" podCreationTimestamp="2025-12-05 13:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-05 13:06:56.736282509 +0000 UTC m=+1013.868362220" watchObservedRunningTime="2025-12-05 13:06:56.745123808 +0000 UTC m=+1013.877203489" Dec 05 13:06:57.319198 master-0 kubenswrapper[29936]: I1205 13:06:57.318850 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="757653e2-456c-426e-b7d5-bc1ad57fda3f" path="/var/lib/kubelet/pods/757653e2-456c-426e-b7d5-bc1ad57fda3f/volumes" Dec 05 13:06:57.586212 master-0 kubenswrapper[29936]: I1205 13:06:57.585509 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:57.671323 master-0 kubenswrapper[29936]: I1205 13:06:57.666782 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-scheduler-0" event={"ID":"062a3b87-3828-49fb-8b3b-099ef02fe5a3","Type":"ContainerStarted","Data":"72ae10d5d443f237be1f1fb9b2d450e0ae86534923bac9663291bac8056c11af"} Dec 05 13:06:57.684074 master-0 kubenswrapper[29936]: I1205 13:06:57.683771 29936 generic.go:334] "Generic (PLEG): container finished" podID="f9875035-8030-4010-9fdf-e44664936896" containerID="906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a" exitCode=0 Dec 05 13:06:57.684074 master-0 kubenswrapper[29936]: I1205 13:06:57.683839 29936 generic.go:334] "Generic (PLEG): container finished" podID="f9875035-8030-4010-9fdf-e44664936896" containerID="f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9" exitCode=143 Dec 05 13:06:57.688207 master-0 kubenswrapper[29936]: I1205 13:06:57.684752 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-api-0" event={"ID":"f9875035-8030-4010-9fdf-e44664936896","Type":"ContainerDied","Data":"906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a"} Dec 05 13:06:57.688207 master-0 kubenswrapper[29936]: I1205 13:06:57.684891 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-api-0" event={"ID":"f9875035-8030-4010-9fdf-e44664936896","Type":"ContainerDied","Data":"f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9"} Dec 05 13:06:57.688207 master-0 kubenswrapper[29936]: I1205 13:06:57.684909 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-api-0" event={"ID":"f9875035-8030-4010-9fdf-e44664936896","Type":"ContainerDied","Data":"5fd22cc25300662ac79cec0b48a4c454d23fa48b62f7e79f87e85fa7b9500f4b"} Dec 05 13:06:57.688207 master-0 kubenswrapper[29936]: I1205 13:06:57.684989 29936 scope.go:117] "RemoveContainer" containerID="906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a" Dec 05 13:06:57.688207 master-0 kubenswrapper[29936]: I1205 13:06:57.685428 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:57.728354 master-0 kubenswrapper[29936]: I1205 13:06:57.725509 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-b46d8-scheduler-0" podStartSLOduration=6.583530995 podStartE2EDuration="7.72547566s" podCreationTimestamp="2025-12-05 13:06:50 +0000 UTC" firstStartedPulling="2025-12-05 13:06:51.782650989 +0000 UTC m=+1008.914730670" lastFinishedPulling="2025-12-05 13:06:52.924595654 +0000 UTC m=+1010.056675335" observedRunningTime="2025-12-05 13:06:57.695441994 +0000 UTC m=+1014.827521675" watchObservedRunningTime="2025-12-05 13:06:57.72547566 +0000 UTC m=+1014.857555341" Dec 05 13:06:57.765065 master-0 kubenswrapper[29936]: I1205 13:06:57.763230 29936 scope.go:117] "RemoveContainer" containerID="f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9" Dec 05 13:06:57.769219 master-0 kubenswrapper[29936]: I1205 13:06:57.765523 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-config-data\") pod \"f9875035-8030-4010-9fdf-e44664936896\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " Dec 05 13:06:57.769219 master-0 kubenswrapper[29936]: I1205 13:06:57.765762 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-combined-ca-bundle\") pod \"f9875035-8030-4010-9fdf-e44664936896\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " Dec 05 13:06:57.769219 master-0 kubenswrapper[29936]: I1205 13:06:57.765935 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9875035-8030-4010-9fdf-e44664936896-etc-machine-id\") pod \"f9875035-8030-4010-9fdf-e44664936896\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " Dec 05 13:06:57.769219 master-0 kubenswrapper[29936]: I1205 13:06:57.766040 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9875035-8030-4010-9fdf-e44664936896-logs\") pod \"f9875035-8030-4010-9fdf-e44664936896\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " Dec 05 13:06:57.769219 master-0 kubenswrapper[29936]: I1205 13:06:57.766095 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-scripts\") pod \"f9875035-8030-4010-9fdf-e44664936896\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " Dec 05 13:06:57.769219 master-0 kubenswrapper[29936]: I1205 13:06:57.766145 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mtdn\" (UniqueName: \"kubernetes.io/projected/f9875035-8030-4010-9fdf-e44664936896-kube-api-access-5mtdn\") pod \"f9875035-8030-4010-9fdf-e44664936896\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " Dec 05 13:06:57.769219 master-0 kubenswrapper[29936]: I1205 13:06:57.766205 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-config-data-custom\") pod \"f9875035-8030-4010-9fdf-e44664936896\" (UID: \"f9875035-8030-4010-9fdf-e44664936896\") " Dec 05 13:06:57.769219 master-0 kubenswrapper[29936]: I1205 13:06:57.768437 29936 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9875035-8030-4010-9fdf-e44664936896-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f9875035-8030-4010-9fdf-e44664936896" (UID: "f9875035-8030-4010-9fdf-e44664936896"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:06:57.769681 master-0 kubenswrapper[29936]: I1205 13:06:57.769562 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9875035-8030-4010-9fdf-e44664936896-logs" (OuterVolumeSpecName: "logs") pod "f9875035-8030-4010-9fdf-e44664936896" (UID: "f9875035-8030-4010-9fdf-e44664936896"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:06:57.776219 master-0 kubenswrapper[29936]: I1205 13:06:57.774858 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9875035-8030-4010-9fdf-e44664936896-kube-api-access-5mtdn" (OuterVolumeSpecName: "kube-api-access-5mtdn") pod "f9875035-8030-4010-9fdf-e44664936896" (UID: "f9875035-8030-4010-9fdf-e44664936896"). InnerVolumeSpecName "kube-api-access-5mtdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:06:57.783223 master-0 kubenswrapper[29936]: I1205 13:06:57.777434 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f9875035-8030-4010-9fdf-e44664936896" (UID: "f9875035-8030-4010-9fdf-e44664936896"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:57.795223 master-0 kubenswrapper[29936]: I1205 13:06:57.790744 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-scripts" (OuterVolumeSpecName: "scripts") pod "f9875035-8030-4010-9fdf-e44664936896" (UID: "f9875035-8030-4010-9fdf-e44664936896"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:57.810221 master-0 kubenswrapper[29936]: I1205 13:06:57.804644 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9875035-8030-4010-9fdf-e44664936896" (UID: "f9875035-8030-4010-9fdf-e44664936896"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:57.871301 master-0 kubenswrapper[29936]: I1205 13:06:57.870820 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-config-data" (OuterVolumeSpecName: "config-data") pod "f9875035-8030-4010-9fdf-e44664936896" (UID: "f9875035-8030-4010-9fdf-e44664936896"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:06:57.874320 master-0 kubenswrapper[29936]: I1205 13:06:57.871694 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:57.874320 master-0 kubenswrapper[29936]: I1205 13:06:57.871720 29936 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9875035-8030-4010-9fdf-e44664936896-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:57.874320 master-0 kubenswrapper[29936]: I1205 13:06:57.871733 29936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9875035-8030-4010-9fdf-e44664936896-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:57.874320 master-0 kubenswrapper[29936]: I1205 13:06:57.871743 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:57.874320 master-0 kubenswrapper[29936]: I1205 13:06:57.871753 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mtdn\" (UniqueName: \"kubernetes.io/projected/f9875035-8030-4010-9fdf-e44664936896-kube-api-access-5mtdn\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:57.874320 master-0 kubenswrapper[29936]: I1205 13:06:57.871764 29936 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:57.874320 master-0 kubenswrapper[29936]: I1205 13:06:57.871774 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9875035-8030-4010-9fdf-e44664936896-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:06:58.000914 master-0 kubenswrapper[29936]: I1205 13:06:57.997885 29936 scope.go:117] "RemoveContainer" containerID="906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a" Dec 05 13:06:58.001936 master-0 kubenswrapper[29936]: E1205 13:06:58.001869 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a\": container with ID starting with 906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a not found: ID does not exist" containerID="906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a" Dec 05 13:06:58.002033 master-0 kubenswrapper[29936]: I1205 13:06:58.001952 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a"} err="failed to get container status \"906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a\": rpc error: code = NotFound desc = could not find container \"906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a\": container with ID starting with 906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a not found: ID does not exist" Dec 05 13:06:58.002033 master-0 kubenswrapper[29936]: I1205 13:06:58.001992 29936 scope.go:117] "RemoveContainer" containerID="f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9" Dec 05 13:06:58.005916 master-0 
kubenswrapper[29936]: E1205 13:06:58.005622 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9\": container with ID starting with f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9 not found: ID does not exist" containerID="f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9" Dec 05 13:06:58.005916 master-0 kubenswrapper[29936]: I1205 13:06:58.005685 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9"} err="failed to get container status \"f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9\": rpc error: code = NotFound desc = could not find container \"f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9\": container with ID starting with f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9 not found: ID does not exist" Dec 05 13:06:58.005916 master-0 kubenswrapper[29936]: I1205 13:06:58.005721 29936 scope.go:117] "RemoveContainer" containerID="906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a" Dec 05 13:06:58.007116 master-0 kubenswrapper[29936]: I1205 13:06:58.007003 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a"} err="failed to get container status \"906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a\": rpc error: code = NotFound desc = could not find container \"906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a\": container with ID starting with 906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a not found: ID does not exist" Dec 05 13:06:58.007116 master-0 kubenswrapper[29936]: I1205 13:06:58.007097 29936 scope.go:117] "RemoveContainer" containerID="f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9" Dec 05 13:06:58.007912 master-0 kubenswrapper[29936]: I1205 13:06:58.007815 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9"} err="failed to get container status \"f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9\": rpc error: code = NotFound desc = could not find container \"f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9\": container with ID starting with f511d302dbbdf152f18efd62bd8b0e567dd1e689b4950ae68c1a26f93a7ad2d9 not found: ID does not exist" Dec 05 13:06:58.062452 master-0 kubenswrapper[29936]: I1205 13:06:58.062344 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b46d8-api-0"] Dec 05 13:06:58.077130 master-0 kubenswrapper[29936]: I1205 13:06:58.077044 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b46d8-api-0"] Dec 05 13:06:58.107868 master-0 kubenswrapper[29936]: I1205 13:06:58.107394 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b46d8-api-0"] Dec 05 13:06:58.108363 master-0 kubenswrapper[29936]: E1205 13:06:58.108314 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9875035-8030-4010-9fdf-e44664936896" containerName="cinder-b46d8-api-log" Dec 05 13:06:58.108363 master-0 kubenswrapper[29936]: I1205 13:06:58.108364 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9875035-8030-4010-9fdf-e44664936896" 
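
Editor's note: the RemoveContainer sequence above shows the runtime answering NotFound for container IDs that were already gone, and the kubelet logging the error without treating cleanup as failed. The snippet below only illustrates that idempotent-delete pattern against a gRPC-style CRI error; it is not the kubelet's actual code path, and removeContainer is a stand-in that always returns NotFound the way the runtime did here.

```go
// notfound_delete.go: illustration of treating gRPC NotFound as "already removed".
package main

import (
	"fmt"
	"os"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer stands in for a CRI RemoveContainer call; it always reports
// NotFound, mirroring the runtime's answers for the container IDs logged above.
func removeContainer(id string) error {
	return status.Errorf(codes.NotFound, "could not find container %q", id)
}

func main() {
	id := "906dc206be86011e98a00b82b2119b11a9babd76ccc82baa03e0e6660cc2242a"
	if err := removeContainer(id); err != nil {
		if status.Code(err) == codes.NotFound {
			// Already gone: treat the removal as done, much as the kubelet above
			// logs the NotFound responses and moves on.
			fmt.Println("container already removed:", id)
			return
		}
		fmt.Fprintln(os.Stderr, "remove failed:", err)
		os.Exit(1)
	}
}
```
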
containerName="cinder-b46d8-api-log" Dec 05 13:06:58.108473 master-0 kubenswrapper[29936]: E1205 13:06:58.108388 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9875035-8030-4010-9fdf-e44664936896" containerName="cinder-api" Dec 05 13:06:58.108473 master-0 kubenswrapper[29936]: I1205 13:06:58.108395 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9875035-8030-4010-9fdf-e44664936896" containerName="cinder-api" Dec 05 13:06:58.108542 master-0 kubenswrapper[29936]: E1205 13:06:58.108511 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="757653e2-456c-426e-b7d5-bc1ad57fda3f" containerName="init" Dec 05 13:06:58.108542 master-0 kubenswrapper[29936]: I1205 13:06:58.108521 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="757653e2-456c-426e-b7d5-bc1ad57fda3f" containerName="init" Dec 05 13:06:58.109960 master-0 kubenswrapper[29936]: I1205 13:06:58.108924 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9875035-8030-4010-9fdf-e44664936896" containerName="cinder-b46d8-api-log" Dec 05 13:06:58.109960 master-0 kubenswrapper[29936]: I1205 13:06:58.108960 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="757653e2-456c-426e-b7d5-bc1ad57fda3f" containerName="init" Dec 05 13:06:58.114241 master-0 kubenswrapper[29936]: I1205 13:06:58.113418 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9875035-8030-4010-9fdf-e44664936896" containerName="cinder-api" Dec 05 13:06:58.115355 master-0 kubenswrapper[29936]: I1205 13:06:58.115316 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.124332 master-0 kubenswrapper[29936]: I1205 13:06:58.124258 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 05 13:06:58.124645 master-0 kubenswrapper[29936]: I1205 13:06:58.124482 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b46d8-api-config-data" Dec 05 13:06:58.124645 master-0 kubenswrapper[29936]: I1205 13:06:58.124595 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 05 13:06:58.178007 master-0 kubenswrapper[29936]: I1205 13:06:58.166718 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-api-0"] Dec 05 13:06:58.225587 master-0 kubenswrapper[29936]: I1205 13:06:58.219828 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-public-tls-certs\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.225587 master-0 kubenswrapper[29936]: I1205 13:06:58.219934 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-internal-tls-certs\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.225587 master-0 kubenswrapper[29936]: I1205 13:06:58.219967 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-scripts\") pod \"cinder-b46d8-api-0\" (UID: 
\"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.225587 master-0 kubenswrapper[29936]: I1205 13:06:58.220005 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-combined-ca-bundle\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.225587 master-0 kubenswrapper[29936]: I1205 13:06:58.220056 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd4wh\" (UniqueName: \"kubernetes.io/projected/cb932c61-e0ab-441e-a231-bff0899fc045-kube-api-access-nd4wh\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.225587 master-0 kubenswrapper[29936]: I1205 13:06:58.220142 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb932c61-e0ab-441e-a231-bff0899fc045-etc-machine-id\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.225587 master-0 kubenswrapper[29936]: I1205 13:06:58.220166 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-config-data\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.225587 master-0 kubenswrapper[29936]: I1205 13:06:58.220238 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb932c61-e0ab-441e-a231-bff0899fc045-logs\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.225587 master-0 kubenswrapper[29936]: I1205 13:06:58.220263 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-config-data-custom\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.324959 master-0 kubenswrapper[29936]: I1205 13:06:58.324865 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-combined-ca-bundle\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.325816 master-0 kubenswrapper[29936]: I1205 13:06:58.325785 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd4wh\" (UniqueName: \"kubernetes.io/projected/cb932c61-e0ab-441e-a231-bff0899fc045-kube-api-access-nd4wh\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.327328 master-0 kubenswrapper[29936]: I1205 13:06:58.327259 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb932c61-e0ab-441e-a231-bff0899fc045-etc-machine-id\") pod 
\"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.328864 master-0 kubenswrapper[29936]: I1205 13:06:58.328840 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-config-data\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.329251 master-0 kubenswrapper[29936]: I1205 13:06:58.329210 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb932c61-e0ab-441e-a231-bff0899fc045-logs\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.329415 master-0 kubenswrapper[29936]: I1205 13:06:58.329396 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-config-data-custom\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.329676 master-0 kubenswrapper[29936]: I1205 13:06:58.327375 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb932c61-e0ab-441e-a231-bff0899fc045-etc-machine-id\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.330457 master-0 kubenswrapper[29936]: I1205 13:06:58.330425 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb932c61-e0ab-441e-a231-bff0899fc045-logs\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.334173 master-0 kubenswrapper[29936]: I1205 13:06:58.334126 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-public-tls-certs\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.334391 master-0 kubenswrapper[29936]: I1205 13:06:58.334356 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-internal-tls-certs\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.334516 master-0 kubenswrapper[29936]: I1205 13:06:58.334438 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-scripts\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.336072 master-0 kubenswrapper[29936]: I1205 13:06:58.336023 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-config-data-custom\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.341244 master-0 kubenswrapper[29936]: I1205 
13:06:58.341197 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-public-tls-certs\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.341380 master-0 kubenswrapper[29936]: I1205 13:06:58.341329 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-combined-ca-bundle\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.346726 master-0 kubenswrapper[29936]: E1205 13:06:58.346663 29936 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9875035_8030_4010_9fdf_e44664936896.slice\": RecentStats: unable to find data in memory cache]" Dec 05 13:06:58.348989 master-0 kubenswrapper[29936]: I1205 13:06:58.348854 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-scripts\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.350554 master-0 kubenswrapper[29936]: I1205 13:06:58.350487 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-config-data\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.352909 master-0 kubenswrapper[29936]: I1205 13:06:58.352843 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb932c61-e0ab-441e-a231-bff0899fc045-internal-tls-certs\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.354054 master-0 kubenswrapper[29936]: I1205 13:06:58.353995 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd4wh\" (UniqueName: \"kubernetes.io/projected/cb932c61-e0ab-441e-a231-bff0899fc045-kube-api-access-nd4wh\") pod \"cinder-b46d8-api-0\" (UID: \"cb932c61-e0ab-441e-a231-bff0899fc045\") " pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.440332 master-0 kubenswrapper[29936]: I1205 13:06:58.437156 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:58.440332 master-0 kubenswrapper[29936]: I1205 13:06:58.437400 29936 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 13:06:58.532600 master-0 kubenswrapper[29936]: I1205 13:06:58.532495 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-api-0" Dec 05 13:06:58.943535 master-0 kubenswrapper[29936]: I1205 13:06:58.939491 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6c565df557-xrffm"] Dec 05 13:06:58.943535 master-0 kubenswrapper[29936]: I1205 13:06:58.942424 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:58.962949 master-0 kubenswrapper[29936]: I1205 13:06:58.952832 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 05 13:06:58.962949 master-0 kubenswrapper[29936]: I1205 13:06:58.953165 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 05 13:06:58.962949 master-0 kubenswrapper[29936]: I1205 13:06:58.957611 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c565df557-xrffm"] Dec 05 13:06:59.135218 master-0 kubenswrapper[29936]: I1205 13:06:59.135002 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-public-tls-certs\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.135218 master-0 kubenswrapper[29936]: I1205 13:06:59.135173 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlg82\" (UniqueName: \"kubernetes.io/projected/de8efd16-e046-4cd1-aa8c-ba49c605aa89-kube-api-access-nlg82\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.135218 master-0 kubenswrapper[29936]: I1205 13:06:59.135220 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-httpd-config\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.135642 master-0 kubenswrapper[29936]: I1205 13:06:59.135274 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-config\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.135642 master-0 kubenswrapper[29936]: I1205 13:06:59.135357 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-internal-tls-certs\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.135642 master-0 kubenswrapper[29936]: I1205 13:06:59.135431 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-combined-ca-bundle\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.135642 master-0 kubenswrapper[29936]: I1205 13:06:59.135449 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-ovndb-tls-certs\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.169935 master-0 
kubenswrapper[29936]: I1205 13:06:59.158673 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-api-0"] Dec 05 13:06:59.224492 master-0 kubenswrapper[29936]: I1205 13:06:59.224398 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9875035-8030-4010-9fdf-e44664936896" path="/var/lib/kubelet/pods/f9875035-8030-4010-9fdf-e44664936896/volumes" Dec 05 13:06:59.238384 master-0 kubenswrapper[29936]: I1205 13:06:59.237868 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlg82\" (UniqueName: \"kubernetes.io/projected/de8efd16-e046-4cd1-aa8c-ba49c605aa89-kube-api-access-nlg82\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.238384 master-0 kubenswrapper[29936]: I1205 13:06:59.237953 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-httpd-config\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.238384 master-0 kubenswrapper[29936]: I1205 13:06:59.238000 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-config\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.238384 master-0 kubenswrapper[29936]: I1205 13:06:59.238056 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-internal-tls-certs\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.238384 master-0 kubenswrapper[29936]: I1205 13:06:59.238109 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-combined-ca-bundle\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.238384 master-0 kubenswrapper[29936]: I1205 13:06:59.238127 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-ovndb-tls-certs\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.238384 master-0 kubenswrapper[29936]: I1205 13:06:59.238240 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-public-tls-certs\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.241025 master-0 kubenswrapper[29936]: I1205 13:06:59.240572 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:06:59.249656 master-0 kubenswrapper[29936]: I1205 13:06:59.247061 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-internal-tls-certs\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.249656 master-0 kubenswrapper[29936]: I1205 13:06:59.249414 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-public-tls-certs\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.252338 master-0 kubenswrapper[29936]: I1205 13:06:59.251263 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-ovndb-tls-certs\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.252338 master-0 kubenswrapper[29936]: I1205 13:06:59.251714 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-combined-ca-bundle\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.255571 master-0 kubenswrapper[29936]: I1205 13:06:59.255435 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlg82\" (UniqueName: \"kubernetes.io/projected/de8efd16-e046-4cd1-aa8c-ba49c605aa89-kube-api-access-nlg82\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.258129 master-0 kubenswrapper[29936]: I1205 13:06:59.256716 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-config\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.263206 master-0 kubenswrapper[29936]: I1205 13:06:59.261014 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de8efd16-e046-4cd1-aa8c-ba49c605aa89-httpd-config\") pod \"neutron-6c565df557-xrffm\" (UID: \"de8efd16-e046-4cd1-aa8c-ba49c605aa89\") " pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.285452 master-0 kubenswrapper[29936]: I1205 13:06:59.284327 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:06:59.780402 master-0 kubenswrapper[29936]: I1205 13:06:59.779516 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-api-0" event={"ID":"cb932c61-e0ab-441e-a231-bff0899fc045","Type":"ContainerStarted","Data":"db537bd0302852453b66fca177440e56363f939318fde1494685efaf4a572a46"} Dec 05 13:07:00.134901 master-0 kubenswrapper[29936]: I1205 13:07:00.134818 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c565df557-xrffm"] Dec 05 13:07:00.825931 master-0 kubenswrapper[29936]: I1205 13:07:00.825870 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c565df557-xrffm" event={"ID":"de8efd16-e046-4cd1-aa8c-ba49c605aa89","Type":"ContainerStarted","Data":"93d4711c7994fe78eabb85b368cb4e6593cf9ad3a6eff3f9a478d98835daaa8c"} Dec 05 13:07:00.826104 master-0 kubenswrapper[29936]: I1205 13:07:00.825941 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c565df557-xrffm" event={"ID":"de8efd16-e046-4cd1-aa8c-ba49c605aa89","Type":"ContainerStarted","Data":"4ee7fb60ea66dc157c27811188c8480a8c723dfa5f75274bd58a2619a6e6a437"} Dec 05 13:07:00.829766 master-0 kubenswrapper[29936]: I1205 13:07:00.829689 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-api-0" event={"ID":"cb932c61-e0ab-441e-a231-bff0899fc045","Type":"ContainerStarted","Data":"552fa5a072ce5fb45d5ba9ef389d2104da089f55c7c2edbe8af973232cda6940"} Dec 05 13:07:00.899907 master-0 kubenswrapper[29936]: I1205 13:07:00.899730 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:00.902576 master-0 kubenswrapper[29936]: I1205 13:07:00.902536 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:00.995674 master-0 kubenswrapper[29936]: I1205 13:07:00.995581 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:01.182010 master-0 kubenswrapper[29936]: I1205 13:07:01.181926 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:01.286777 master-0 kubenswrapper[29936]: I1205 13:07:01.286613 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:01.845499 master-0 kubenswrapper[29936]: I1205 13:07:01.845421 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-api-0" event={"ID":"cb932c61-e0ab-441e-a231-bff0899fc045","Type":"ContainerStarted","Data":"7b20b3fe52b3f665f1b000ed95467b66dd102b8c47fa04cc2f2226178e29c307"} Dec 05 13:07:01.845851 master-0 kubenswrapper[29936]: I1205 13:07:01.845564 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-b46d8-api-0" Dec 05 13:07:01.848122 master-0 kubenswrapper[29936]: I1205 13:07:01.848075 29936 generic.go:334] "Generic (PLEG): container finished" podID="1690e553-8b77-483f-9f31-4f3968e6bd28" containerID="87c7ae63b5c1f83dc43365b898e62fdbd922a603605747e0cef5ff0cc54f165a" exitCode=0 Dec 05 13:07:01.848409 master-0 kubenswrapper[29936]: I1205 13:07:01.848149 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-h9l4n" 
event={"ID":"1690e553-8b77-483f-9f31-4f3968e6bd28","Type":"ContainerDied","Data":"87c7ae63b5c1f83dc43365b898e62fdbd922a603605747e0cef5ff0cc54f165a"} Dec 05 13:07:01.853585 master-0 kubenswrapper[29936]: I1205 13:07:01.853531 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c565df557-xrffm" event={"ID":"de8efd16-e046-4cd1-aa8c-ba49c605aa89","Type":"ContainerStarted","Data":"7cb1aa928d4d47b1938e80b441240b1c3fe91e67c42d5eb81dbf999a45fb7b8a"} Dec 05 13:07:01.853585 master-0 kubenswrapper[29936]: I1205 13:07:01.853582 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:07:01.926342 master-0 kubenswrapper[29936]: I1205 13:07:01.925423 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-b46d8-api-0" podStartSLOduration=3.925398789 podStartE2EDuration="3.925398789s" podCreationTimestamp="2025-12-05 13:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:07:01.871727554 +0000 UTC m=+1019.003807235" watchObservedRunningTime="2025-12-05 13:07:01.925398789 +0000 UTC m=+1019.057478470" Dec 05 13:07:01.955279 master-0 kubenswrapper[29936]: I1205 13:07:01.955160 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b46d8-volume-lvm-iscsi-0"] Dec 05 13:07:02.001426 master-0 kubenswrapper[29936]: I1205 13:07:02.000814 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6c565df557-xrffm" podStartSLOduration=4.000785786 podStartE2EDuration="4.000785786s" podCreationTimestamp="2025-12-05 13:06:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:07:01.967003153 +0000 UTC m=+1019.099082844" watchObservedRunningTime="2025-12-05 13:07:02.000785786 +0000 UTC m=+1019.132865487" Dec 05 13:07:02.041890 master-0 kubenswrapper[29936]: I1205 13:07:02.041785 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b46d8-backup-0"] Dec 05 13:07:02.157240 master-0 kubenswrapper[29936]: I1205 13:07:02.157131 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:07:02.259648 master-0 kubenswrapper[29936]: I1205 13:07:02.259548 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dbf54b6fc-9q2hj"] Dec 05 13:07:02.260561 master-0 kubenswrapper[29936]: I1205 13:07:02.259869 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" podUID="9a285173-334c-409d-87a5-9c8e18c77f50" containerName="dnsmasq-dns" containerID="cri-o://70524485e77b0f63ff22a28523a8a5fd365423c09029a91b199af308f5ff9cc9" gracePeriod=10 Dec 05 13:07:02.870740 master-0 kubenswrapper[29936]: I1205 13:07:02.870562 29936 generic.go:334] "Generic (PLEG): container finished" podID="9a285173-334c-409d-87a5-9c8e18c77f50" containerID="70524485e77b0f63ff22a28523a8a5fd365423c09029a91b199af308f5ff9cc9" exitCode=0 Dec 05 13:07:02.870740 master-0 kubenswrapper[29936]: I1205 13:07:02.870646 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" event={"ID":"9a285173-334c-409d-87a5-9c8e18c77f50","Type":"ContainerDied","Data":"70524485e77b0f63ff22a28523a8a5fd365423c09029a91b199af308f5ff9cc9"} Dec 05 13:07:02.870740 master-0 
kubenswrapper[29936]: I1205 13:07:02.870713 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" event={"ID":"9a285173-334c-409d-87a5-9c8e18c77f50","Type":"ContainerDied","Data":"2bcbfaf79ebe2dbd77659cb5bbd923a480043161f71b8d7ec94a9fe4b7ca35a7"} Dec 05 13:07:02.870740 master-0 kubenswrapper[29936]: I1205 13:07:02.870727 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bcbfaf79ebe2dbd77659cb5bbd923a480043161f71b8d7ec94a9fe4b7ca35a7" Dec 05 13:07:02.871719 master-0 kubenswrapper[29936]: I1205 13:07:02.871566 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-b46d8-backup-0" podUID="46decc62-ea7e-4ec1-ad45-7ff4812f77a5" containerName="cinder-backup" containerID="cri-o://a34264df621ab7c0263683464a9f81f32b4cbf3c98c6c3b16fccf927093379c6" gracePeriod=30 Dec 05 13:07:02.873304 master-0 kubenswrapper[29936]: I1205 13:07:02.872372 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-b46d8-backup-0" podUID="46decc62-ea7e-4ec1-ad45-7ff4812f77a5" containerName="probe" containerID="cri-o://dff7bda528eab88b606bbabb6adcb24512060bf1ae567119c564f3cea491ccf0" gracePeriod=30 Dec 05 13:07:02.873304 master-0 kubenswrapper[29936]: I1205 13:07:02.872495 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" podUID="c2913bf6-55a0-42ff-b4ca-e8e39335d588" containerName="cinder-volume" containerID="cri-o://c585694c0ac0650448f8079c818538fb475b6349292cd835a880cf8f876333a8" gracePeriod=30 Dec 05 13:07:02.873304 master-0 kubenswrapper[29936]: I1205 13:07:02.872683 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" podUID="c2913bf6-55a0-42ff-b4ca-e8e39335d588" containerName="probe" containerID="cri-o://1047b1b600f624e4e0b35df0d24baa1d48d897dc07d5f766de4f8f648de136e8" gracePeriod=30 Dec 05 13:07:02.913197 master-0 kubenswrapper[29936]: I1205 13:07:02.913107 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:07:03.000136 master-0 kubenswrapper[29936]: I1205 13:07:03.000071 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv84q\" (UniqueName: \"kubernetes.io/projected/9a285173-334c-409d-87a5-9c8e18c77f50-kube-api-access-bv84q\") pod \"9a285173-334c-409d-87a5-9c8e18c77f50\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " Dec 05 13:07:03.000479 master-0 kubenswrapper[29936]: I1205 13:07:03.000443 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-config\") pod \"9a285173-334c-409d-87a5-9c8e18c77f50\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " Dec 05 13:07:03.000546 master-0 kubenswrapper[29936]: I1205 13:07:03.000505 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-dns-svc\") pod \"9a285173-334c-409d-87a5-9c8e18c77f50\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " Dec 05 13:07:03.000799 master-0 kubenswrapper[29936]: I1205 13:07:03.000769 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-ovsdbserver-nb\") pod \"9a285173-334c-409d-87a5-9c8e18c77f50\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " Dec 05 13:07:03.000853 master-0 kubenswrapper[29936]: I1205 13:07:03.000834 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-ovsdbserver-sb\") pod \"9a285173-334c-409d-87a5-9c8e18c77f50\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " Dec 05 13:07:03.000979 master-0 kubenswrapper[29936]: I1205 13:07:03.000962 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-dns-swift-storage-0\") pod \"9a285173-334c-409d-87a5-9c8e18c77f50\" (UID: \"9a285173-334c-409d-87a5-9c8e18c77f50\") " Dec 05 13:07:03.018726 master-0 kubenswrapper[29936]: I1205 13:07:03.018573 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a285173-334c-409d-87a5-9c8e18c77f50-kube-api-access-bv84q" (OuterVolumeSpecName: "kube-api-access-bv84q") pod "9a285173-334c-409d-87a5-9c8e18c77f50" (UID: "9a285173-334c-409d-87a5-9c8e18c77f50"). InnerVolumeSpecName "kube-api-access-bv84q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:03.092014 master-0 kubenswrapper[29936]: I1205 13:07:03.091914 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-config" (OuterVolumeSpecName: "config") pod "9a285173-334c-409d-87a5-9c8e18c77f50" (UID: "9a285173-334c-409d-87a5-9c8e18c77f50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:03.098640 master-0 kubenswrapper[29936]: I1205 13:07:03.098534 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9a285173-334c-409d-87a5-9c8e18c77f50" (UID: "9a285173-334c-409d-87a5-9c8e18c77f50"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:03.108928 master-0 kubenswrapper[29936]: I1205 13:07:03.108801 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv84q\" (UniqueName: \"kubernetes.io/projected/9a285173-334c-409d-87a5-9c8e18c77f50-kube-api-access-bv84q\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:03.108928 master-0 kubenswrapper[29936]: I1205 13:07:03.108866 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:03.108928 master-0 kubenswrapper[29936]: I1205 13:07:03.108879 29936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:03.137513 master-0 kubenswrapper[29936]: I1205 13:07:03.137386 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9a285173-334c-409d-87a5-9c8e18c77f50" (UID: "9a285173-334c-409d-87a5-9c8e18c77f50"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:03.210823 master-0 kubenswrapper[29936]: I1205 13:07:03.210706 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:03.217954 master-0 kubenswrapper[29936]: I1205 13:07:03.217872 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9a285173-334c-409d-87a5-9c8e18c77f50" (UID: "9a285173-334c-409d-87a5-9c8e18c77f50"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:03.227979 master-0 kubenswrapper[29936]: I1205 13:07:03.224935 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9a285173-334c-409d-87a5-9c8e18c77f50" (UID: "9a285173-334c-409d-87a5-9c8e18c77f50"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:03.321427 master-0 kubenswrapper[29936]: I1205 13:07:03.318291 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:03.321427 master-0 kubenswrapper[29936]: I1205 13:07:03.318348 29936 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9a285173-334c-409d-87a5-9c8e18c77f50-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:03.532057 master-0 kubenswrapper[29936]: I1205 13:07:03.531989 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:07:03.629207 master-0 kubenswrapper[29936]: I1205 13:07:03.629004 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-scripts\") pod \"1690e553-8b77-483f-9f31-4f3968e6bd28\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " Dec 05 13:07:03.629207 master-0 kubenswrapper[29936]: I1205 13:07:03.629192 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-config-data\") pod \"1690e553-8b77-483f-9f31-4f3968e6bd28\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " Dec 05 13:07:03.629463 master-0 kubenswrapper[29936]: I1205 13:07:03.629383 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1690e553-8b77-483f-9f31-4f3968e6bd28-config-data-merged\") pod \"1690e553-8b77-483f-9f31-4f3968e6bd28\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " Dec 05 13:07:03.629544 master-0 kubenswrapper[29936]: I1205 13:07:03.629517 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgbw5\" (UniqueName: \"kubernetes.io/projected/1690e553-8b77-483f-9f31-4f3968e6bd28-kube-api-access-cgbw5\") pod \"1690e553-8b77-483f-9f31-4f3968e6bd28\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " Dec 05 13:07:03.629599 master-0 kubenswrapper[29936]: I1205 13:07:03.629579 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1690e553-8b77-483f-9f31-4f3968e6bd28-etc-podinfo\") pod \"1690e553-8b77-483f-9f31-4f3968e6bd28\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " Dec 05 13:07:03.629665 master-0 kubenswrapper[29936]: I1205 13:07:03.629642 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-combined-ca-bundle\") pod \"1690e553-8b77-483f-9f31-4f3968e6bd28\" (UID: \"1690e553-8b77-483f-9f31-4f3968e6bd28\") " Dec 05 13:07:03.630017 master-0 kubenswrapper[29936]: I1205 13:07:03.629937 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1690e553-8b77-483f-9f31-4f3968e6bd28-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "1690e553-8b77-483f-9f31-4f3968e6bd28" (UID: "1690e553-8b77-483f-9f31-4f3968e6bd28"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:07:03.630727 master-0 kubenswrapper[29936]: I1205 13:07:03.630665 29936 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1690e553-8b77-483f-9f31-4f3968e6bd28-config-data-merged\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:03.633589 master-0 kubenswrapper[29936]: I1205 13:07:03.633526 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1690e553-8b77-483f-9f31-4f3968e6bd28-kube-api-access-cgbw5" (OuterVolumeSpecName: "kube-api-access-cgbw5") pod "1690e553-8b77-483f-9f31-4f3968e6bd28" (UID: "1690e553-8b77-483f-9f31-4f3968e6bd28"). InnerVolumeSpecName "kube-api-access-cgbw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:03.635067 master-0 kubenswrapper[29936]: I1205 13:07:03.635024 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-scripts" (OuterVolumeSpecName: "scripts") pod "1690e553-8b77-483f-9f31-4f3968e6bd28" (UID: "1690e553-8b77-483f-9f31-4f3968e6bd28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:03.635536 master-0 kubenswrapper[29936]: I1205 13:07:03.635408 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1690e553-8b77-483f-9f31-4f3968e6bd28-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "1690e553-8b77-483f-9f31-4f3968e6bd28" (UID: "1690e553-8b77-483f-9f31-4f3968e6bd28"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 13:07:03.674227 master-0 kubenswrapper[29936]: I1205 13:07:03.674122 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-config-data" (OuterVolumeSpecName: "config-data") pod "1690e553-8b77-483f-9f31-4f3968e6bd28" (UID: "1690e553-8b77-483f-9f31-4f3968e6bd28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:03.719053 master-0 kubenswrapper[29936]: I1205 13:07:03.718847 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1690e553-8b77-483f-9f31-4f3968e6bd28" (UID: "1690e553-8b77-483f-9f31-4f3968e6bd28"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:03.734569 master-0 kubenswrapper[29936]: I1205 13:07:03.734458 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgbw5\" (UniqueName: \"kubernetes.io/projected/1690e553-8b77-483f-9f31-4f3968e6bd28-kube-api-access-cgbw5\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:03.734569 master-0 kubenswrapper[29936]: I1205 13:07:03.734559 29936 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1690e553-8b77-483f-9f31-4f3968e6bd28-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:03.734569 master-0 kubenswrapper[29936]: I1205 13:07:03.734576 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:03.734777 master-0 kubenswrapper[29936]: I1205 13:07:03.734592 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:03.734777 master-0 kubenswrapper[29936]: I1205 13:07:03.734610 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1690e553-8b77-483f-9f31-4f3968e6bd28-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:03.887806 master-0 kubenswrapper[29936]: I1205 13:07:03.887621 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-h9l4n" event={"ID":"1690e553-8b77-483f-9f31-4f3968e6bd28","Type":"ContainerDied","Data":"95cf6cde879a112c5eb38a3b0a63d3e6be680c28099c3aae0b686865a39447df"} Dec 05 13:07:03.887806 master-0 kubenswrapper[29936]: I1205 13:07:03.887705 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95cf6cde879a112c5eb38a3b0a63d3e6be680c28099c3aae0b686865a39447df" Dec 05 13:07:03.888057 master-0 kubenswrapper[29936]: I1205 13:07:03.887845 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-h9l4n" Dec 05 13:07:03.893411 master-0 kubenswrapper[29936]: I1205 13:07:03.893343 29936 generic.go:334] "Generic (PLEG): container finished" podID="c2913bf6-55a0-42ff-b4ca-e8e39335d588" containerID="c585694c0ac0650448f8079c818538fb475b6349292cd835a880cf8f876333a8" exitCode=0 Dec 05 13:07:03.893569 master-0 kubenswrapper[29936]: I1205 13:07:03.893526 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dbf54b6fc-9q2hj" Dec 05 13:07:03.894948 master-0 kubenswrapper[29936]: I1205 13:07:03.894908 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" event={"ID":"c2913bf6-55a0-42ff-b4ca-e8e39335d588","Type":"ContainerDied","Data":"c585694c0ac0650448f8079c818538fb475b6349292cd835a880cf8f876333a8"} Dec 05 13:07:03.969963 master-0 kubenswrapper[29936]: I1205 13:07:03.969832 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dbf54b6fc-9q2hj"] Dec 05 13:07:04.004156 master-0 kubenswrapper[29936]: I1205 13:07:04.003459 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dbf54b6fc-9q2hj"] Dec 05 13:07:04.787892 master-0 kubenswrapper[29936]: I1205 13:07:04.780261 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-create-m94x4"] Dec 05 13:07:04.787892 master-0 kubenswrapper[29936]: E1205 13:07:04.780978 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1690e553-8b77-483f-9f31-4f3968e6bd28" containerName="ironic-db-sync" Dec 05 13:07:04.787892 master-0 kubenswrapper[29936]: I1205 13:07:04.780995 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="1690e553-8b77-483f-9f31-4f3968e6bd28" containerName="ironic-db-sync" Dec 05 13:07:04.787892 master-0 kubenswrapper[29936]: E1205 13:07:04.781026 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1690e553-8b77-483f-9f31-4f3968e6bd28" containerName="init" Dec 05 13:07:04.787892 master-0 kubenswrapper[29936]: I1205 13:07:04.781036 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="1690e553-8b77-483f-9f31-4f3968e6bd28" containerName="init" Dec 05 13:07:04.787892 master-0 kubenswrapper[29936]: E1205 13:07:04.781051 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a285173-334c-409d-87a5-9c8e18c77f50" containerName="init" Dec 05 13:07:04.787892 master-0 kubenswrapper[29936]: I1205 13:07:04.781059 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a285173-334c-409d-87a5-9c8e18c77f50" containerName="init" Dec 05 13:07:04.787892 master-0 kubenswrapper[29936]: E1205 13:07:04.781084 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a285173-334c-409d-87a5-9c8e18c77f50" containerName="dnsmasq-dns" Dec 05 13:07:04.787892 master-0 kubenswrapper[29936]: I1205 13:07:04.781092 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a285173-334c-409d-87a5-9c8e18c77f50" containerName="dnsmasq-dns" Dec 05 13:07:04.787892 master-0 kubenswrapper[29936]: I1205 13:07:04.781347 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a285173-334c-409d-87a5-9c8e18c77f50" containerName="dnsmasq-dns" Dec 05 13:07:04.787892 master-0 kubenswrapper[29936]: I1205 13:07:04.781381 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="1690e553-8b77-483f-9f31-4f3968e6bd28" containerName="ironic-db-sync" Dec 05 13:07:04.787892 master-0 kubenswrapper[29936]: I1205 13:07:04.782312 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-m94x4" Dec 05 13:07:04.873209 master-0 kubenswrapper[29936]: I1205 13:07:04.843604 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-m94x4"] Dec 05 13:07:04.873209 master-0 kubenswrapper[29936]: I1205 13:07:04.853546 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-neutron-agent-8458c7d7db-8c5lp"] Dec 05 13:07:04.873209 master-0 kubenswrapper[29936]: I1205 13:07:04.855677 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" Dec 05 13:07:04.873209 master-0 kubenswrapper[29936]: I1205 13:07:04.867060 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-ironic-neutron-agent-config-data" Dec 05 13:07:04.873209 master-0 kubenswrapper[29936]: I1205 13:07:04.867563 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-8458c7d7db-8c5lp"] Dec 05 13:07:04.897291 master-0 kubenswrapper[29936]: I1205 13:07:04.892353 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7777eb1-fa7a-4f4b-8887-da54c42cff61-operator-scripts\") pod \"ironic-inspector-db-create-m94x4\" (UID: \"e7777eb1-fa7a-4f4b-8887-da54c42cff61\") " pod="openstack/ironic-inspector-db-create-m94x4" Dec 05 13:07:04.897291 master-0 kubenswrapper[29936]: I1205 13:07:04.892620 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhtng\" (UniqueName: \"kubernetes.io/projected/e7777eb1-fa7a-4f4b-8887-da54c42cff61-kube-api-access-fhtng\") pod \"ironic-inspector-db-create-m94x4\" (UID: \"e7777eb1-fa7a-4f4b-8887-da54c42cff61\") " pod="openstack/ironic-inspector-db-create-m94x4" Dec 05 13:07:04.949208 master-0 kubenswrapper[29936]: I1205 13:07:04.939720 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-2c1e-account-create-update-6qdwp"] Dec 05 13:07:04.949208 master-0 kubenswrapper[29936]: I1205 13:07:04.941421 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-2c1e-account-create-update-6qdwp"] Dec 05 13:07:04.949208 master-0 kubenswrapper[29936]: I1205 13:07:04.941540 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-2c1e-account-create-update-6qdwp" Dec 05 13:07:04.973208 master-0 kubenswrapper[29936]: I1205 13:07:04.970954 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-db-secret" Dec 05 13:07:04.999203 master-0 kubenswrapper[29936]: I1205 13:07:04.997423 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhtng\" (UniqueName: \"kubernetes.io/projected/e7777eb1-fa7a-4f4b-8887-da54c42cff61-kube-api-access-fhtng\") pod \"ironic-inspector-db-create-m94x4\" (UID: \"e7777eb1-fa7a-4f4b-8887-da54c42cff61\") " pod="openstack/ironic-inspector-db-create-m94x4" Dec 05 13:07:04.999203 master-0 kubenswrapper[29936]: I1205 13:07:04.997540 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lpp5\" (UniqueName: \"kubernetes.io/projected/1eb892f5-7ab8-4503-b7b8-1e233a1042bb-kube-api-access-4lpp5\") pod \"ironic-neutron-agent-8458c7d7db-8c5lp\" (UID: \"1eb892f5-7ab8-4503-b7b8-1e233a1042bb\") " pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" Dec 05 13:07:04.999203 master-0 kubenswrapper[29936]: I1205 13:07:04.997607 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7777eb1-fa7a-4f4b-8887-da54c42cff61-operator-scripts\") pod \"ironic-inspector-db-create-m94x4\" (UID: \"e7777eb1-fa7a-4f4b-8887-da54c42cff61\") " pod="openstack/ironic-inspector-db-create-m94x4" Dec 05 13:07:04.999203 master-0 kubenswrapper[29936]: I1205 13:07:04.997723 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1eb892f5-7ab8-4503-b7b8-1e233a1042bb-config\") pod \"ironic-neutron-agent-8458c7d7db-8c5lp\" (UID: \"1eb892f5-7ab8-4503-b7b8-1e233a1042bb\") " pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" Dec 05 13:07:04.999203 master-0 kubenswrapper[29936]: I1205 13:07:04.997781 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpdcc\" (UniqueName: \"kubernetes.io/projected/188a05a6-3a61-4472-b383-f44f2d022d08-kube-api-access-wpdcc\") pod \"ironic-inspector-2c1e-account-create-update-6qdwp\" (UID: \"188a05a6-3a61-4472-b383-f44f2d022d08\") " pod="openstack/ironic-inspector-2c1e-account-create-update-6qdwp" Dec 05 13:07:04.999203 master-0 kubenswrapper[29936]: I1205 13:07:04.997826 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb892f5-7ab8-4503-b7b8-1e233a1042bb-combined-ca-bundle\") pod \"ironic-neutron-agent-8458c7d7db-8c5lp\" (UID: \"1eb892f5-7ab8-4503-b7b8-1e233a1042bb\") " pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" Dec 05 13:07:04.999203 master-0 kubenswrapper[29936]: I1205 13:07:04.997883 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/188a05a6-3a61-4472-b383-f44f2d022d08-operator-scripts\") pod \"ironic-inspector-2c1e-account-create-update-6qdwp\" (UID: \"188a05a6-3a61-4472-b383-f44f2d022d08\") " pod="openstack/ironic-inspector-2c1e-account-create-update-6qdwp" Dec 05 13:07:05.053023 master-0 kubenswrapper[29936]: I1205 13:07:05.006543 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/e7777eb1-fa7a-4f4b-8887-da54c42cff61-operator-scripts\") pod \"ironic-inspector-db-create-m94x4\" (UID: \"e7777eb1-fa7a-4f4b-8887-da54c42cff61\") " pod="openstack/ironic-inspector-db-create-m94x4" Dec 05 13:07:05.078263 master-0 kubenswrapper[29936]: I1205 13:07:05.069516 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhtng\" (UniqueName: \"kubernetes.io/projected/e7777eb1-fa7a-4f4b-8887-da54c42cff61-kube-api-access-fhtng\") pod \"ironic-inspector-db-create-m94x4\" (UID: \"e7777eb1-fa7a-4f4b-8887-da54c42cff61\") " pod="openstack/ironic-inspector-db-create-m94x4" Dec 05 13:07:05.078263 master-0 kubenswrapper[29936]: I1205 13:07:05.074764 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-m94x4" Dec 05 13:07:05.101958 master-0 kubenswrapper[29936]: I1205 13:07:05.092607 29936 generic.go:334] "Generic (PLEG): container finished" podID="c2913bf6-55a0-42ff-b4ca-e8e39335d588" containerID="1047b1b600f624e4e0b35df0d24baa1d48d897dc07d5f766de4f8f648de136e8" exitCode=0 Dec 05 13:07:05.101958 master-0 kubenswrapper[29936]: I1205 13:07:05.092715 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" event={"ID":"c2913bf6-55a0-42ff-b4ca-e8e39335d588","Type":"ContainerDied","Data":"1047b1b600f624e4e0b35df0d24baa1d48d897dc07d5f766de4f8f648de136e8"} Dec 05 13:07:05.122207 master-0 kubenswrapper[29936]: I1205 13:07:05.118057 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1eb892f5-7ab8-4503-b7b8-1e233a1042bb-config\") pod \"ironic-neutron-agent-8458c7d7db-8c5lp\" (UID: \"1eb892f5-7ab8-4503-b7b8-1e233a1042bb\") " pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" Dec 05 13:07:05.122207 master-0 kubenswrapper[29936]: I1205 13:07:05.118155 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpdcc\" (UniqueName: \"kubernetes.io/projected/188a05a6-3a61-4472-b383-f44f2d022d08-kube-api-access-wpdcc\") pod \"ironic-inspector-2c1e-account-create-update-6qdwp\" (UID: \"188a05a6-3a61-4472-b383-f44f2d022d08\") " pod="openstack/ironic-inspector-2c1e-account-create-update-6qdwp" Dec 05 13:07:05.122207 master-0 kubenswrapper[29936]: I1205 13:07:05.118200 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb892f5-7ab8-4503-b7b8-1e233a1042bb-combined-ca-bundle\") pod \"ironic-neutron-agent-8458c7d7db-8c5lp\" (UID: \"1eb892f5-7ab8-4503-b7b8-1e233a1042bb\") " pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" Dec 05 13:07:05.122207 master-0 kubenswrapper[29936]: I1205 13:07:05.118242 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/188a05a6-3a61-4472-b383-f44f2d022d08-operator-scripts\") pod \"ironic-inspector-2c1e-account-create-update-6qdwp\" (UID: \"188a05a6-3a61-4472-b383-f44f2d022d08\") " pod="openstack/ironic-inspector-2c1e-account-create-update-6qdwp" Dec 05 13:07:05.122207 master-0 kubenswrapper[29936]: I1205 13:07:05.118300 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lpp5\" (UniqueName: \"kubernetes.io/projected/1eb892f5-7ab8-4503-b7b8-1e233a1042bb-kube-api-access-4lpp5\") pod \"ironic-neutron-agent-8458c7d7db-8c5lp\" (UID: 
\"1eb892f5-7ab8-4503-b7b8-1e233a1042bb\") " pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" Dec 05 13:07:05.122207 master-0 kubenswrapper[29936]: I1205 13:07:05.119358 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/188a05a6-3a61-4472-b383-f44f2d022d08-operator-scripts\") pod \"ironic-inspector-2c1e-account-create-update-6qdwp\" (UID: \"188a05a6-3a61-4472-b383-f44f2d022d08\") " pod="openstack/ironic-inspector-2c1e-account-create-update-6qdwp" Dec 05 13:07:05.140461 master-0 kubenswrapper[29936]: I1205 13:07:05.136570 29936 generic.go:334] "Generic (PLEG): container finished" podID="46decc62-ea7e-4ec1-ad45-7ff4812f77a5" containerID="dff7bda528eab88b606bbabb6adcb24512060bf1ae567119c564f3cea491ccf0" exitCode=0 Dec 05 13:07:05.140461 master-0 kubenswrapper[29936]: I1205 13:07:05.136623 29936 generic.go:334] "Generic (PLEG): container finished" podID="46decc62-ea7e-4ec1-ad45-7ff4812f77a5" containerID="a34264df621ab7c0263683464a9f81f32b4cbf3c98c6c3b16fccf927093379c6" exitCode=0 Dec 05 13:07:05.140461 master-0 kubenswrapper[29936]: I1205 13:07:05.136650 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-backup-0" event={"ID":"46decc62-ea7e-4ec1-ad45-7ff4812f77a5","Type":"ContainerDied","Data":"dff7bda528eab88b606bbabb6adcb24512060bf1ae567119c564f3cea491ccf0"} Dec 05 13:07:05.140461 master-0 kubenswrapper[29936]: I1205 13:07:05.136683 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-backup-0" event={"ID":"46decc62-ea7e-4ec1-ad45-7ff4812f77a5","Type":"ContainerDied","Data":"a34264df621ab7c0263683464a9f81f32b4cbf3c98c6c3b16fccf927093379c6"} Dec 05 13:07:05.141077 master-0 kubenswrapper[29936]: I1205 13:07:05.140564 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65b88b76d9-2zcht"] Dec 05 13:07:05.142772 master-0 kubenswrapper[29936]: I1205 13:07:05.142732 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.149338 master-0 kubenswrapper[29936]: I1205 13:07:05.149223 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eb892f5-7ab8-4503-b7b8-1e233a1042bb-combined-ca-bundle\") pod \"ironic-neutron-agent-8458c7d7db-8c5lp\" (UID: \"1eb892f5-7ab8-4503-b7b8-1e233a1042bb\") " pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" Dec 05 13:07:05.156633 master-0 kubenswrapper[29936]: I1205 13:07:05.151266 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lpp5\" (UniqueName: \"kubernetes.io/projected/1eb892f5-7ab8-4503-b7b8-1e233a1042bb-kube-api-access-4lpp5\") pod \"ironic-neutron-agent-8458c7d7db-8c5lp\" (UID: \"1eb892f5-7ab8-4503-b7b8-1e233a1042bb\") " pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" Dec 05 13:07:05.156633 master-0 kubenswrapper[29936]: I1205 13:07:05.152734 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpdcc\" (UniqueName: \"kubernetes.io/projected/188a05a6-3a61-4472-b383-f44f2d022d08-kube-api-access-wpdcc\") pod \"ironic-inspector-2c1e-account-create-update-6qdwp\" (UID: \"188a05a6-3a61-4472-b383-f44f2d022d08\") " pod="openstack/ironic-inspector-2c1e-account-create-update-6qdwp" Dec 05 13:07:05.182537 master-0 kubenswrapper[29936]: I1205 13:07:05.167498 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1eb892f5-7ab8-4503-b7b8-1e233a1042bb-config\") pod \"ironic-neutron-agent-8458c7d7db-8c5lp\" (UID: \"1eb892f5-7ab8-4503-b7b8-1e233a1042bb\") " pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" Dec 05 13:07:05.182537 master-0 kubenswrapper[29936]: I1205 13:07:05.170124 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65b88b76d9-2zcht"] Dec 05 13:07:05.233157 master-0 kubenswrapper[29936]: I1205 13:07:05.231392 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mv4r\" (UniqueName: \"kubernetes.io/projected/d702847e-681f-49cd-8d28-029cae3b4bf5-kube-api-access-9mv4r\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.233157 master-0 kubenswrapper[29936]: I1205 13:07:05.231531 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-dns-svc\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.233157 master-0 kubenswrapper[29936]: I1205 13:07:05.231562 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-ovsdbserver-nb\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.233157 master-0 kubenswrapper[29936]: I1205 13:07:05.231633 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-ovsdbserver-sb\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: 
\"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.233157 master-0 kubenswrapper[29936]: I1205 13:07:05.231771 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-dns-swift-storage-0\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.233157 master-0 kubenswrapper[29936]: I1205 13:07:05.231792 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-config\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.255581 master-0 kubenswrapper[29936]: I1205 13:07:05.253965 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a285173-334c-409d-87a5-9c8e18c77f50" path="/var/lib/kubelet/pods/9a285173-334c-409d-87a5-9c8e18c77f50/volumes" Dec 05 13:07:05.255581 master-0 kubenswrapper[29936]: I1205 13:07:05.255156 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-5456ffdd9c-4qjcn"] Dec 05 13:07:05.261847 master-0 kubenswrapper[29936]: I1205 13:07:05.258848 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-5456ffdd9c-4qjcn"] Dec 05 13:07:05.261847 master-0 kubenswrapper[29936]: I1205 13:07:05.258990 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.263148 master-0 kubenswrapper[29936]: I1205 13:07:05.262757 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-config-data" Dec 05 13:07:05.263265 master-0 kubenswrapper[29936]: I1205 13:07:05.263167 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-transport" Dec 05 13:07:05.265974 master-0 kubenswrapper[29936]: I1205 13:07:05.263392 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 13:07:05.265974 master-0 kubenswrapper[29936]: I1205 13:07:05.263521 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-scripts" Dec 05 13:07:05.265974 master-0 kubenswrapper[29936]: I1205 13:07:05.264775 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Dec 05 13:07:05.364593 master-0 kubenswrapper[29936]: I1205 13:07:05.364501 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" Dec 05 13:07:05.368478 master-0 kubenswrapper[29936]: I1205 13:07:05.367407 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-2c1e-account-create-update-6qdwp" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.395584 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-config-data-custom\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.405768 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66cad757-8699-4951-bbdf-b556fd09d35c-logs\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.406520 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-dns-swift-storage-0\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.406602 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-config\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.406709 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mv4r\" (UniqueName: \"kubernetes.io/projected/d702847e-681f-49cd-8d28-029cae3b4bf5-kube-api-access-9mv4r\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.406813 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/66cad757-8699-4951-bbdf-b556fd09d35c-etc-podinfo\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.407651 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/66cad757-8699-4951-bbdf-b556fd09d35c-config-data-merged\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.407729 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-combined-ca-bundle\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.407763 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-dns-svc\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.407812 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-ovsdbserver-nb\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.407978 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-config-data\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.408011 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-ovsdbserver-sb\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.408033 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-scripts\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.408093 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g45r8\" (UniqueName: \"kubernetes.io/projected/66cad757-8699-4951-bbdf-b556fd09d35c-kube-api-access-g45r8\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.409833 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-config\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.410204 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-dns-swift-storage-0\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.410961 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-ovsdbserver-sb\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.411538 29936 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-ovsdbserver-nb\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.413839 master-0 kubenswrapper[29936]: I1205 13:07:05.412706 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-dns-svc\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.466064 master-0 kubenswrapper[29936]: I1205 13:07:05.465784 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mv4r\" (UniqueName: \"kubernetes.io/projected/d702847e-681f-49cd-8d28-029cae3b4bf5-kube-api-access-9mv4r\") pod \"dnsmasq-dns-65b88b76d9-2zcht\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.541800 master-0 kubenswrapper[29936]: I1205 13:07:05.541649 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/66cad757-8699-4951-bbdf-b556fd09d35c-config-data-merged\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.541800 master-0 kubenswrapper[29936]: I1205 13:07:05.541813 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-combined-ca-bundle\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.542445 master-0 kubenswrapper[29936]: I1205 13:07:05.542035 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-config-data\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.542445 master-0 kubenswrapper[29936]: I1205 13:07:05.542079 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-scripts\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.542445 master-0 kubenswrapper[29936]: I1205 13:07:05.542157 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g45r8\" (UniqueName: \"kubernetes.io/projected/66cad757-8699-4951-bbdf-b556fd09d35c-kube-api-access-g45r8\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.542594 master-0 kubenswrapper[29936]: I1205 13:07:05.542539 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-config-data-custom\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.542594 master-0 kubenswrapper[29936]: I1205 13:07:05.542579 29936 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66cad757-8699-4951-bbdf-b556fd09d35c-logs\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.542753 master-0 kubenswrapper[29936]: I1205 13:07:05.542725 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/66cad757-8699-4951-bbdf-b556fd09d35c-etc-podinfo\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.543029 master-0 kubenswrapper[29936]: I1205 13:07:05.542875 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/66cad757-8699-4951-bbdf-b556fd09d35c-config-data-merged\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.546365 master-0 kubenswrapper[29936]: I1205 13:07:05.546204 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66cad757-8699-4951-bbdf-b556fd09d35c-logs\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.572460 master-0 kubenswrapper[29936]: I1205 13:07:05.571756 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-scripts\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.572460 master-0 kubenswrapper[29936]: I1205 13:07:05.572008 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-config-data\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.572805 master-0 kubenswrapper[29936]: I1205 13:07:05.572661 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-combined-ca-bundle\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.581579 master-0 kubenswrapper[29936]: I1205 13:07:05.581503 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/66cad757-8699-4951-bbdf-b556fd09d35c-etc-podinfo\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.586214 master-0 kubenswrapper[29936]: I1205 13:07:05.586142 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g45r8\" (UniqueName: \"kubernetes.io/projected/66cad757-8699-4951-bbdf-b556fd09d35c-kube-api-access-g45r8\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.587889 master-0 kubenswrapper[29936]: I1205 13:07:05.587841 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-config-data-custom\") pod \"ironic-5456ffdd9c-4qjcn\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.724998 master-0 kubenswrapper[29936]: I1205 13:07:05.724910 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:05.743137 master-0 kubenswrapper[29936]: I1205 13:07:05.739034 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:05.848358 master-0 kubenswrapper[29936]: I1205 13:07:05.848192 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-sys\") pod \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " Dec 05 13:07:05.848358 master-0 kubenswrapper[29936]: I1205 13:07:05.848291 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-nvme\") pod \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " Dec 05 13:07:05.848358 master-0 kubenswrapper[29936]: I1205 13:07:05.848350 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-sys" (OuterVolumeSpecName: "sys") pod "46decc62-ea7e-4ec1-ad45-7ff4812f77a5" (UID: "46decc62-ea7e-4ec1-ad45-7ff4812f77a5"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:05.848971 master-0 kubenswrapper[29936]: I1205 13:07:05.848364 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-locks-cinder\") pod \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " Dec 05 13:07:05.848971 master-0 kubenswrapper[29936]: I1205 13:07:05.848393 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "46decc62-ea7e-4ec1-ad45-7ff4812f77a5" (UID: "46decc62-ea7e-4ec1-ad45-7ff4812f77a5"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:05.848971 master-0 kubenswrapper[29936]: I1205 13:07:05.848422 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "46decc62-ea7e-4ec1-ad45-7ff4812f77a5" (UID: "46decc62-ea7e-4ec1-ad45-7ff4812f77a5"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:05.848971 master-0 kubenswrapper[29936]: I1205 13:07:05.848439 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtgzd\" (UniqueName: \"kubernetes.io/projected/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-kube-api-access-wtgzd\") pod \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " Dec 05 13:07:05.848971 master-0 kubenswrapper[29936]: I1205 13:07:05.848647 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-scripts\") pod \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " Dec 05 13:07:05.848971 master-0 kubenswrapper[29936]: I1205 13:07:05.848770 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-iscsi\") pod \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " Dec 05 13:07:05.848971 master-0 kubenswrapper[29936]: I1205 13:07:05.848872 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-combined-ca-bundle\") pod \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " Dec 05 13:07:05.849265 master-0 kubenswrapper[29936]: I1205 13:07:05.848979 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-dev\") pod \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " Dec 05 13:07:05.849265 master-0 kubenswrapper[29936]: I1205 13:07:05.849026 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-config-data-custom\") pod \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " Dec 05 13:07:05.849265 master-0 kubenswrapper[29936]: I1205 13:07:05.849086 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-config-data\") pod \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " Dec 05 13:07:05.849265 master-0 kubenswrapper[29936]: I1205 13:07:05.849113 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-locks-brick\") pod \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " Dec 05 13:07:05.849265 master-0 kubenswrapper[29936]: I1205 13:07:05.849142 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-lib-modules\") pod \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " Dec 05 13:07:05.849424 master-0 kubenswrapper[29936]: I1205 13:07:05.849342 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-run\") pod \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " Dec 05 13:07:05.849424 master-0 kubenswrapper[29936]: I1205 13:07:05.849411 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-lib-cinder\") pod \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " Dec 05 13:07:05.849621 master-0 kubenswrapper[29936]: I1205 13:07:05.849444 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-machine-id\") pod \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\" (UID: \"46decc62-ea7e-4ec1-ad45-7ff4812f77a5\") " Dec 05 13:07:05.851002 master-0 kubenswrapper[29936]: I1205 13:07:05.850962 29936 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-sys\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:05.851002 master-0 kubenswrapper[29936]: I1205 13:07:05.850999 29936 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-nvme\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:05.851097 master-0 kubenswrapper[29936]: I1205 13:07:05.851014 29936 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:05.851097 master-0 kubenswrapper[29936]: I1205 13:07:05.851064 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "46decc62-ea7e-4ec1-ad45-7ff4812f77a5" (UID: "46decc62-ea7e-4ec1-ad45-7ff4812f77a5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:05.851160 master-0 kubenswrapper[29936]: I1205 13:07:05.851097 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "46decc62-ea7e-4ec1-ad45-7ff4812f77a5" (UID: "46decc62-ea7e-4ec1-ad45-7ff4812f77a5"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:05.854285 master-0 kubenswrapper[29936]: I1205 13:07:05.852610 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "46decc62-ea7e-4ec1-ad45-7ff4812f77a5" (UID: "46decc62-ea7e-4ec1-ad45-7ff4812f77a5"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:05.854285 master-0 kubenswrapper[29936]: I1205 13:07:05.852682 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-dev" (OuterVolumeSpecName: "dev") pod "46decc62-ea7e-4ec1-ad45-7ff4812f77a5" (UID: "46decc62-ea7e-4ec1-ad45-7ff4812f77a5"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:05.874984 master-0 kubenswrapper[29936]: I1205 13:07:05.871792 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-run" (OuterVolumeSpecName: "run") pod "46decc62-ea7e-4ec1-ad45-7ff4812f77a5" (UID: "46decc62-ea7e-4ec1-ad45-7ff4812f77a5"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:05.874984 master-0 kubenswrapper[29936]: I1205 13:07:05.871860 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "46decc62-ea7e-4ec1-ad45-7ff4812f77a5" (UID: "46decc62-ea7e-4ec1-ad45-7ff4812f77a5"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:05.874984 master-0 kubenswrapper[29936]: I1205 13:07:05.871889 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "46decc62-ea7e-4ec1-ad45-7ff4812f77a5" (UID: "46decc62-ea7e-4ec1-ad45-7ff4812f77a5"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:05.887556 master-0 kubenswrapper[29936]: I1205 13:07:05.885872 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "46decc62-ea7e-4ec1-ad45-7ff4812f77a5" (UID: "46decc62-ea7e-4ec1-ad45-7ff4812f77a5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:05.907772 master-0 kubenswrapper[29936]: I1205 13:07:05.907550 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-kube-api-access-wtgzd" (OuterVolumeSpecName: "kube-api-access-wtgzd") pod "46decc62-ea7e-4ec1-ad45-7ff4812f77a5" (UID: "46decc62-ea7e-4ec1-ad45-7ff4812f77a5"). InnerVolumeSpecName "kube-api-access-wtgzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:05.914426 master-0 kubenswrapper[29936]: I1205 13:07:05.914305 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:05.915092 master-0 kubenswrapper[29936]: I1205 13:07:05.914977 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-scripts" (OuterVolumeSpecName: "scripts") pod "46decc62-ea7e-4ec1-ad45-7ff4812f77a5" (UID: "46decc62-ea7e-4ec1-ad45-7ff4812f77a5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:05.988212 master-0 kubenswrapper[29936]: I1205 13:07:05.988128 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtgzd\" (UniqueName: \"kubernetes.io/projected/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-kube-api-access-wtgzd\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:05.988731 master-0 kubenswrapper[29936]: I1205 13:07:05.988711 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:05.988914 master-0 kubenswrapper[29936]: I1205 13:07:05.988897 29936 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:05.991059 master-0 kubenswrapper[29936]: I1205 13:07:05.991037 29936 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-dev\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:05.991388 master-0 kubenswrapper[29936]: I1205 13:07:05.991355 29936 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:05.991509 master-0 kubenswrapper[29936]: I1205 13:07:05.991493 29936 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:05.991598 master-0 kubenswrapper[29936]: I1205 13:07:05.991587 29936 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-lib-modules\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:05.991824 master-0 kubenswrapper[29936]: I1205 13:07:05.991813 29936 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-run\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:05.992436 master-0 kubenswrapper[29936]: I1205 13:07:05.992421 29936 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:05.992616 master-0 kubenswrapper[29936]: I1205 13:07:05.992599 29936 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.007928 master-0 kubenswrapper[29936]: I1205 13:07:05.999642 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:06.017364 master-0 kubenswrapper[29936]: I1205 13:07:06.017299 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-m94x4"] Dec 05 13:07:06.061606 master-0 kubenswrapper[29936]: I1205 13:07:06.054824 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46decc62-ea7e-4ec1-ad45-7ff4812f77a5" (UID: "46decc62-ea7e-4ec1-ad45-7ff4812f77a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:06.104305 master-0 kubenswrapper[29936]: I1205 13:07:06.104243 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.116543 master-0 kubenswrapper[29936]: I1205 13:07:06.116459 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-8458c7d7db-8c5lp"] Dec 05 13:07:06.187784 master-0 kubenswrapper[29936]: I1205 13:07:06.187728 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:06.195079 master-0 kubenswrapper[29936]: I1205 13:07:06.194996 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" event={"ID":"c2913bf6-55a0-42ff-b4ca-e8e39335d588","Type":"ContainerDied","Data":"cb2a279acfaf030181eefd5e4aeffcb61214a2999f7482505bcbc582dcfacffc"} Dec 05 13:07:06.195761 master-0 kubenswrapper[29936]: I1205 13:07:06.195743 29936 scope.go:117] "RemoveContainer" containerID="1047b1b600f624e4e0b35df0d24baa1d48d897dc07d5f766de4f8f648de136e8" Dec 05 13:07:06.196083 master-0 kubenswrapper[29936]: I1205 13:07:06.195689 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:06.206937 master-0 kubenswrapper[29936]: I1205 13:07:06.206886 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-locks-brick\") pod \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " Dec 05 13:07:06.207264 master-0 kubenswrapper[29936]: I1205 13:07:06.207045 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-locks-cinder\") pod \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " Dec 05 13:07:06.207264 master-0 kubenswrapper[29936]: I1205 13:07:06.207062 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-lib-modules\") pod \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " Dec 05 13:07:06.207264 master-0 kubenswrapper[29936]: I1205 13:07:06.207121 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-run\") pod \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " Dec 05 13:07:06.207264 master-0 kubenswrapper[29936]: I1205 13:07:06.207145 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-machine-id\") pod \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " Dec 05 13:07:06.207469 master-0 kubenswrapper[29936]: I1205 13:07:06.207271 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-combined-ca-bundle\") pod \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " Dec 05 13:07:06.207469 master-0 kubenswrapper[29936]: I1205 13:07:06.207334 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-config-data\") pod \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " Dec 05 13:07:06.207469 master-0 kubenswrapper[29936]: I1205 13:07:06.207422 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-sys\") pod \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " Dec 05 13:07:06.207744 master-0 kubenswrapper[29936]: I1205 13:07:06.207477 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-lib-cinder\") pod \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " Dec 05 13:07:06.207744 master-0 kubenswrapper[29936]: I1205 13:07:06.207511 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-dev\") pod \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " Dec 05 13:07:06.207744 master-0 kubenswrapper[29936]: I1205 13:07:06.207540 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-scripts\") pod \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " Dec 05 13:07:06.207849 master-0 kubenswrapper[29936]: I1205 13:07:06.207737 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c2913bf6-55a0-42ff-b4ca-e8e39335d588" (UID: "c2913bf6-55a0-42ff-b4ca-e8e39335d588"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:06.207887 master-0 kubenswrapper[29936]: I1205 13:07:06.207859 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "c2913bf6-55a0-42ff-b4ca-e8e39335d588" (UID: "c2913bf6-55a0-42ff-b4ca-e8e39335d588"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:06.207887 master-0 kubenswrapper[29936]: I1205 13:07:06.207881 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "c2913bf6-55a0-42ff-b4ca-e8e39335d588" (UID: "c2913bf6-55a0-42ff-b4ca-e8e39335d588"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:06.207951 master-0 kubenswrapper[29936]: I1205 13:07:06.207901 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "c2913bf6-55a0-42ff-b4ca-e8e39335d588" (UID: "c2913bf6-55a0-42ff-b4ca-e8e39335d588"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:06.207951 master-0 kubenswrapper[29936]: I1205 13:07:06.207920 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-run" (OuterVolumeSpecName: "run") pod "c2913bf6-55a0-42ff-b4ca-e8e39335d588" (UID: "c2913bf6-55a0-42ff-b4ca-e8e39335d588"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:06.207951 master-0 kubenswrapper[29936]: I1205 13:07:06.207942 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-sys" (OuterVolumeSpecName: "sys") pod "c2913bf6-55a0-42ff-b4ca-e8e39335d588" (UID: "c2913bf6-55a0-42ff-b4ca-e8e39335d588"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:06.209070 master-0 kubenswrapper[29936]: I1205 13:07:06.208966 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twkxl\" (UniqueName: \"kubernetes.io/projected/c2913bf6-55a0-42ff-b4ca-e8e39335d588-kube-api-access-twkxl\") pod \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " Dec 05 13:07:06.209070 master-0 kubenswrapper[29936]: I1205 13:07:06.208996 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-nvme\") pod \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " Dec 05 13:07:06.209070 master-0 kubenswrapper[29936]: I1205 13:07:06.209017 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-config-data-custom\") pod \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " Dec 05 13:07:06.209070 master-0 kubenswrapper[29936]: I1205 13:07:06.209038 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-iscsi\") pod \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\" (UID: \"c2913bf6-55a0-42ff-b4ca-e8e39335d588\") " Dec 05 13:07:06.210091 master-0 kubenswrapper[29936]: I1205 13:07:06.209529 29936 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.212735 master-0 kubenswrapper[29936]: I1205 13:07:06.211852 29936 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-lib-modules\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.212735 master-0 kubenswrapper[29936]: I1205 13:07:06.211892 29936 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-run\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.212735 master-0 kubenswrapper[29936]: I1205 13:07:06.211906 29936 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.212735 master-0 kubenswrapper[29936]: I1205 13:07:06.211917 29936 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-sys\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.212735 master-0 kubenswrapper[29936]: I1205 13:07:06.211927 29936 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.212735 master-0 kubenswrapper[29936]: I1205 13:07:06.212462 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "c2913bf6-55a0-42ff-b4ca-e8e39335d588" (UID: 
"c2913bf6-55a0-42ff-b4ca-e8e39335d588"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:06.212735 master-0 kubenswrapper[29936]: I1205 13:07:06.212500 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-dev" (OuterVolumeSpecName: "dev") pod "c2913bf6-55a0-42ff-b4ca-e8e39335d588" (UID: "c2913bf6-55a0-42ff-b4ca-e8e39335d588"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:06.216632 master-0 kubenswrapper[29936]: I1205 13:07:06.216566 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-m94x4" event={"ID":"e7777eb1-fa7a-4f4b-8887-da54c42cff61","Type":"ContainerStarted","Data":"0f7795ccc26f4d82d27c525531e191fb9e7294703cadfbaa44d20b1b6da1f43e"} Dec 05 13:07:06.223401 master-0 kubenswrapper[29936]: I1205 13:07:06.219545 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "c2913bf6-55a0-42ff-b4ca-e8e39335d588" (UID: "c2913bf6-55a0-42ff-b4ca-e8e39335d588"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:06.223401 master-0 kubenswrapper[29936]: I1205 13:07:06.220318 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "c2913bf6-55a0-42ff-b4ca-e8e39335d588" (UID: "c2913bf6-55a0-42ff-b4ca-e8e39335d588"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:06.223401 master-0 kubenswrapper[29936]: I1205 13:07:06.223162 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c2913bf6-55a0-42ff-b4ca-e8e39335d588" (UID: "c2913bf6-55a0-42ff-b4ca-e8e39335d588"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:06.276157 master-0 kubenswrapper[29936]: I1205 13:07:06.253397 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-scripts" (OuterVolumeSpecName: "scripts") pod "c2913bf6-55a0-42ff-b4ca-e8e39335d588" (UID: "c2913bf6-55a0-42ff-b4ca-e8e39335d588"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:06.276157 master-0 kubenswrapper[29936]: I1205 13:07:06.259000 29936 scope.go:117] "RemoveContainer" containerID="c585694c0ac0650448f8079c818538fb475b6349292cd835a880cf8f876333a8" Dec 05 13:07:06.276157 master-0 kubenswrapper[29936]: I1205 13:07:06.259231 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2913bf6-55a0-42ff-b4ca-e8e39335d588-kube-api-access-twkxl" (OuterVolumeSpecName: "kube-api-access-twkxl") pod "c2913bf6-55a0-42ff-b4ca-e8e39335d588" (UID: "c2913bf6-55a0-42ff-b4ca-e8e39335d588"). InnerVolumeSpecName "kube-api-access-twkxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:06.276157 master-0 kubenswrapper[29936]: I1205 13:07:06.265246 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-backup-0" event={"ID":"46decc62-ea7e-4ec1-ad45-7ff4812f77a5","Type":"ContainerDied","Data":"141c9bba7eac5148ac0d908d62b93b2b23daf83214ba40941d20ad2ec6d276d5"} Dec 05 13:07:06.276157 master-0 kubenswrapper[29936]: I1205 13:07:06.265537 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:06.310285 master-0 kubenswrapper[29936]: I1205 13:07:06.283405 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" event={"ID":"1eb892f5-7ab8-4503-b7b8-1e233a1042bb","Type":"ContainerStarted","Data":"f1213e2119144fd33b41fc94dc9849a329e21aacf488d4e461309a667916318e"} Dec 05 13:07:06.314557 master-0 kubenswrapper[29936]: I1205 13:07:06.313957 29936 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.314557 master-0 kubenswrapper[29936]: I1205 13:07:06.314007 29936 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-dev\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.314557 master-0 kubenswrapper[29936]: I1205 13:07:06.314019 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.314557 master-0 kubenswrapper[29936]: I1205 13:07:06.314029 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twkxl\" (UniqueName: \"kubernetes.io/projected/c2913bf6-55a0-42ff-b4ca-e8e39335d588-kube-api-access-twkxl\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.314557 master-0 kubenswrapper[29936]: I1205 13:07:06.314041 29936 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-nvme\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.314557 master-0 kubenswrapper[29936]: I1205 13:07:06.314051 29936 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.314557 master-0 kubenswrapper[29936]: I1205 13:07:06.314060 29936 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c2913bf6-55a0-42ff-b4ca-e8e39335d588-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.314557 master-0 kubenswrapper[29936]: I1205 13:07:06.314261 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2913bf6-55a0-42ff-b4ca-e8e39335d588" (UID: "c2913bf6-55a0-42ff-b4ca-e8e39335d588"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:06.346555 master-0 kubenswrapper[29936]: I1205 13:07:06.346519 29936 scope.go:117] "RemoveContainer" containerID="dff7bda528eab88b606bbabb6adcb24512060bf1ae567119c564f3cea491ccf0" Dec 05 13:07:06.353815 master-0 kubenswrapper[29936]: I1205 13:07:06.353642 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-2c1e-account-create-update-6qdwp"] Dec 05 13:07:06.376522 master-0 kubenswrapper[29936]: I1205 13:07:06.376406 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b46d8-scheduler-0"] Dec 05 13:07:06.421369 master-0 kubenswrapper[29936]: I1205 13:07:06.419541 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65b88b76d9-2zcht"] Dec 05 13:07:06.423951 master-0 kubenswrapper[29936]: I1205 13:07:06.423922 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.431580 master-0 kubenswrapper[29936]: I1205 13:07:06.431479 29936 scope.go:117] "RemoveContainer" containerID="a34264df621ab7c0263683464a9f81f32b4cbf3c98c6c3b16fccf927093379c6" Dec 05 13:07:06.661238 master-0 kubenswrapper[29936]: I1205 13:07:06.659668 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-conductor-0"] Dec 05 13:07:06.661238 master-0 kubenswrapper[29936]: E1205 13:07:06.660313 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46decc62-ea7e-4ec1-ad45-7ff4812f77a5" containerName="probe" Dec 05 13:07:06.661238 master-0 kubenswrapper[29936]: I1205 13:07:06.660327 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="46decc62-ea7e-4ec1-ad45-7ff4812f77a5" containerName="probe" Dec 05 13:07:06.661238 master-0 kubenswrapper[29936]: E1205 13:07:06.660352 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46decc62-ea7e-4ec1-ad45-7ff4812f77a5" containerName="cinder-backup" Dec 05 13:07:06.661238 master-0 kubenswrapper[29936]: I1205 13:07:06.660358 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="46decc62-ea7e-4ec1-ad45-7ff4812f77a5" containerName="cinder-backup" Dec 05 13:07:06.661238 master-0 kubenswrapper[29936]: E1205 13:07:06.660402 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2913bf6-55a0-42ff-b4ca-e8e39335d588" containerName="probe" Dec 05 13:07:06.661238 master-0 kubenswrapper[29936]: I1205 13:07:06.660429 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2913bf6-55a0-42ff-b4ca-e8e39335d588" containerName="probe" Dec 05 13:07:06.661238 master-0 kubenswrapper[29936]: E1205 13:07:06.660446 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2913bf6-55a0-42ff-b4ca-e8e39335d588" containerName="cinder-volume" Dec 05 13:07:06.661238 master-0 kubenswrapper[29936]: I1205 13:07:06.660453 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2913bf6-55a0-42ff-b4ca-e8e39335d588" containerName="cinder-volume" Dec 05 13:07:06.661238 master-0 kubenswrapper[29936]: I1205 13:07:06.660805 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2913bf6-55a0-42ff-b4ca-e8e39335d588" containerName="probe" Dec 05 13:07:06.661238 master-0 kubenswrapper[29936]: I1205 13:07:06.660828 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="46decc62-ea7e-4ec1-ad45-7ff4812f77a5" containerName="cinder-backup" Dec 05 13:07:06.661238 master-0 
kubenswrapper[29936]: I1205 13:07:06.660864 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="46decc62-ea7e-4ec1-ad45-7ff4812f77a5" containerName="probe" Dec 05 13:07:06.661238 master-0 kubenswrapper[29936]: I1205 13:07:06.660890 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2913bf6-55a0-42ff-b4ca-e8e39335d588" containerName="cinder-volume" Dec 05 13:07:06.665120 master-0 kubenswrapper[29936]: I1205 13:07:06.665023 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0" Dec 05 13:07:06.668759 master-0 kubenswrapper[29936]: I1205 13:07:06.668722 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-config-data" (OuterVolumeSpecName: "config-data") pod "46decc62-ea7e-4ec1-ad45-7ff4812f77a5" (UID: "46decc62-ea7e-4ec1-ad45-7ff4812f77a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:06.669045 master-0 kubenswrapper[29936]: I1205 13:07:06.669024 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-config-data" Dec 05 13:07:06.670053 master-0 kubenswrapper[29936]: I1205 13:07:06.670038 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-scripts" Dec 05 13:07:06.698706 master-0 kubenswrapper[29936]: I1205 13:07:06.698638 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Dec 05 13:07:06.717010 master-0 kubenswrapper[29936]: I1205 13:07:06.716916 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-config-data" (OuterVolumeSpecName: "config-data") pod "c2913bf6-55a0-42ff-b4ca-e8e39335d588" (UID: "c2913bf6-55a0-42ff-b4ca-e8e39335d588"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:06.734450 master-0 kubenswrapper[29936]: I1205 13:07:06.732878 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46decc62-ea7e-4ec1-ad45-7ff4812f77a5-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.734450 master-0 kubenswrapper[29936]: I1205 13:07:06.732950 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2913bf6-55a0-42ff-b4ca-e8e39335d588-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:06.779237 master-0 kubenswrapper[29936]: I1205 13:07:06.779150 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-5456ffdd9c-4qjcn"] Dec 05 13:07:06.796222 master-0 kubenswrapper[29936]: W1205 13:07:06.796055 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66cad757_8699_4951_bbdf_b556fd09d35c.slice/crio-cfe26fb6f1b7b0ab5b0ea5c536c112ec0836b3349e5b3129f4aaaaa8f951feac WatchSource:0}: Error finding container cfe26fb6f1b7b0ab5b0ea5c536c112ec0836b3349e5b3129f4aaaaa8f951feac: Status 404 returned error can't find the container with id cfe26fb6f1b7b0ab5b0ea5c536c112ec0836b3349e5b3129f4aaaaa8f951feac Dec 05 13:07:06.837271 master-0 kubenswrapper[29936]: I1205 13:07:06.835642 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc2221d6-014e-4bd4-962b-24512ebf84e8-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.837271 master-0 kubenswrapper[29936]: I1205 13:07:06.835761 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc2221d6-014e-4bd4-962b-24512ebf84e8-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.837271 master-0 kubenswrapper[29936]: I1205 13:07:06.835849 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cc2221d6-014e-4bd4-962b-24512ebf84e8-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.837271 master-0 kubenswrapper[29936]: I1205 13:07:06.835880 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqcgx\" (UniqueName: \"kubernetes.io/projected/cc2221d6-014e-4bd4-962b-24512ebf84e8-kube-api-access-hqcgx\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.837271 master-0 kubenswrapper[29936]: I1205 13:07:06.835933 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc2221d6-014e-4bd4-962b-24512ebf84e8-scripts\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.837271 master-0 kubenswrapper[29936]: I1205 13:07:06.835968 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-105d686b-ca62-46ac-93e6-015908a106ce\" (UniqueName: \"kubernetes.io/csi/topolvm.io^b4e33d75-2bdd-4375-814d-500d255bc761\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.837271 master-0 kubenswrapper[29936]: I1205 13:07:06.835996 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc2221d6-014e-4bd4-962b-24512ebf84e8-config-data\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.837271 master-0 kubenswrapper[29936]: I1205 13:07:06.836030 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cc2221d6-014e-4bd4-962b-24512ebf84e8-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.939548 master-0 kubenswrapper[29936]: I1205 13:07:06.938955 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cc2221d6-014e-4bd4-962b-24512ebf84e8-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.939548 master-0 kubenswrapper[29936]: I1205 13:07:06.939037 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqcgx\" (UniqueName: \"kubernetes.io/projected/cc2221d6-014e-4bd4-962b-24512ebf84e8-kube-api-access-hqcgx\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.939548 master-0 kubenswrapper[29936]: I1205 13:07:06.939093 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc2221d6-014e-4bd4-962b-24512ebf84e8-scripts\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.939548 master-0 kubenswrapper[29936]: I1205 13:07:06.939138 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-105d686b-ca62-46ac-93e6-015908a106ce\" (UniqueName: \"kubernetes.io/csi/topolvm.io^b4e33d75-2bdd-4375-814d-500d255bc761\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.939548 master-0 kubenswrapper[29936]: I1205 13:07:06.939165 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc2221d6-014e-4bd4-962b-24512ebf84e8-config-data\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.939548 master-0 kubenswrapper[29936]: I1205 13:07:06.939218 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cc2221d6-014e-4bd4-962b-24512ebf84e8-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.939548 master-0 kubenswrapper[29936]: I1205 13:07:06.939270 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/cc2221d6-014e-4bd4-962b-24512ebf84e8-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.939548 master-0 kubenswrapper[29936]: I1205 13:07:06.939326 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc2221d6-014e-4bd4-962b-24512ebf84e8-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.945026 master-0 kubenswrapper[29936]: I1205 13:07:06.940815 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cc2221d6-014e-4bd4-962b-24512ebf84e8-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.945993 master-0 kubenswrapper[29936]: I1205 13:07:06.945767 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cc2221d6-014e-4bd4-962b-24512ebf84e8-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.947392 master-0 kubenswrapper[29936]: I1205 13:07:06.946916 29936 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 13:07:06.947392 master-0 kubenswrapper[29936]: I1205 13:07:06.946959 29936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-105d686b-ca62-46ac-93e6-015908a106ce\" (UniqueName: \"kubernetes.io/csi/topolvm.io^b4e33d75-2bdd-4375-814d-500d255bc761\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/c5ba67bff5071fb0a414dbe22f675a11aab5ba640d242fddec5be45350e477fa/globalmount\"" pod="openstack/ironic-conductor-0" Dec 05 13:07:06.947575 master-0 kubenswrapper[29936]: I1205 13:07:06.947416 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc2221d6-014e-4bd4-962b-24512ebf84e8-config-data\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.950829 master-0 kubenswrapper[29936]: I1205 13:07:06.950463 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc2221d6-014e-4bd4-962b-24512ebf84e8-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.950829 master-0 kubenswrapper[29936]: I1205 13:07:06.950772 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc2221d6-014e-4bd4-962b-24512ebf84e8-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:06.951334 master-0 kubenswrapper[29936]: I1205 13:07:06.951206 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc2221d6-014e-4bd4-962b-24512ebf84e8-scripts\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " 
pod="openstack/ironic-conductor-0" Dec 05 13:07:06.965639 master-0 kubenswrapper[29936]: I1205 13:07:06.965547 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqcgx\" (UniqueName: \"kubernetes.io/projected/cc2221d6-014e-4bd4-962b-24512ebf84e8-kube-api-access-hqcgx\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:07.303599 master-0 kubenswrapper[29936]: I1205 13:07:07.303522 29936 generic.go:334] "Generic (PLEG): container finished" podID="e7777eb1-fa7a-4f4b-8887-da54c42cff61" containerID="8740745c142be01a2afc231b874b9b9e2b8c88cef19049d5e0abe9d2173d2a0f" exitCode=0 Dec 05 13:07:07.304053 master-0 kubenswrapper[29936]: I1205 13:07:07.304029 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-m94x4" event={"ID":"e7777eb1-fa7a-4f4b-8887-da54c42cff61","Type":"ContainerDied","Data":"8740745c142be01a2afc231b874b9b9e2b8c88cef19049d5e0abe9d2173d2a0f"} Dec 05 13:07:07.308738 master-0 kubenswrapper[29936]: I1205 13:07:07.308702 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" event={"ID":"d702847e-681f-49cd-8d28-029cae3b4bf5","Type":"ContainerStarted","Data":"f8f812c9ba0805d00d6121d8150974665587b0fb45fd0ae13ab0b45bcf77f873"} Dec 05 13:07:07.310850 master-0 kubenswrapper[29936]: I1205 13:07:07.310828 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-5456ffdd9c-4qjcn" event={"ID":"66cad757-8699-4951-bbdf-b556fd09d35c","Type":"ContainerStarted","Data":"cfe26fb6f1b7b0ab5b0ea5c536c112ec0836b3349e5b3129f4aaaaa8f951feac"} Dec 05 13:07:07.312701 master-0 kubenswrapper[29936]: I1205 13:07:07.312663 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-b46d8-scheduler-0" podUID="062a3b87-3828-49fb-8b3b-099ef02fe5a3" containerName="cinder-scheduler" containerID="cri-o://70707515162ff0f31d64d73855efdd346ed6d3a1dd42e12a3b6d12f42fa16bf3" gracePeriod=30 Dec 05 13:07:07.313017 master-0 kubenswrapper[29936]: I1205 13:07:07.312825 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-2c1e-account-create-update-6qdwp" event={"ID":"188a05a6-3a61-4472-b383-f44f2d022d08","Type":"ContainerStarted","Data":"973dc3fe506fd0b1629d2d4565678ca598a8ff8ffd809b74a9a1484c5710ad53"} Dec 05 13:07:07.313088 master-0 kubenswrapper[29936]: I1205 13:07:07.312853 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-b46d8-scheduler-0" podUID="062a3b87-3828-49fb-8b3b-099ef02fe5a3" containerName="probe" containerID="cri-o://72ae10d5d443f237be1f1fb9b2d450e0ae86534923bac9663291bac8056c11af" gracePeriod=30 Dec 05 13:07:07.531656 master-0 kubenswrapper[29936]: I1205 13:07:07.531571 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b46d8-volume-lvm-iscsi-0"] Dec 05 13:07:07.574887 master-0 kubenswrapper[29936]: I1205 13:07:07.574716 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b46d8-volume-lvm-iscsi-0"] Dec 05 13:07:07.636295 master-0 kubenswrapper[29936]: I1205 13:07:07.635634 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b46d8-volume-lvm-iscsi-0"] Dec 05 13:07:07.638110 master-0 kubenswrapper[29936]: I1205 13:07:07.638082 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.642830 master-0 kubenswrapper[29936]: I1205 13:07:07.642141 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b46d8-volume-lvm-iscsi-config-data" Dec 05 13:07:07.664239 master-0 kubenswrapper[29936]: I1205 13:07:07.649690 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-volume-lvm-iscsi-0"] Dec 05 13:07:07.744607 master-0 kubenswrapper[29936]: I1205 13:07:07.744503 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26623288-0e6f-473a-9e87-508c2607b948-combined-ca-bundle\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.745203 master-0 kubenswrapper[29936]: I1205 13:07:07.745057 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-var-locks-brick\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.745203 master-0 kubenswrapper[29936]: I1205 13:07:07.745192 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-dev\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.745375 master-0 kubenswrapper[29936]: I1205 13:07:07.745331 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-lib-modules\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.745520 master-0 kubenswrapper[29936]: I1205 13:07:07.745399 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-etc-nvme\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.745674 master-0 kubenswrapper[29936]: I1205 13:07:07.745574 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26623288-0e6f-473a-9e87-508c2607b948-scripts\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.746089 master-0 kubenswrapper[29936]: I1205 13:07:07.745991 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-etc-iscsi\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.746163 master-0 kubenswrapper[29936]: I1205 13:07:07.746045 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-sys\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.746358 master-0 kubenswrapper[29936]: I1205 13:07:07.746316 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55klr\" (UniqueName: \"kubernetes.io/projected/26623288-0e6f-473a-9e87-508c2607b948-kube-api-access-55klr\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.746414 master-0 kubenswrapper[29936]: I1205 13:07:07.746372 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-var-locks-cinder\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.746503 master-0 kubenswrapper[29936]: I1205 13:07:07.746475 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26623288-0e6f-473a-9e87-508c2607b948-config-data\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.747414 master-0 kubenswrapper[29936]: I1205 13:07:07.746556 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-etc-machine-id\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.747414 master-0 kubenswrapper[29936]: I1205 13:07:07.747410 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26623288-0e6f-473a-9e87-508c2607b948-config-data-custom\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.749370 master-0 kubenswrapper[29936]: I1205 13:07:07.747569 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-run\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.749370 master-0 kubenswrapper[29936]: I1205 13:07:07.747608 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-var-lib-cinder\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.851139 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26623288-0e6f-473a-9e87-508c2607b948-combined-ca-bundle\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" 
(UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.851291 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-var-locks-brick\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.851321 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-dev\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.851354 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-lib-modules\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.851379 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-etc-nvme\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.851412 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26623288-0e6f-473a-9e87-508c2607b948-scripts\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.851474 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-etc-iscsi\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.851503 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-sys\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.851550 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55klr\" (UniqueName: \"kubernetes.io/projected/26623288-0e6f-473a-9e87-508c2607b948-kube-api-access-55klr\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.851568 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-var-locks-cinder\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.851590 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26623288-0e6f-473a-9e87-508c2607b948-config-data\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.851626 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-etc-machine-id\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.851649 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26623288-0e6f-473a-9e87-508c2607b948-config-data-custom\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.851681 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-run\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.851700 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-var-lib-cinder\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.851892 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-var-lib-cinder\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.852025 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-var-locks-brick\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.852070 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-sys\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.853510 master-0 kubenswrapper[29936]: I1205 13:07:07.853195 29936 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-var-locks-cinder\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.854648 master-0 kubenswrapper[29936]: I1205 13:07:07.853960 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-etc-machine-id\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.854648 master-0 kubenswrapper[29936]: I1205 13:07:07.854462 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-run\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.854648 master-0 kubenswrapper[29936]: I1205 13:07:07.854506 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-lib-modules\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.854648 master-0 kubenswrapper[29936]: I1205 13:07:07.854567 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-etc-nvme\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.855376 master-0 kubenswrapper[29936]: I1205 13:07:07.855248 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-etc-iscsi\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.855376 master-0 kubenswrapper[29936]: I1205 13:07:07.855308 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/26623288-0e6f-473a-9e87-508c2607b948-dev\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.857491 master-0 kubenswrapper[29936]: I1205 13:07:07.857448 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26623288-0e6f-473a-9e87-508c2607b948-combined-ca-bundle\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.864139 master-0 kubenswrapper[29936]: I1205 13:07:07.864031 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26623288-0e6f-473a-9e87-508c2607b948-scripts\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.864334 master-0 kubenswrapper[29936]: I1205 13:07:07.864231 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26623288-0e6f-473a-9e87-508c2607b948-config-data-custom\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.866355 master-0 kubenswrapper[29936]: I1205 13:07:07.866173 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26623288-0e6f-473a-9e87-508c2607b948-config-data\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.881242 master-0 kubenswrapper[29936]: I1205 13:07:07.879473 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55klr\" (UniqueName: \"kubernetes.io/projected/26623288-0e6f-473a-9e87-508c2607b948-kube-api-access-55klr\") pod \"cinder-b46d8-volume-lvm-iscsi-0\" (UID: \"26623288-0e6f-473a-9e87-508c2607b948\") " pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:07.996207 master-0 kubenswrapper[29936]: I1205 13:07:07.996071 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:08.342898 master-0 kubenswrapper[29936]: I1205 13:07:08.342798 29936 generic.go:334] "Generic (PLEG): container finished" podID="062a3b87-3828-49fb-8b3b-099ef02fe5a3" containerID="72ae10d5d443f237be1f1fb9b2d450e0ae86534923bac9663291bac8056c11af" exitCode=0 Dec 05 13:07:08.343217 master-0 kubenswrapper[29936]: I1205 13:07:08.342905 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-scheduler-0" event={"ID":"062a3b87-3828-49fb-8b3b-099ef02fe5a3","Type":"ContainerDied","Data":"72ae10d5d443f237be1f1fb9b2d450e0ae86534923bac9663291bac8056c11af"} Dec 05 13:07:08.345194 master-0 kubenswrapper[29936]: I1205 13:07:08.345119 29936 generic.go:334] "Generic (PLEG): container finished" podID="d702847e-681f-49cd-8d28-029cae3b4bf5" containerID="c7b2025fb91d0655b2bb65da82b4d5b5f98b53899cf8e140a02d94f7d2fccabc" exitCode=0 Dec 05 13:07:08.345268 master-0 kubenswrapper[29936]: I1205 13:07:08.345226 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" event={"ID":"d702847e-681f-49cd-8d28-029cae3b4bf5","Type":"ContainerDied","Data":"c7b2025fb91d0655b2bb65da82b4d5b5f98b53899cf8e140a02d94f7d2fccabc"} Dec 05 13:07:08.349283 master-0 kubenswrapper[29936]: I1205 13:07:08.349077 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-2c1e-account-create-update-6qdwp" event={"ID":"188a05a6-3a61-4472-b383-f44f2d022d08","Type":"ContainerStarted","Data":"5e5f4ceda6016dd2c94bbad5e06f1aed8177635a7cbbb6f34898b4dcbe61a46d"} Dec 05 13:07:08.428969 master-0 kubenswrapper[29936]: I1205 13:07:08.427056 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-2c1e-account-create-update-6qdwp" podStartSLOduration=4.427032218 podStartE2EDuration="4.427032218s" podCreationTimestamp="2025-12-05 13:07:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:07:08.411403134 +0000 UTC m=+1025.543482835" watchObservedRunningTime="2025-12-05 13:07:08.427032218 +0000 UTC m=+1025.559111899" Dec 05 13:07:08.883492 master-0 kubenswrapper[29936]: I1205 13:07:08.883128 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"pvc-105d686b-ca62-46ac-93e6-015908a106ce\" (UniqueName: \"kubernetes.io/csi/topolvm.io^b4e33d75-2bdd-4375-814d-500d255bc761\") pod \"ironic-conductor-0\" (UID: \"cc2221d6-014e-4bd4-962b-24512ebf84e8\") " pod="openstack/ironic-conductor-0" Dec 05 13:07:08.967591 master-0 kubenswrapper[29936]: I1205 13:07:08.966397 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-m94x4" Dec 05 13:07:09.052453 master-0 kubenswrapper[29936]: I1205 13:07:09.052294 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0" Dec 05 13:07:09.157294 master-0 kubenswrapper[29936]: I1205 13:07:09.157155 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhtng\" (UniqueName: \"kubernetes.io/projected/e7777eb1-fa7a-4f4b-8887-da54c42cff61-kube-api-access-fhtng\") pod \"e7777eb1-fa7a-4f4b-8887-da54c42cff61\" (UID: \"e7777eb1-fa7a-4f4b-8887-da54c42cff61\") " Dec 05 13:07:09.157884 master-0 kubenswrapper[29936]: I1205 13:07:09.157825 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7777eb1-fa7a-4f4b-8887-da54c42cff61-operator-scripts\") pod \"e7777eb1-fa7a-4f4b-8887-da54c42cff61\" (UID: \"e7777eb1-fa7a-4f4b-8887-da54c42cff61\") " Dec 05 13:07:09.158656 master-0 kubenswrapper[29936]: I1205 13:07:09.158572 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7777eb1-fa7a-4f4b-8887-da54c42cff61-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7777eb1-fa7a-4f4b-8887-da54c42cff61" (UID: "e7777eb1-fa7a-4f4b-8887-da54c42cff61"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:09.161028 master-0 kubenswrapper[29936]: I1205 13:07:09.160956 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7777eb1-fa7a-4f4b-8887-da54c42cff61-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:09.188996 master-0 kubenswrapper[29936]: I1205 13:07:09.188775 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7777eb1-fa7a-4f4b-8887-da54c42cff61-kube-api-access-fhtng" (OuterVolumeSpecName: "kube-api-access-fhtng") pod "e7777eb1-fa7a-4f4b-8887-da54c42cff61" (UID: "e7777eb1-fa7a-4f4b-8887-da54c42cff61"). InnerVolumeSpecName "kube-api-access-fhtng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:09.241351 master-0 kubenswrapper[29936]: I1205 13:07:09.240840 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2913bf6-55a0-42ff-b4ca-e8e39335d588" path="/var/lib/kubelet/pods/c2913bf6-55a0-42ff-b4ca-e8e39335d588/volumes" Dec 05 13:07:09.259825 master-0 kubenswrapper[29936]: I1205 13:07:09.259733 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-volume-lvm-iscsi-0"] Dec 05 13:07:09.268835 master-0 kubenswrapper[29936]: I1205 13:07:09.264767 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhtng\" (UniqueName: \"kubernetes.io/projected/e7777eb1-fa7a-4f4b-8887-da54c42cff61-kube-api-access-fhtng\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:09.300995 master-0 kubenswrapper[29936]: W1205 13:07:09.300736 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26623288_0e6f_473a_9e87_508c2607b948.slice/crio-2361aa219dc5ba3a0f1cb1420d071329c57f4172bd80094e8f03fa3f250f2a70 WatchSource:0}: Error finding container 2361aa219dc5ba3a0f1cb1420d071329c57f4172bd80094e8f03fa3f250f2a70: Status 404 returned error can't find the container with id 2361aa219dc5ba3a0f1cb1420d071329c57f4172bd80094e8f03fa3f250f2a70 Dec 05 13:07:09.396376 master-0 kubenswrapper[29936]: I1205 13:07:09.396300 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" event={"ID":"d702847e-681f-49cd-8d28-029cae3b4bf5","Type":"ContainerStarted","Data":"fd92afb878bf74c89dca66d4fb5262c276ca5ba46225700da9eae52d70d2b55f"} Dec 05 13:07:09.397538 master-0 kubenswrapper[29936]: I1205 13:07:09.397467 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:09.420731 master-0 kubenswrapper[29936]: I1205 13:07:09.419661 29936 generic.go:334] "Generic (PLEG): container finished" podID="188a05a6-3a61-4472-b383-f44f2d022d08" containerID="5e5f4ceda6016dd2c94bbad5e06f1aed8177635a7cbbb6f34898b4dcbe61a46d" exitCode=0 Dec 05 13:07:09.420731 master-0 kubenswrapper[29936]: I1205 13:07:09.419792 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-2c1e-account-create-update-6qdwp" event={"ID":"188a05a6-3a61-4472-b383-f44f2d022d08","Type":"ContainerDied","Data":"5e5f4ceda6016dd2c94bbad5e06f1aed8177635a7cbbb6f34898b4dcbe61a46d"} Dec 05 13:07:09.482830 master-0 kubenswrapper[29936]: I1205 13:07:09.448720 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-m94x4" event={"ID":"e7777eb1-fa7a-4f4b-8887-da54c42cff61","Type":"ContainerDied","Data":"0f7795ccc26f4d82d27c525531e191fb9e7294703cadfbaa44d20b1b6da1f43e"} Dec 05 13:07:09.482830 master-0 kubenswrapper[29936]: I1205 13:07:09.448789 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f7795ccc26f4d82d27c525531e191fb9e7294703cadfbaa44d20b1b6da1f43e" Dec 05 13:07:09.482830 master-0 kubenswrapper[29936]: I1205 13:07:09.448869 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-m94x4" Dec 05 13:07:09.482830 master-0 kubenswrapper[29936]: I1205 13:07:09.470737 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-6597984769-rbgpb"] Dec 05 13:07:09.482830 master-0 kubenswrapper[29936]: E1205 13:07:09.471558 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7777eb1-fa7a-4f4b-8887-da54c42cff61" containerName="mariadb-database-create" Dec 05 13:07:09.482830 master-0 kubenswrapper[29936]: I1205 13:07:09.471578 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7777eb1-fa7a-4f4b-8887-da54c42cff61" containerName="mariadb-database-create" Dec 05 13:07:09.482830 master-0 kubenswrapper[29936]: I1205 13:07:09.472003 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7777eb1-fa7a-4f4b-8887-da54c42cff61" containerName="mariadb-database-create" Dec 05 13:07:09.482830 master-0 kubenswrapper[29936]: I1205 13:07:09.473865 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" event={"ID":"26623288-0e6f-473a-9e87-508c2607b948","Type":"ContainerStarted","Data":"2361aa219dc5ba3a0f1cb1420d071329c57f4172bd80094e8f03fa3f250f2a70"} Dec 05 13:07:09.482830 master-0 kubenswrapper[29936]: I1205 13:07:09.473992 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.482830 master-0 kubenswrapper[29936]: I1205 13:07:09.479480 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-public-svc" Dec 05 13:07:09.482830 master-0 kubenswrapper[29936]: I1205 13:07:09.480070 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-internal-svc" Dec 05 13:07:09.482830 master-0 kubenswrapper[29936]: I1205 13:07:09.481843 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" event={"ID":"1eb892f5-7ab8-4503-b7b8-1e233a1042bb","Type":"ContainerStarted","Data":"b6fdcf73cc7c1d340e500b2aec0f01c299d2ed131db9c697b9cea15492024369"} Dec 05 13:07:09.483306 master-0 kubenswrapper[29936]: I1205 13:07:09.482854 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" Dec 05 13:07:09.495283 master-0 kubenswrapper[29936]: I1205 13:07:09.495060 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" podStartSLOduration=5.495031803 podStartE2EDuration="5.495031803s" podCreationTimestamp="2025-12-05 13:07:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:07:09.488144035 +0000 UTC m=+1026.620223726" watchObservedRunningTime="2025-12-05 13:07:09.495031803 +0000 UTC m=+1026.627111484" Dec 05 13:07:09.563969 master-0 kubenswrapper[29936]: I1205 13:07:09.562766 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-6597984769-rbgpb"] Dec 05 13:07:09.682539 master-0 kubenswrapper[29936]: I1205 13:07:09.681407 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-config-data\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.682539 master-0 kubenswrapper[29936]: 
I1205 13:07:09.681488 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d3a21008-4842-4839-9821-e52a2487b796-etc-podinfo\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.682539 master-0 kubenswrapper[29936]: I1205 13:07:09.681585 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-public-tls-certs\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.682539 master-0 kubenswrapper[29936]: I1205 13:07:09.681697 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-internal-tls-certs\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.682539 master-0 kubenswrapper[29936]: I1205 13:07:09.681771 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp9cf\" (UniqueName: \"kubernetes.io/projected/d3a21008-4842-4839-9821-e52a2487b796-kube-api-access-vp9cf\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.682539 master-0 kubenswrapper[29936]: I1205 13:07:09.681815 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d3a21008-4842-4839-9821-e52a2487b796-config-data-merged\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.682539 master-0 kubenswrapper[29936]: I1205 13:07:09.681871 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a21008-4842-4839-9821-e52a2487b796-logs\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.682539 master-0 kubenswrapper[29936]: I1205 13:07:09.681944 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-config-data-custom\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.682539 master-0 kubenswrapper[29936]: I1205 13:07:09.682006 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-combined-ca-bundle\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.682539 master-0 kubenswrapper[29936]: I1205 13:07:09.682057 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-scripts\") pod 
\"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.700047 master-0 kubenswrapper[29936]: I1205 13:07:09.699735 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" podStartSLOduration=3.271315889 podStartE2EDuration="5.699703547s" podCreationTimestamp="2025-12-05 13:07:04 +0000 UTC" firstStartedPulling="2025-12-05 13:07:06.176414059 +0000 UTC m=+1023.308493740" lastFinishedPulling="2025-12-05 13:07:08.604801717 +0000 UTC m=+1025.736881398" observedRunningTime="2025-12-05 13:07:09.619439855 +0000 UTC m=+1026.751519556" watchObservedRunningTime="2025-12-05 13:07:09.699703547 +0000 UTC m=+1026.831783238" Dec 05 13:07:09.805718 master-0 kubenswrapper[29936]: I1205 13:07:09.805619 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-config-data\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.806053 master-0 kubenswrapper[29936]: I1205 13:07:09.805834 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d3a21008-4842-4839-9821-e52a2487b796-etc-podinfo\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.806053 master-0 kubenswrapper[29936]: I1205 13:07:09.806003 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-public-tls-certs\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.806203 master-0 kubenswrapper[29936]: I1205 13:07:09.806103 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-internal-tls-certs\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.806203 master-0 kubenswrapper[29936]: I1205 13:07:09.806162 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp9cf\" (UniqueName: \"kubernetes.io/projected/d3a21008-4842-4839-9821-e52a2487b796-kube-api-access-vp9cf\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.806269 master-0 kubenswrapper[29936]: I1205 13:07:09.806225 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d3a21008-4842-4839-9821-e52a2487b796-config-data-merged\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.806303 master-0 kubenswrapper[29936]: I1205 13:07:09.806273 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a21008-4842-4839-9821-e52a2487b796-logs\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.806374 master-0 
kubenswrapper[29936]: I1205 13:07:09.806346 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-config-data-custom\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.806439 master-0 kubenswrapper[29936]: I1205 13:07:09.806408 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-combined-ca-bundle\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.806492 master-0 kubenswrapper[29936]: I1205 13:07:09.806467 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-scripts\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.806941 master-0 kubenswrapper[29936]: I1205 13:07:09.806894 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d3a21008-4842-4839-9821-e52a2487b796-config-data-merged\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.807033 master-0 kubenswrapper[29936]: I1205 13:07:09.806893 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a21008-4842-4839-9821-e52a2487b796-logs\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.812965 master-0 kubenswrapper[29936]: I1205 13:07:09.812386 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-internal-tls-certs\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.812965 master-0 kubenswrapper[29936]: I1205 13:07:09.812805 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-public-tls-certs\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.816437 master-0 kubenswrapper[29936]: I1205 13:07:09.815328 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-scripts\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.830860 master-0 kubenswrapper[29936]: I1205 13:07:09.830801 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-config-data\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.832397 master-0 kubenswrapper[29936]: I1205 13:07:09.832352 29936 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-config-data-custom\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.832989 master-0 kubenswrapper[29936]: I1205 13:07:09.832894 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a21008-4842-4839-9821-e52a2487b796-combined-ca-bundle\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.835635 master-0 kubenswrapper[29936]: I1205 13:07:09.835582 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp9cf\" (UniqueName: \"kubernetes.io/projected/d3a21008-4842-4839-9821-e52a2487b796-kube-api-access-vp9cf\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.846832 master-0 kubenswrapper[29936]: I1205 13:07:09.846774 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d3a21008-4842-4839-9821-e52a2487b796-etc-podinfo\") pod \"ironic-6597984769-rbgpb\" (UID: \"d3a21008-4842-4839-9821-e52a2487b796\") " pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.860144 master-0 kubenswrapper[29936]: I1205 13:07:09.860042 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:07:09.887416 master-0 kubenswrapper[29936]: I1205 13:07:09.887341 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:09.933883 master-0 kubenswrapper[29936]: I1205 13:07:09.933767 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-687c55df6d-h9cdt" Dec 05 13:07:10.517295 master-0 kubenswrapper[29936]: I1205 13:07:10.517125 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Dec 05 13:07:10.522523 master-0 kubenswrapper[29936]: I1205 13:07:10.520902 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" event={"ID":"26623288-0e6f-473a-9e87-508c2607b948","Type":"ContainerStarted","Data":"3e5e0f67fff865e60421b941aab69e9f48d04437878fb133423d3989beb14c59"} Dec 05 13:07:10.544059 master-0 kubenswrapper[29936]: I1205 13:07:10.544004 29936 generic.go:334] "Generic (PLEG): container finished" podID="062a3b87-3828-49fb-8b3b-099ef02fe5a3" containerID="70707515162ff0f31d64d73855efdd346ed6d3a1dd42e12a3b6d12f42fa16bf3" exitCode=0 Dec 05 13:07:10.544313 master-0 kubenswrapper[29936]: I1205 13:07:10.544286 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-scheduler-0" event={"ID":"062a3b87-3828-49fb-8b3b-099ef02fe5a3","Type":"ContainerDied","Data":"70707515162ff0f31d64d73855efdd346ed6d3a1dd42e12a3b6d12f42fa16bf3"} Dec 05 13:07:10.943449 master-0 kubenswrapper[29936]: I1205 13:07:10.940608 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-6597984769-rbgpb"] Dec 05 13:07:11.567483 master-0 kubenswrapper[29936]: I1205 13:07:11.567387 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" 
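The mount sequence for ironic-6597984769-rbgpb above follows the kubelet's usual two-step pattern: reconciler_common.go:218 logs "operationExecutor.MountVolume started" for each declared volume, and operation_generator.go:637 confirms "MountVolume.SetUp succeeded" once the mount is in place. A quick way to spot a volume that never completes is to diff the two sets per pod. The sketch below is illustrative only: it assumes this journal excerpt has been saved to a plain-text file (kubelet.log is a made-up name) and keys on the exact message strings visible above.

```python
import re
from collections import defaultdict

# Keyed on the messages above: reconciler_common.go:218 logs the start,
# operation_generator.go:637 the success. Volume names sit in (optionally
# backslash-escaped) quotes; the owning pod is the structured pod="<ns>/<name>" field.
STARTED = re.compile(r'operationExecutor\.MountVolume started for volume \\?"(?P<vol>[^"\\]+)\\?".*?pod="(?P<pod>[^"]+)"')
DONE = re.compile(r'MountVolume\.SetUp succeeded for volume \\?"(?P<vol>[^"\\]+)\\?".*?pod="(?P<pod>[^"]+)"')

text = open("kubelet.log").read()          # hypothetical file holding this excerpt
started, done = defaultdict(set), defaultdict(set)
for m in STARTED.finditer(text):
    started[m["pod"]].add(m["vol"])
for m in DONE.finditer(text):
    done[m["pod"]].add(m["vol"])

for pod, vols in sorted(started.items()):
    pending = vols - done[pod]
    print(pod, "pending mounts:", sorted(pending) if pending else "none")
```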
event={"ID":"26623288-0e6f-473a-9e87-508c2607b948","Type":"ContainerStarted","Data":"69de49074869e66188dcbfeec2c70e582e904cd23b6c22c04097e8132877f921"} Dec 05 13:07:11.572362 master-0 kubenswrapper[29936]: I1205 13:07:11.572260 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"cc2221d6-014e-4bd4-962b-24512ebf84e8","Type":"ContainerStarted","Data":"f0440cac3ec95db6416f4886c0830f1a6e57b1a112112bb0393b37d1ecc197c6"} Dec 05 13:07:11.572362 master-0 kubenswrapper[29936]: I1205 13:07:11.572348 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"cc2221d6-014e-4bd4-962b-24512ebf84e8","Type":"ContainerStarted","Data":"17d62c70185813a10535dae12547b1b416fc29f3c5858bad53f7964ffbb30719"} Dec 05 13:07:11.855075 master-0 kubenswrapper[29936]: I1205 13:07:11.854888 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" podStartSLOduration=4.854856832 podStartE2EDuration="4.854856832s" podCreationTimestamp="2025-12-05 13:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:07:11.834909277 +0000 UTC m=+1028.966988968" watchObservedRunningTime="2025-12-05 13:07:11.854856832 +0000 UTC m=+1028.986936513" Dec 05 13:07:12.094508 master-0 kubenswrapper[29936]: I1205 13:07:12.094449 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-b46d8-api-0" Dec 05 13:07:12.211851 master-0 kubenswrapper[29936]: W1205 13:07:12.205877 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3a21008_4842_4839_9821_e52a2487b796.slice/crio-21aca71592373d2dd7b9c8186d616498ca68f60a65b467345b4cc45b6e1e6ee8 WatchSource:0}: Error finding container 21aca71592373d2dd7b9c8186d616498ca68f60a65b467345b4cc45b6e1e6ee8: Status 404 returned error can't find the container with id 21aca71592373d2dd7b9c8186d616498ca68f60a65b467345b4cc45b6e1e6ee8 Dec 05 13:07:12.383706 master-0 kubenswrapper[29936]: I1205 13:07:12.383653 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:12.448021 master-0 kubenswrapper[29936]: I1205 13:07:12.447922 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-2c1e-account-create-update-6qdwp" Dec 05 13:07:12.520857 master-0 kubenswrapper[29936]: I1205 13:07:12.520260 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-config-data\") pod \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " Dec 05 13:07:12.520857 master-0 kubenswrapper[29936]: I1205 13:07:12.520398 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-combined-ca-bundle\") pod \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " Dec 05 13:07:12.520857 master-0 kubenswrapper[29936]: I1205 13:07:12.520461 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-scripts\") pod \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " Dec 05 13:07:12.520857 master-0 kubenswrapper[29936]: I1205 13:07:12.520495 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/188a05a6-3a61-4472-b383-f44f2d022d08-operator-scripts\") pod \"188a05a6-3a61-4472-b383-f44f2d022d08\" (UID: \"188a05a6-3a61-4472-b383-f44f2d022d08\") " Dec 05 13:07:12.520857 master-0 kubenswrapper[29936]: I1205 13:07:12.520576 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpdcc\" (UniqueName: \"kubernetes.io/projected/188a05a6-3a61-4472-b383-f44f2d022d08-kube-api-access-wpdcc\") pod \"188a05a6-3a61-4472-b383-f44f2d022d08\" (UID: \"188a05a6-3a61-4472-b383-f44f2d022d08\") " Dec 05 13:07:12.520857 master-0 kubenswrapper[29936]: I1205 13:07:12.520621 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9wjg\" (UniqueName: \"kubernetes.io/projected/062a3b87-3828-49fb-8b3b-099ef02fe5a3-kube-api-access-c9wjg\") pod \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " Dec 05 13:07:12.520857 master-0 kubenswrapper[29936]: I1205 13:07:12.520771 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-config-data-custom\") pod \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " Dec 05 13:07:12.520857 master-0 kubenswrapper[29936]: I1205 13:07:12.520814 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/062a3b87-3828-49fb-8b3b-099ef02fe5a3-etc-machine-id\") pod \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\" (UID: \"062a3b87-3828-49fb-8b3b-099ef02fe5a3\") " Dec 05 13:07:12.521844 master-0 kubenswrapper[29936]: I1205 13:07:12.521796 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/188a05a6-3a61-4472-b383-f44f2d022d08-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "188a05a6-3a61-4472-b383-f44f2d022d08" (UID: "188a05a6-3a61-4472-b383-f44f2d022d08"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:12.523837 master-0 kubenswrapper[29936]: I1205 13:07:12.523697 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/062a3b87-3828-49fb-8b3b-099ef02fe5a3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "062a3b87-3828-49fb-8b3b-099ef02fe5a3" (UID: "062a3b87-3828-49fb-8b3b-099ef02fe5a3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 13:07:12.556124 master-0 kubenswrapper[29936]: I1205 13:07:12.556019 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/062a3b87-3828-49fb-8b3b-099ef02fe5a3-kube-api-access-c9wjg" (OuterVolumeSpecName: "kube-api-access-c9wjg") pod "062a3b87-3828-49fb-8b3b-099ef02fe5a3" (UID: "062a3b87-3828-49fb-8b3b-099ef02fe5a3"). InnerVolumeSpecName "kube-api-access-c9wjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:12.561975 master-0 kubenswrapper[29936]: I1205 13:07:12.561910 29936 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/062a3b87-3828-49fb-8b3b-099ef02fe5a3-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:12.561975 master-0 kubenswrapper[29936]: I1205 13:07:12.561961 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/188a05a6-3a61-4472-b383-f44f2d022d08-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:12.561975 master-0 kubenswrapper[29936]: I1205 13:07:12.561975 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9wjg\" (UniqueName: \"kubernetes.io/projected/062a3b87-3828-49fb-8b3b-099ef02fe5a3-kube-api-access-c9wjg\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:12.577765 master-0 kubenswrapper[29936]: I1205 13:07:12.577641 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "062a3b87-3828-49fb-8b3b-099ef02fe5a3" (UID: "062a3b87-3828-49fb-8b3b-099ef02fe5a3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:12.590303 master-0 kubenswrapper[29936]: I1205 13:07:12.588542 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-scripts" (OuterVolumeSpecName: "scripts") pod "062a3b87-3828-49fb-8b3b-099ef02fe5a3" (UID: "062a3b87-3828-49fb-8b3b-099ef02fe5a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:12.600313 master-0 kubenswrapper[29936]: I1205 13:07:12.599604 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/188a05a6-3a61-4472-b383-f44f2d022d08-kube-api-access-wpdcc" (OuterVolumeSpecName: "kube-api-access-wpdcc") pod "188a05a6-3a61-4472-b383-f44f2d022d08" (UID: "188a05a6-3a61-4472-b383-f44f2d022d08"). InnerVolumeSpecName "kube-api-access-wpdcc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:12.664293 master-0 kubenswrapper[29936]: I1205 13:07:12.664238 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpdcc\" (UniqueName: \"kubernetes.io/projected/188a05a6-3a61-4472-b383-f44f2d022d08-kube-api-access-wpdcc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:12.664293 master-0 kubenswrapper[29936]: I1205 13:07:12.664281 29936 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:12.664293 master-0 kubenswrapper[29936]: I1205 13:07:12.664293 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:12.690617 master-0 kubenswrapper[29936]: I1205 13:07:12.690533 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-scheduler-0" event={"ID":"062a3b87-3828-49fb-8b3b-099ef02fe5a3","Type":"ContainerDied","Data":"a37989d2f420716ea334bf7c018b26bffd325dfc42ef9329af3d2ccd5ca04567"} Dec 05 13:07:12.690770 master-0 kubenswrapper[29936]: I1205 13:07:12.690618 29936 scope.go:117] "RemoveContainer" containerID="72ae10d5d443f237be1f1fb9b2d450e0ae86534923bac9663291bac8056c11af" Dec 05 13:07:12.690822 master-0 kubenswrapper[29936]: I1205 13:07:12.690814 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:12.698923 master-0 kubenswrapper[29936]: I1205 13:07:12.698526 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6597984769-rbgpb" event={"ID":"d3a21008-4842-4839-9821-e52a2487b796","Type":"ContainerStarted","Data":"21aca71592373d2dd7b9c8186d616498ca68f60a65b467345b4cc45b6e1e6ee8"} Dec 05 13:07:12.707328 master-0 kubenswrapper[29936]: I1205 13:07:12.707235 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "062a3b87-3828-49fb-8b3b-099ef02fe5a3" (UID: "062a3b87-3828-49fb-8b3b-099ef02fe5a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:12.714742 master-0 kubenswrapper[29936]: I1205 13:07:12.714688 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-2c1e-account-create-update-6qdwp" Dec 05 13:07:12.716497 master-0 kubenswrapper[29936]: I1205 13:07:12.716328 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-2c1e-account-create-update-6qdwp" event={"ID":"188a05a6-3a61-4472-b383-f44f2d022d08","Type":"ContainerDied","Data":"973dc3fe506fd0b1629d2d4565678ca598a8ff8ffd809b74a9a1484c5710ad53"} Dec 05 13:07:12.716497 master-0 kubenswrapper[29936]: I1205 13:07:12.716433 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="973dc3fe506fd0b1629d2d4565678ca598a8ff8ffd809b74a9a1484c5710ad53" Dec 05 13:07:12.769567 master-0 kubenswrapper[29936]: I1205 13:07:12.766637 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:12.831501 master-0 kubenswrapper[29936]: I1205 13:07:12.831405 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-config-data" (OuterVolumeSpecName: "config-data") pod "062a3b87-3828-49fb-8b3b-099ef02fe5a3" (UID: "062a3b87-3828-49fb-8b3b-099ef02fe5a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:12.872422 master-0 kubenswrapper[29936]: I1205 13:07:12.872319 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/062a3b87-3828-49fb-8b3b-099ef02fe5a3-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:12.884283 master-0 kubenswrapper[29936]: I1205 13:07:12.884214 29936 scope.go:117] "RemoveContainer" containerID="70707515162ff0f31d64d73855efdd346ed6d3a1dd42e12a3b6d12f42fa16bf3" Dec 05 13:07:12.997949 master-0 kubenswrapper[29936]: I1205 13:07:12.997836 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:13.043347 master-0 kubenswrapper[29936]: I1205 13:07:13.043255 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b46d8-scheduler-0"] Dec 05 13:07:13.054166 master-0 kubenswrapper[29936]: I1205 13:07:13.052898 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b46d8-scheduler-0"] Dec 05 13:07:13.061299 master-0 kubenswrapper[29936]: I1205 13:07:13.061239 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b46d8-scheduler-0"] Dec 05 13:07:13.061821 master-0 kubenswrapper[29936]: E1205 13:07:13.061791 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="062a3b87-3828-49fb-8b3b-099ef02fe5a3" containerName="probe" Dec 05 13:07:13.061821 master-0 kubenswrapper[29936]: I1205 13:07:13.061813 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="062a3b87-3828-49fb-8b3b-099ef02fe5a3" containerName="probe" Dec 05 13:07:13.061928 master-0 kubenswrapper[29936]: E1205 13:07:13.061860 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="188a05a6-3a61-4472-b383-f44f2d022d08" containerName="mariadb-account-create-update" Dec 05 13:07:13.061928 master-0 kubenswrapper[29936]: I1205 13:07:13.061867 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="188a05a6-3a61-4472-b383-f44f2d022d08" containerName="mariadb-account-create-update" Dec 05 13:07:13.061928 master-0 kubenswrapper[29936]: E1205 13:07:13.061887 29936 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="062a3b87-3828-49fb-8b3b-099ef02fe5a3" containerName="cinder-scheduler" Dec 05 13:07:13.061928 master-0 kubenswrapper[29936]: I1205 13:07:13.061894 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="062a3b87-3828-49fb-8b3b-099ef02fe5a3" containerName="cinder-scheduler" Dec 05 13:07:13.062167 master-0 kubenswrapper[29936]: I1205 13:07:13.062141 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="062a3b87-3828-49fb-8b3b-099ef02fe5a3" containerName="cinder-scheduler" Dec 05 13:07:13.062167 master-0 kubenswrapper[29936]: I1205 13:07:13.062161 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="062a3b87-3828-49fb-8b3b-099ef02fe5a3" containerName="probe" Dec 05 13:07:13.062343 master-0 kubenswrapper[29936]: I1205 13:07:13.062214 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="188a05a6-3a61-4472-b383-f44f2d022d08" containerName="mariadb-account-create-update" Dec 05 13:07:13.064206 master-0 kubenswrapper[29936]: I1205 13:07:13.063439 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.066707 master-0 kubenswrapper[29936]: I1205 13:07:13.066629 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b46d8-scheduler-config-data" Dec 05 13:07:13.080650 master-0 kubenswrapper[29936]: I1205 13:07:13.078586 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-scheduler-0"] Dec 05 13:07:13.181226 master-0 kubenswrapper[29936]: I1205 13:07:13.179874 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d912173-2b25-46cd-87e9-4a0b9f821da7-config-data\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.181226 master-0 kubenswrapper[29936]: I1205 13:07:13.180276 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d912173-2b25-46cd-87e9-4a0b9f821da7-scripts\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.181226 master-0 kubenswrapper[29936]: I1205 13:07:13.180346 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d912173-2b25-46cd-87e9-4a0b9f821da7-config-data-custom\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.181226 master-0 kubenswrapper[29936]: I1205 13:07:13.180461 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkblp\" (UniqueName: \"kubernetes.io/projected/5d912173-2b25-46cd-87e9-4a0b9f821da7-kube-api-access-lkblp\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.181226 master-0 kubenswrapper[29936]: I1205 13:07:13.180504 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d912173-2b25-46cd-87e9-4a0b9f821da7-combined-ca-bundle\") pod \"cinder-b46d8-scheduler-0\" (UID: 
\"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.181226 master-0 kubenswrapper[29936]: I1205 13:07:13.180534 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d912173-2b25-46cd-87e9-4a0b9f821da7-etc-machine-id\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.217871 master-0 kubenswrapper[29936]: I1205 13:07:13.217793 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="062a3b87-3828-49fb-8b3b-099ef02fe5a3" path="/var/lib/kubelet/pods/062a3b87-3828-49fb-8b3b-099ef02fe5a3/volumes" Dec 05 13:07:13.287028 master-0 kubenswrapper[29936]: I1205 13:07:13.286884 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d912173-2b25-46cd-87e9-4a0b9f821da7-combined-ca-bundle\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.287028 master-0 kubenswrapper[29936]: I1205 13:07:13.286982 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d912173-2b25-46cd-87e9-4a0b9f821da7-etc-machine-id\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.287308 master-0 kubenswrapper[29936]: I1205 13:07:13.287040 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d912173-2b25-46cd-87e9-4a0b9f821da7-config-data\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.287308 master-0 kubenswrapper[29936]: I1205 13:07:13.287148 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d912173-2b25-46cd-87e9-4a0b9f821da7-scripts\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.287308 master-0 kubenswrapper[29936]: I1205 13:07:13.287224 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d912173-2b25-46cd-87e9-4a0b9f821da7-config-data-custom\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.290618 master-0 kubenswrapper[29936]: I1205 13:07:13.290578 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d912173-2b25-46cd-87e9-4a0b9f821da7-config-data-custom\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.291539 master-0 kubenswrapper[29936]: I1205 13:07:13.291457 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d912173-2b25-46cd-87e9-4a0b9f821da7-etc-machine-id\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.291962 master-0 kubenswrapper[29936]: 
I1205 13:07:13.291907 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkblp\" (UniqueName: \"kubernetes.io/projected/5d912173-2b25-46cd-87e9-4a0b9f821da7-kube-api-access-lkblp\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.292460 master-0 kubenswrapper[29936]: I1205 13:07:13.292432 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d912173-2b25-46cd-87e9-4a0b9f821da7-scripts\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.295227 master-0 kubenswrapper[29936]: I1205 13:07:13.295189 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d912173-2b25-46cd-87e9-4a0b9f821da7-config-data\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.302905 master-0 kubenswrapper[29936]: I1205 13:07:13.302810 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d912173-2b25-46cd-87e9-4a0b9f821da7-combined-ca-bundle\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.312625 master-0 kubenswrapper[29936]: I1205 13:07:13.312585 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkblp\" (UniqueName: \"kubernetes.io/projected/5d912173-2b25-46cd-87e9-4a0b9f821da7-kube-api-access-lkblp\") pod \"cinder-b46d8-scheduler-0\" (UID: \"5d912173-2b25-46cd-87e9-4a0b9f821da7\") " pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.437804 master-0 kubenswrapper[29936]: I1205 13:07:13.437741 29936 util.go:30] "No sandbox for pod can be found. 
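Several pods in this window log "No sandbox for pod can be found. Need to start a new one" (util.go:30) or "No ready sandbox for pod can be found. Need to start a new one" (util.go:48). The first form is expected for a pod that has never had a sandbox, such as ironic-6597984769-rbgpb or the recreated cinder-b46d8-scheduler-0; the second appears here for pods whose previous sandbox has exited (the completed account-create job, the scheduler pod being torn down), so repeats of it for one pod are a cheap churn signal. A rough tally, under the same saved-excerpt assumption:

```python
import re
from collections import Counter

# Both util.go variants end with the same sentence and the structured pod field.
SANDBOX = re.compile(r'No (?:ready )?sandbox for pod can be found\.\s*Need to start a new one" pod="(?P<pod>[^"]+)"')

text = open("kubelet.log").read()          # same hypothetical excerpt file
for pod, n in Counter(m["pod"] for m in SANDBOX.finditer(text)).most_common():
    print(f"{n:3d}  {pod}")
```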
Need to start a new one" pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:13.748774 master-0 kubenswrapper[29936]: I1205 13:07:13.748595 29936 generic.go:334] "Generic (PLEG): container finished" podID="d3a21008-4842-4839-9821-e52a2487b796" containerID="7ad4289833668a249178c5e60dfecabac47d346299bb3e955e2c41b52771f1ac" exitCode=0 Dec 05 13:07:13.749581 master-0 kubenswrapper[29936]: I1205 13:07:13.749044 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6597984769-rbgpb" event={"ID":"d3a21008-4842-4839-9821-e52a2487b796","Type":"ContainerDied","Data":"7ad4289833668a249178c5e60dfecabac47d346299bb3e955e2c41b52771f1ac"} Dec 05 13:07:13.756660 master-0 kubenswrapper[29936]: I1205 13:07:13.756607 29936 generic.go:334] "Generic (PLEG): container finished" podID="1eb892f5-7ab8-4503-b7b8-1e233a1042bb" containerID="b6fdcf73cc7c1d340e500b2aec0f01c299d2ed131db9c697b9cea15492024369" exitCode=1 Dec 05 13:07:13.756769 master-0 kubenswrapper[29936]: I1205 13:07:13.756650 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" event={"ID":"1eb892f5-7ab8-4503-b7b8-1e233a1042bb","Type":"ContainerDied","Data":"b6fdcf73cc7c1d340e500b2aec0f01c299d2ed131db9c697b9cea15492024369"} Dec 05 13:07:13.757392 master-0 kubenswrapper[29936]: I1205 13:07:13.757357 29936 scope.go:117] "RemoveContainer" containerID="b6fdcf73cc7c1d340e500b2aec0f01c299d2ed131db9c697b9cea15492024369" Dec 05 13:07:13.766356 master-0 kubenswrapper[29936]: I1205 13:07:13.766300 29936 generic.go:334] "Generic (PLEG): container finished" podID="66cad757-8699-4951-bbdf-b556fd09d35c" containerID="eda4e2a76db4cd9e0bce286d535bd746a0b461a78dd5cb560168d9615a4c194a" exitCode=0 Dec 05 13:07:13.766712 master-0 kubenswrapper[29936]: I1205 13:07:13.766644 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-5456ffdd9c-4qjcn" event={"ID":"66cad757-8699-4951-bbdf-b556fd09d35c","Type":"ContainerDied","Data":"eda4e2a76db4cd9e0bce286d535bd746a0b461a78dd5cb560168d9615a4c194a"} Dec 05 13:07:14.030655 master-0 kubenswrapper[29936]: I1205 13:07:14.030496 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-scheduler-0"] Dec 05 13:07:14.060016 master-0 kubenswrapper[29936]: W1205 13:07:14.059915 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d912173_2b25_46cd_87e9_4a0b9f821da7.slice/crio-ceb9b295e53f37662cc6a7e9036a281da97687aa4fbb73fc948ea82fa33ae900 WatchSource:0}: Error finding container ceb9b295e53f37662cc6a7e9036a281da97687aa4fbb73fc948ea82fa33ae900: Status 404 returned error can't find the container with id ceb9b295e53f37662cc6a7e9036a281da97687aa4fbb73fc948ea82fa33ae900 Dec 05 13:07:14.783047 master-0 kubenswrapper[29936]: I1205 13:07:14.782983 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-scheduler-0" event={"ID":"5d912173-2b25-46cd-87e9-4a0b9f821da7","Type":"ContainerStarted","Data":"ceb9b295e53f37662cc6a7e9036a281da97687aa4fbb73fc948ea82fa33ae900"} Dec 05 13:07:14.786922 master-0 kubenswrapper[29936]: I1205 13:07:14.786891 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-5456ffdd9c-4qjcn" event={"ID":"66cad757-8699-4951-bbdf-b556fd09d35c","Type":"ContainerStarted","Data":"448d5c5534bf38c56dd5b76a6f2b1b6e1b469dbe34505d31875849e2dc384c5f"} Dec 05 13:07:14.790588 master-0 kubenswrapper[29936]: I1205 13:07:14.790554 29936 generic.go:334] "Generic (PLEG): container 
finished" podID="cc2221d6-014e-4bd4-962b-24512ebf84e8" containerID="f0440cac3ec95db6416f4886c0830f1a6e57b1a112112bb0393b37d1ecc197c6" exitCode=0 Dec 05 13:07:14.790955 master-0 kubenswrapper[29936]: I1205 13:07:14.790687 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"cc2221d6-014e-4bd4-962b-24512ebf84e8","Type":"ContainerDied","Data":"f0440cac3ec95db6416f4886c0830f1a6e57b1a112112bb0393b37d1ecc197c6"} Dec 05 13:07:14.794964 master-0 kubenswrapper[29936]: I1205 13:07:14.794921 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6597984769-rbgpb" event={"ID":"d3a21008-4842-4839-9821-e52a2487b796","Type":"ContainerStarted","Data":"da92248e3e7f74a2a819139231bde1a3a25f6c3c3a0c4be07deb117144d3ca27"} Dec 05 13:07:14.799844 master-0 kubenswrapper[29936]: I1205 13:07:14.799806 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" event={"ID":"1eb892f5-7ab8-4503-b7b8-1e233a1042bb","Type":"ContainerStarted","Data":"0def71aab0e9502c799b2ec9bb2c3e775a167826b1f8d1c0f4ab4c73f4421b4c"} Dec 05 13:07:14.800209 master-0 kubenswrapper[29936]: I1205 13:07:14.800130 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" Dec 05 13:07:15.729226 master-0 kubenswrapper[29936]: I1205 13:07:15.728158 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:15.906323 master-0 kubenswrapper[29936]: I1205 13:07:15.905673 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f9d98978f-74tz7"] Dec 05 13:07:15.906323 master-0 kubenswrapper[29936]: I1205 13:07:15.906042 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" podUID="efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a" containerName="dnsmasq-dns" containerID="cri-o://ce098b005a8238a7192b09f7d5329140ebf27902854cc732266b80cc51a552c1" gracePeriod=10 Dec 05 13:07:15.925202 master-0 kubenswrapper[29936]: I1205 13:07:15.922820 29936 generic.go:334] "Generic (PLEG): container finished" podID="66cad757-8699-4951-bbdf-b556fd09d35c" containerID="0429f893b09608de8eed731f8debc25ba51617419d622643a97b0c285ee7140b" exitCode=1 Dec 05 13:07:15.934057 master-0 kubenswrapper[29936]: I1205 13:07:15.933433 29936 scope.go:117] "RemoveContainer" containerID="0429f893b09608de8eed731f8debc25ba51617419d622643a97b0c285ee7140b" Dec 05 13:07:15.952349 master-0 kubenswrapper[29936]: I1205 13:07:15.951901 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-5456ffdd9c-4qjcn" event={"ID":"66cad757-8699-4951-bbdf-b556fd09d35c","Type":"ContainerDied","Data":"0429f893b09608de8eed731f8debc25ba51617419d622643a97b0c285ee7140b"} Dec 05 13:07:15.952349 master-0 kubenswrapper[29936]: I1205 13:07:15.952009 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6597984769-rbgpb" event={"ID":"d3a21008-4842-4839-9821-e52a2487b796","Type":"ContainerStarted","Data":"f9f77aacb35e0b7aae6f4fbd648c21e0acd50f94efba677971a6e111659c54e5"} Dec 05 13:07:15.952349 master-0 kubenswrapper[29936]: I1205 13:07:15.952044 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:15.982409 master-0 kubenswrapper[29936]: I1205 13:07:15.982323 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-scheduler-0" 
event={"ID":"5d912173-2b25-46cd-87e9-4a0b9f821da7","Type":"ContainerStarted","Data":"66d16dd4e97dac68f09f71552ef601c3dd6794e36bacfbefb20c3c3b38d73f3e"} Dec 05 13:07:15.982409 master-0 kubenswrapper[29936]: I1205 13:07:15.982387 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-scheduler-0" event={"ID":"5d912173-2b25-46cd-87e9-4a0b9f821da7","Type":"ContainerStarted","Data":"8c28bfabd3758494b753aff6580e94cf3833b14a899c3da139afa2a2484e6f14"} Dec 05 13:07:16.069313 master-0 kubenswrapper[29936]: I1205 13:07:16.069130 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-6597984769-rbgpb" podStartSLOduration=6.61979034 podStartE2EDuration="7.069095421s" podCreationTimestamp="2025-12-05 13:07:09 +0000 UTC" firstStartedPulling="2025-12-05 13:07:12.22230472 +0000 UTC m=+1029.354384401" lastFinishedPulling="2025-12-05 13:07:12.671609811 +0000 UTC m=+1029.803689482" observedRunningTime="2025-12-05 13:07:16.015157009 +0000 UTC m=+1033.147236690" watchObservedRunningTime="2025-12-05 13:07:16.069095421 +0000 UTC m=+1033.201175102" Dec 05 13:07:16.194060 master-0 kubenswrapper[29936]: I1205 13:07:16.193934 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-b46d8-scheduler-0" podStartSLOduration=3.193901663 podStartE2EDuration="3.193901663s" podCreationTimestamp="2025-12-05 13:07:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:07:16.146412907 +0000 UTC m=+1033.278492608" watchObservedRunningTime="2025-12-05 13:07:16.193901663 +0000 UTC m=+1033.325981344" Dec 05 13:07:16.812427 master-0 kubenswrapper[29936]: I1205 13:07:16.812368 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:07:16.971754 master-0 kubenswrapper[29936]: I1205 13:07:16.971684 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-ovsdbserver-sb\") pod \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " Dec 05 13:07:16.972545 master-0 kubenswrapper[29936]: I1205 13:07:16.971890 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-config\") pod \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " Dec 05 13:07:16.972545 master-0 kubenswrapper[29936]: I1205 13:07:16.972175 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndmt4\" (UniqueName: \"kubernetes.io/projected/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-kube-api-access-ndmt4\") pod \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " Dec 05 13:07:16.972545 master-0 kubenswrapper[29936]: I1205 13:07:16.972281 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-dns-svc\") pod \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " Dec 05 13:07:16.972545 master-0 kubenswrapper[29936]: I1205 13:07:16.972326 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-dns-swift-storage-0\") pod \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " Dec 05 13:07:16.972545 master-0 kubenswrapper[29936]: I1205 13:07:16.972358 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-ovsdbserver-nb\") pod \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\" (UID: \"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a\") " Dec 05 13:07:16.980729 master-0 kubenswrapper[29936]: I1205 13:07:16.980639 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-kube-api-access-ndmt4" (OuterVolumeSpecName: "kube-api-access-ndmt4") pod "efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a" (UID: "efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a"). InnerVolumeSpecName "kube-api-access-ndmt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:17.009856 master-0 kubenswrapper[29936]: I1205 13:07:17.009749 29936 generic.go:334] "Generic (PLEG): container finished" podID="efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a" containerID="ce098b005a8238a7192b09f7d5329140ebf27902854cc732266b80cc51a552c1" exitCode=0 Dec 05 13:07:17.010156 master-0 kubenswrapper[29936]: I1205 13:07:17.010026 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" Dec 05 13:07:17.012106 master-0 kubenswrapper[29936]: I1205 13:07:17.012027 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" event={"ID":"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a","Type":"ContainerDied","Data":"ce098b005a8238a7192b09f7d5329140ebf27902854cc732266b80cc51a552c1"} Dec 05 13:07:17.012213 master-0 kubenswrapper[29936]: I1205 13:07:17.012112 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f9d98978f-74tz7" event={"ID":"efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a","Type":"ContainerDied","Data":"7e21017d8ec106b9b45de27fb2cc4416d064d4c60a9f6da2964a998629894b86"} Dec 05 13:07:17.012213 master-0 kubenswrapper[29936]: I1205 13:07:17.012136 29936 scope.go:117] "RemoveContainer" containerID="ce098b005a8238a7192b09f7d5329140ebf27902854cc732266b80cc51a552c1" Dec 05 13:07:17.016837 master-0 kubenswrapper[29936]: I1205 13:07:17.016788 29936 generic.go:334] "Generic (PLEG): container finished" podID="66cad757-8699-4951-bbdf-b556fd09d35c" containerID="2c8f48be2446be3c991b78efc0cda5c99c4afcd35cc8d791ea0a715b850048d3" exitCode=1 Dec 05 13:07:17.018942 master-0 kubenswrapper[29936]: I1205 13:07:17.018906 29936 scope.go:117] "RemoveContainer" containerID="2c8f48be2446be3c991b78efc0cda5c99c4afcd35cc8d791ea0a715b850048d3" Dec 05 13:07:17.019328 master-0 kubenswrapper[29936]: E1205 13:07:17.019245 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-5456ffdd9c-4qjcn_openstack(66cad757-8699-4951-bbdf-b556fd09d35c)\"" pod="openstack/ironic-5456ffdd9c-4qjcn" podUID="66cad757-8699-4951-bbdf-b556fd09d35c" Dec 05 13:07:17.019732 master-0 kubenswrapper[29936]: I1205 13:07:17.019668 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-5456ffdd9c-4qjcn" event={"ID":"66cad757-8699-4951-bbdf-b556fd09d35c","Type":"ContainerDied","Data":"2c8f48be2446be3c991b78efc0cda5c99c4afcd35cc8d791ea0a715b850048d3"} Dec 05 13:07:17.069247 master-0 kubenswrapper[29936]: I1205 13:07:17.065318 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a" (UID: "efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:17.081872 master-0 kubenswrapper[29936]: I1205 13:07:17.081746 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndmt4\" (UniqueName: \"kubernetes.io/projected/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-kube-api-access-ndmt4\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:17.081872 master-0 kubenswrapper[29936]: I1205 13:07:17.081802 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:17.088051 master-0 kubenswrapper[29936]: I1205 13:07:17.083034 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-config" (OuterVolumeSpecName: "config") pod "efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a" (UID: "efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a"). InnerVolumeSpecName "config". 
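ironic-api in ironic-5456ffdd9c-4qjcn keeps exiting with code 1, so pod_workers.go:1301 refuses to restart it immediately and reports CrashLoopBackOff with "back-off 10s". Kubernetes documents this restart back-off as starting at 10 seconds and doubling per crash up to a 5-minute cap, resetting after the container runs cleanly for 10 minutes, so the 10s seen here (and again for ironic-neutron-agent further down) is only the first step. A minimal sketch of that documented schedule:

```python
from itertools import islice

def crashloop_delays(start_s: int = 10, cap_s: int = 300):
    """Documented kubelet restart back-off: start at 10s, double per crash, cap at 5 minutes."""
    delay = start_s
    while True:
        yield min(delay, cap_s)
        delay *= 2

print(list(islice(crashloop_delays(), 8)))   # [10, 20, 40, 80, 160, 300, 300, 300]
```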
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:17.088051 master-0 kubenswrapper[29936]: I1205 13:07:17.087939 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a" (UID: "efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:17.102310 master-0 kubenswrapper[29936]: I1205 13:07:17.098031 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a" (UID: "efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:17.102310 master-0 kubenswrapper[29936]: I1205 13:07:17.100785 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a" (UID: "efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:17.190732 master-0 kubenswrapper[29936]: I1205 13:07:17.190601 29936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:17.190732 master-0 kubenswrapper[29936]: I1205 13:07:17.190703 29936 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:17.190732 master-0 kubenswrapper[29936]: I1205 13:07:17.190721 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:17.190732 master-0 kubenswrapper[29936]: I1205 13:07:17.190736 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:17.226407 master-0 kubenswrapper[29936]: I1205 13:07:17.226338 29936 scope.go:117] "RemoveContainer" containerID="8d6745c198e1faacc64cd7c72cd055b69995d71ba4d083e85a6a78e7ba3d9345" Dec 05 13:07:17.290742 master-0 kubenswrapper[29936]: I1205 13:07:17.290401 29936 scope.go:117] "RemoveContainer" containerID="ce098b005a8238a7192b09f7d5329140ebf27902854cc732266b80cc51a552c1" Dec 05 13:07:17.295602 master-0 kubenswrapper[29936]: E1205 13:07:17.294531 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce098b005a8238a7192b09f7d5329140ebf27902854cc732266b80cc51a552c1\": container with ID starting with ce098b005a8238a7192b09f7d5329140ebf27902854cc732266b80cc51a552c1 not found: ID does not exist" containerID="ce098b005a8238a7192b09f7d5329140ebf27902854cc732266b80cc51a552c1" Dec 05 13:07:17.295602 master-0 kubenswrapper[29936]: I1205 13:07:17.294626 29936 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ce098b005a8238a7192b09f7d5329140ebf27902854cc732266b80cc51a552c1"} err="failed to get container status \"ce098b005a8238a7192b09f7d5329140ebf27902854cc732266b80cc51a552c1\": rpc error: code = NotFound desc = could not find container \"ce098b005a8238a7192b09f7d5329140ebf27902854cc732266b80cc51a552c1\": container with ID starting with ce098b005a8238a7192b09f7d5329140ebf27902854cc732266b80cc51a552c1 not found: ID does not exist" Dec 05 13:07:17.295602 master-0 kubenswrapper[29936]: I1205 13:07:17.294672 29936 scope.go:117] "RemoveContainer" containerID="8d6745c198e1faacc64cd7c72cd055b69995d71ba4d083e85a6a78e7ba3d9345" Dec 05 13:07:17.295602 master-0 kubenswrapper[29936]: E1205 13:07:17.295534 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d6745c198e1faacc64cd7c72cd055b69995d71ba4d083e85a6a78e7ba3d9345\": container with ID starting with 8d6745c198e1faacc64cd7c72cd055b69995d71ba4d083e85a6a78e7ba3d9345 not found: ID does not exist" containerID="8d6745c198e1faacc64cd7c72cd055b69995d71ba4d083e85a6a78e7ba3d9345" Dec 05 13:07:17.297066 master-0 kubenswrapper[29936]: I1205 13:07:17.295669 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6745c198e1faacc64cd7c72cd055b69995d71ba4d083e85a6a78e7ba3d9345"} err="failed to get container status \"8d6745c198e1faacc64cd7c72cd055b69995d71ba4d083e85a6a78e7ba3d9345\": rpc error: code = NotFound desc = could not find container \"8d6745c198e1faacc64cd7c72cd055b69995d71ba4d083e85a6a78e7ba3d9345\": container with ID starting with 8d6745c198e1faacc64cd7c72cd055b69995d71ba4d083e85a6a78e7ba3d9345 not found: ID does not exist" Dec 05 13:07:17.297066 master-0 kubenswrapper[29936]: I1205 13:07:17.295823 29936 scope.go:117] "RemoveContainer" containerID="0429f893b09608de8eed731f8debc25ba51617419d622643a97b0c285ee7140b" Dec 05 13:07:17.420135 master-0 kubenswrapper[29936]: I1205 13:07:17.420059 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f9d98978f-74tz7"] Dec 05 13:07:17.432607 master-0 kubenswrapper[29936]: I1205 13:07:17.432436 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f9d98978f-74tz7"] Dec 05 13:07:18.066157 master-0 kubenswrapper[29936]: I1205 13:07:18.066066 29936 scope.go:117] "RemoveContainer" containerID="2c8f48be2446be3c991b78efc0cda5c99c4afcd35cc8d791ea0a715b850048d3" Dec 05 13:07:18.067334 master-0 kubenswrapper[29936]: E1205 13:07:18.066375 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-5456ffdd9c-4qjcn_openstack(66cad757-8699-4951-bbdf-b556fd09d35c)\"" pod="openstack/ironic-5456ffdd9c-4qjcn" podUID="66cad757-8699-4951-bbdf-b556fd09d35c" Dec 05 13:07:18.217385 master-0 kubenswrapper[29936]: I1205 13:07:18.217285 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6584d6f967-pjksk" Dec 05 13:07:18.307656 master-0 kubenswrapper[29936]: I1205 13:07:18.307586 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-b46d8-volume-lvm-iscsi-0" Dec 05 13:07:18.438760 master-0 kubenswrapper[29936]: I1205 13:07:18.438670 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:18.606135 master-0 kubenswrapper[29936]: 
I1205 13:07:18.604267 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-6597984769-rbgpb" Dec 05 13:07:18.772804 master-0 kubenswrapper[29936]: I1205 13:07:18.768538 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-5456ffdd9c-4qjcn"] Dec 05 13:07:19.086648 master-0 kubenswrapper[29936]: I1205 13:07:19.086403 29936 generic.go:334] "Generic (PLEG): container finished" podID="1eb892f5-7ab8-4503-b7b8-1e233a1042bb" containerID="0def71aab0e9502c799b2ec9bb2c3e775a167826b1f8d1c0f4ab4c73f4421b4c" exitCode=1 Dec 05 13:07:19.087350 master-0 kubenswrapper[29936]: I1205 13:07:19.086512 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" event={"ID":"1eb892f5-7ab8-4503-b7b8-1e233a1042bb","Type":"ContainerDied","Data":"0def71aab0e9502c799b2ec9bb2c3e775a167826b1f8d1c0f4ab4c73f4421b4c"} Dec 05 13:07:19.087524 master-0 kubenswrapper[29936]: I1205 13:07:19.087389 29936 scope.go:117] "RemoveContainer" containerID="b6fdcf73cc7c1d340e500b2aec0f01c299d2ed131db9c697b9cea15492024369" Dec 05 13:07:19.087824 master-0 kubenswrapper[29936]: I1205 13:07:19.087777 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ironic-5456ffdd9c-4qjcn" podUID="66cad757-8699-4951-bbdf-b556fd09d35c" containerName="ironic-api-log" containerID="cri-o://448d5c5534bf38c56dd5b76a6f2b1b6e1b469dbe34505d31875849e2dc384c5f" gracePeriod=60 Dec 05 13:07:19.088503 master-0 kubenswrapper[29936]: I1205 13:07:19.088479 29936 scope.go:117] "RemoveContainer" containerID="0def71aab0e9502c799b2ec9bb2c3e775a167826b1f8d1c0f4ab4c73f4421b4c" Dec 05 13:07:19.089587 master-0 kubenswrapper[29936]: E1205 13:07:19.088842 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-8458c7d7db-8c5lp_openstack(1eb892f5-7ab8-4503-b7b8-1e233a1042bb)\"" pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" podUID="1eb892f5-7ab8-4503-b7b8-1e233a1042bb" Dec 05 13:07:19.215753 master-0 kubenswrapper[29936]: I1205 13:07:19.215639 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a" path="/var/lib/kubelet/pods/efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a/volumes" Dec 05 13:07:19.769499 master-0 kubenswrapper[29936]: I1205 13:07:19.769390 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-sync-8czvf"] Dec 05 13:07:19.774651 master-0 kubenswrapper[29936]: E1205 13:07:19.770084 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a" containerName="dnsmasq-dns" Dec 05 13:07:19.774651 master-0 kubenswrapper[29936]: I1205 13:07:19.770101 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a" containerName="dnsmasq-dns" Dec 05 13:07:19.774651 master-0 kubenswrapper[29936]: E1205 13:07:19.770147 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a" containerName="init" Dec 05 13:07:19.774651 master-0 kubenswrapper[29936]: I1205 13:07:19.770155 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a" containerName="init" Dec 05 13:07:19.774651 master-0 kubenswrapper[29936]: I1205 13:07:19.770541 29936 memory_manager.go:354] "RemoveStaleState removing state" 
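Cleanup after the deleted dnsmasq-dns-5f9d98978f-74tz7 pod produces some alarming-looking but typically benign errors above: RemoveContainer is attempted for containers CRI-O has already discarded, so the runtime answers NotFound ("ID does not exist") and the kubelet logs "DeleteContainer returned error" even though the end state, container gone, is exactly what it wanted. The cpu_manager and memory_manager "RemoveStaleState" lines are the same kind of housekeeping for the pod's resource-manager state. A rough filter that lists the container IDs whose removal failed only because they were already gone, using the same saved excerpt:

```python
import re

# scope.go:117 logs the removal attempt; the NotFound error quotes the same 64-hex ID.
REMOVE = re.compile(r'"RemoveContainer" containerID="(?P<cid>[0-9a-f]{64})"')
NOTFOUND = re.compile(r'could not find container \\?"(?P<cid>[0-9a-f]{64})\\?"')

text = open("kubelet.log").read()          # same hypothetical excerpt file
asked = set(REMOVE.findall(text))
gone = set(NOTFOUND.findall(text))
for cid in sorted(asked & gone):
    print("removal requested but container already gone:", cid[:12])
```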
podUID="efb9ef7b-31f5-4aa0-a5cf-da58084e3b3a" containerName="dnsmasq-dns" Dec 05 13:07:19.774651 master-0 kubenswrapper[29936]: I1205 13:07:19.771952 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.776580 master-0 kubenswrapper[29936]: I1205 13:07:19.776542 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Dec 05 13:07:19.777408 master-0 kubenswrapper[29936]: I1205 13:07:19.777218 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Dec 05 13:07:19.822320 master-0 kubenswrapper[29936]: I1205 13:07:19.822253 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-8czvf"] Dec 05 13:07:19.845392 master-0 kubenswrapper[29936]: I1205 13:07:19.845307 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-combined-ca-bundle\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.845698 master-0 kubenswrapper[29936]: I1205 13:07:19.845458 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-scripts\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.845698 master-0 kubenswrapper[29936]: I1205 13:07:19.845516 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7hdf\" (UniqueName: \"kubernetes.io/projected/32628c1d-e798-427e-97ca-322d9af2971e-kube-api-access-h7hdf\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.845698 master-0 kubenswrapper[29936]: I1205 13:07:19.845560 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/32628c1d-e798-427e-97ca-322d9af2971e-etc-podinfo\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.845698 master-0 kubenswrapper[29936]: I1205 13:07:19.845632 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/32628c1d-e798-427e-97ca-322d9af2971e-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.845698 master-0 kubenswrapper[29936]: I1205 13:07:19.845679 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-config\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.845917 master-0 kubenswrapper[29936]: I1205 13:07:19.845713 29936 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/32628c1d-e798-427e-97ca-322d9af2971e-var-lib-ironic\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.950309 master-0 kubenswrapper[29936]: I1205 13:07:19.948027 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-scripts\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.950309 master-0 kubenswrapper[29936]: I1205 13:07:19.948128 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7hdf\" (UniqueName: \"kubernetes.io/projected/32628c1d-e798-427e-97ca-322d9af2971e-kube-api-access-h7hdf\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.950309 master-0 kubenswrapper[29936]: I1205 13:07:19.948222 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/32628c1d-e798-427e-97ca-322d9af2971e-etc-podinfo\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.950309 master-0 kubenswrapper[29936]: I1205 13:07:19.948279 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/32628c1d-e798-427e-97ca-322d9af2971e-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.950309 master-0 kubenswrapper[29936]: I1205 13:07:19.948310 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-config\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.950309 master-0 kubenswrapper[29936]: I1205 13:07:19.948338 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/32628c1d-e798-427e-97ca-322d9af2971e-var-lib-ironic\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.950309 master-0 kubenswrapper[29936]: I1205 13:07:19.949209 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-combined-ca-bundle\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.951454 master-0 kubenswrapper[29936]: I1205 13:07:19.951298 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/32628c1d-e798-427e-97ca-322d9af2971e-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-8czvf\" (UID: 
\"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.951454 master-0 kubenswrapper[29936]: I1205 13:07:19.951408 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/32628c1d-e798-427e-97ca-322d9af2971e-var-lib-ironic\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.954010 master-0 kubenswrapper[29936]: I1205 13:07:19.953945 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/32628c1d-e798-427e-97ca-322d9af2971e-etc-podinfo\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.954099 master-0 kubenswrapper[29936]: I1205 13:07:19.954067 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-scripts\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.955291 master-0 kubenswrapper[29936]: I1205 13:07:19.954782 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-combined-ca-bundle\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.957469 master-0 kubenswrapper[29936]: I1205 13:07:19.957407 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-config\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:19.970295 master-0 kubenswrapper[29936]: I1205 13:07:19.970209 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7hdf\" (UniqueName: \"kubernetes.io/projected/32628c1d-e798-427e-97ca-322d9af2971e-kube-api-access-h7hdf\") pod \"ironic-inspector-db-sync-8czvf\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:20.108535 master-0 kubenswrapper[29936]: I1205 13:07:20.108411 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:20.113690 master-0 kubenswrapper[29936]: I1205 13:07:20.113632 29936 generic.go:334] "Generic (PLEG): container finished" podID="66cad757-8699-4951-bbdf-b556fd09d35c" containerID="448d5c5534bf38c56dd5b76a6f2b1b6e1b469dbe34505d31875849e2dc384c5f" exitCode=143 Dec 05 13:07:20.113783 master-0 kubenswrapper[29936]: I1205 13:07:20.113702 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-5456ffdd9c-4qjcn" event={"ID":"66cad757-8699-4951-bbdf-b556fd09d35c","Type":"ContainerDied","Data":"448d5c5534bf38c56dd5b76a6f2b1b6e1b469dbe34505d31875849e2dc384c5f"} Dec 05 13:07:20.365871 master-0 kubenswrapper[29936]: I1205 13:07:20.365776 29936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" Dec 05 13:07:20.366473 master-0 kubenswrapper[29936]: I1205 13:07:20.366445 29936 scope.go:117] "RemoveContainer" containerID="0def71aab0e9502c799b2ec9bb2c3e775a167826b1f8d1c0f4ab4c73f4421b4c" Dec 05 13:07:20.366766 master-0 kubenswrapper[29936]: E1205 13:07:20.366732 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-8458c7d7db-8c5lp_openstack(1eb892f5-7ab8-4503-b7b8-1e233a1042bb)\"" pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" podUID="1eb892f5-7ab8-4503-b7b8-1e233a1042bb" Dec 05 13:07:20.916042 master-0 kubenswrapper[29936]: I1205 13:07:20.915951 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:21.458644 master-0 kubenswrapper[29936]: I1205 13:07:21.457331 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:21.554468 master-0 kubenswrapper[29936]: I1205 13:07:21.553927 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-8czvf"] Dec 05 13:07:21.586910 master-0 kubenswrapper[29936]: I1205 13:07:21.586842 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 05 13:07:21.608498 master-0 kubenswrapper[29936]: E1205 13:07:21.608411 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66cad757-8699-4951-bbdf-b556fd09d35c" containerName="ironic-api" Dec 05 13:07:21.608498 master-0 kubenswrapper[29936]: I1205 13:07:21.608488 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="66cad757-8699-4951-bbdf-b556fd09d35c" containerName="ironic-api" Dec 05 13:07:21.608747 master-0 kubenswrapper[29936]: E1205 13:07:21.608566 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66cad757-8699-4951-bbdf-b556fd09d35c" containerName="init" Dec 05 13:07:21.608747 master-0 kubenswrapper[29936]: I1205 13:07:21.608574 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="66cad757-8699-4951-bbdf-b556fd09d35c" containerName="init" Dec 05 13:07:21.608747 master-0 kubenswrapper[29936]: E1205 13:07:21.608585 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66cad757-8699-4951-bbdf-b556fd09d35c" containerName="ironic-api" Dec 05 13:07:21.608747 master-0 kubenswrapper[29936]: I1205 13:07:21.608591 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="66cad757-8699-4951-bbdf-b556fd09d35c" containerName="ironic-api" Dec 05 13:07:21.608747 master-0 kubenswrapper[29936]: E1205 13:07:21.608616 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66cad757-8699-4951-bbdf-b556fd09d35c" containerName="ironic-api-log" Dec 05 13:07:21.608747 master-0 kubenswrapper[29936]: I1205 13:07:21.608622 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="66cad757-8699-4951-bbdf-b556fd09d35c" containerName="ironic-api-log" Dec 05 13:07:21.609396 master-0 kubenswrapper[29936]: I1205 13:07:21.609337 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="66cad757-8699-4951-bbdf-b556fd09d35c" containerName="ironic-api-log" Dec 05 13:07:21.609396 master-0 kubenswrapper[29936]: I1205 13:07:21.609399 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="66cad757-8699-4951-bbdf-b556fd09d35c" containerName="ironic-api" Dec 05 13:07:21.610962 master-0 kubenswrapper[29936]: I1205 13:07:21.610423 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 13:07:21.615938 master-0 kubenswrapper[29936]: I1205 13:07:21.614091 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 05 13:07:21.615938 master-0 kubenswrapper[29936]: I1205 13:07:21.615029 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 13:07:21.622148 master-0 kubenswrapper[29936]: I1205 13:07:21.621902 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 05 13:07:21.669331 master-0 kubenswrapper[29936]: I1205 13:07:21.669253 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66cad757-8699-4951-bbdf-b556fd09d35c-logs\") pod \"66cad757-8699-4951-bbdf-b556fd09d35c\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " Dec 05 13:07:21.669795 master-0 kubenswrapper[29936]: I1205 13:07:21.669735 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66cad757-8699-4951-bbdf-b556fd09d35c-logs" (OuterVolumeSpecName: "logs") pod "66cad757-8699-4951-bbdf-b556fd09d35c" (UID: "66cad757-8699-4951-bbdf-b556fd09d35c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:07:21.670421 master-0 kubenswrapper[29936]: I1205 13:07:21.670382 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-config-data-custom\") pod \"66cad757-8699-4951-bbdf-b556fd09d35c\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " Dec 05 13:07:21.670476 master-0 kubenswrapper[29936]: I1205 13:07:21.670443 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g45r8\" (UniqueName: \"kubernetes.io/projected/66cad757-8699-4951-bbdf-b556fd09d35c-kube-api-access-g45r8\") pod \"66cad757-8699-4951-bbdf-b556fd09d35c\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " Dec 05 13:07:21.670528 master-0 kubenswrapper[29936]: I1205 13:07:21.670479 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-config-data\") pod \"66cad757-8699-4951-bbdf-b556fd09d35c\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " Dec 05 13:07:21.670642 master-0 kubenswrapper[29936]: I1205 13:07:21.670615 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/66cad757-8699-4951-bbdf-b556fd09d35c-config-data-merged\") pod \"66cad757-8699-4951-bbdf-b556fd09d35c\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " Dec 05 13:07:21.670723 master-0 kubenswrapper[29936]: I1205 13:07:21.670649 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-combined-ca-bundle\") pod \"66cad757-8699-4951-bbdf-b556fd09d35c\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " Dec 05 13:07:21.670723 master-0 kubenswrapper[29936]: I1205 13:07:21.670715 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/66cad757-8699-4951-bbdf-b556fd09d35c-etc-podinfo\") pod \"66cad757-8699-4951-bbdf-b556fd09d35c\" (UID: 
\"66cad757-8699-4951-bbdf-b556fd09d35c\") " Dec 05 13:07:21.670806 master-0 kubenswrapper[29936]: I1205 13:07:21.670771 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-scripts\") pod \"66cad757-8699-4951-bbdf-b556fd09d35c\" (UID: \"66cad757-8699-4951-bbdf-b556fd09d35c\") " Dec 05 13:07:21.671024 master-0 kubenswrapper[29936]: I1205 13:07:21.670991 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49bzl\" (UniqueName: \"kubernetes.io/projected/be50610c-4538-4753-a4a8-36a3aa7de72d-kube-api-access-49bzl\") pod \"openstackclient\" (UID: \"be50610c-4538-4753-a4a8-36a3aa7de72d\") " pod="openstack/openstackclient" Dec 05 13:07:21.671095 master-0 kubenswrapper[29936]: I1205 13:07:21.671067 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be50610c-4538-4753-a4a8-36a3aa7de72d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"be50610c-4538-4753-a4a8-36a3aa7de72d\") " pod="openstack/openstackclient" Dec 05 13:07:21.671259 master-0 kubenswrapper[29936]: I1205 13:07:21.671229 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be50610c-4538-4753-a4a8-36a3aa7de72d-openstack-config-secret\") pod \"openstackclient\" (UID: \"be50610c-4538-4753-a4a8-36a3aa7de72d\") " pod="openstack/openstackclient" Dec 05 13:07:21.671372 master-0 kubenswrapper[29936]: I1205 13:07:21.671329 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be50610c-4538-4753-a4a8-36a3aa7de72d-openstack-config\") pod \"openstackclient\" (UID: \"be50610c-4538-4753-a4a8-36a3aa7de72d\") " pod="openstack/openstackclient" Dec 05 13:07:21.671476 master-0 kubenswrapper[29936]: I1205 13:07:21.671451 29936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66cad757-8699-4951-bbdf-b556fd09d35c-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:21.681639 master-0 kubenswrapper[29936]: I1205 13:07:21.681513 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66cad757-8699-4951-bbdf-b556fd09d35c-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "66cad757-8699-4951-bbdf-b556fd09d35c" (UID: "66cad757-8699-4951-bbdf-b556fd09d35c"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:07:21.683571 master-0 kubenswrapper[29936]: I1205 13:07:21.683217 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66cad757-8699-4951-bbdf-b556fd09d35c-kube-api-access-g45r8" (OuterVolumeSpecName: "kube-api-access-g45r8") pod "66cad757-8699-4951-bbdf-b556fd09d35c" (UID: "66cad757-8699-4951-bbdf-b556fd09d35c"). InnerVolumeSpecName "kube-api-access-g45r8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:21.684398 master-0 kubenswrapper[29936]: I1205 13:07:21.684023 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/66cad757-8699-4951-bbdf-b556fd09d35c-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "66cad757-8699-4951-bbdf-b556fd09d35c" (UID: "66cad757-8699-4951-bbdf-b556fd09d35c"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 13:07:21.685789 master-0 kubenswrapper[29936]: I1205 13:07:21.685721 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-scripts" (OuterVolumeSpecName: "scripts") pod "66cad757-8699-4951-bbdf-b556fd09d35c" (UID: "66cad757-8699-4951-bbdf-b556fd09d35c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:21.695216 master-0 kubenswrapper[29936]: I1205 13:07:21.694867 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "66cad757-8699-4951-bbdf-b556fd09d35c" (UID: "66cad757-8699-4951-bbdf-b556fd09d35c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:21.738197 master-0 kubenswrapper[29936]: I1205 13:07:21.736542 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-config-data" (OuterVolumeSpecName: "config-data") pod "66cad757-8699-4951-bbdf-b556fd09d35c" (UID: "66cad757-8699-4951-bbdf-b556fd09d35c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:21.776377 master-0 kubenswrapper[29936]: I1205 13:07:21.776218 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49bzl\" (UniqueName: \"kubernetes.io/projected/be50610c-4538-4753-a4a8-36a3aa7de72d-kube-api-access-49bzl\") pod \"openstackclient\" (UID: \"be50610c-4538-4753-a4a8-36a3aa7de72d\") " pod="openstack/openstackclient" Dec 05 13:07:21.776377 master-0 kubenswrapper[29936]: I1205 13:07:21.776375 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be50610c-4538-4753-a4a8-36a3aa7de72d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"be50610c-4538-4753-a4a8-36a3aa7de72d\") " pod="openstack/openstackclient" Dec 05 13:07:21.776872 master-0 kubenswrapper[29936]: I1205 13:07:21.776644 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be50610c-4538-4753-a4a8-36a3aa7de72d-openstack-config-secret\") pod \"openstackclient\" (UID: \"be50610c-4538-4753-a4a8-36a3aa7de72d\") " pod="openstack/openstackclient" Dec 05 13:07:21.776872 master-0 kubenswrapper[29936]: I1205 13:07:21.776826 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be50610c-4538-4753-a4a8-36a3aa7de72d-openstack-config\") pod \"openstackclient\" (UID: \"be50610c-4538-4753-a4a8-36a3aa7de72d\") " pod="openstack/openstackclient" Dec 05 13:07:21.776991 master-0 kubenswrapper[29936]: I1205 13:07:21.776913 29936 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/66cad757-8699-4951-bbdf-b556fd09d35c-config-data-merged\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:21.776991 master-0 kubenswrapper[29936]: I1205 13:07:21.776928 29936 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/66cad757-8699-4951-bbdf-b556fd09d35c-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:21.776991 master-0 kubenswrapper[29936]: I1205 13:07:21.776938 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:21.776991 master-0 kubenswrapper[29936]: I1205 13:07:21.776947 29936 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:21.776991 master-0 kubenswrapper[29936]: I1205 13:07:21.776957 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g45r8\" (UniqueName: \"kubernetes.io/projected/66cad757-8699-4951-bbdf-b556fd09d35c-kube-api-access-g45r8\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:21.776991 master-0 kubenswrapper[29936]: I1205 13:07:21.776968 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:21.778838 master-0 kubenswrapper[29936]: E1205 13:07:21.778786 29936 projected.go:194] Error preparing data for projected volume kube-api-access-49bzl for pod openstack/openstackclient: failed to fetch token: 
serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:master-0" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'master-0' and this object Dec 05 13:07:21.778950 master-0 kubenswrapper[29936]: E1205 13:07:21.778875 29936 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be50610c-4538-4753-a4a8-36a3aa7de72d-kube-api-access-49bzl podName:be50610c-4538-4753-a4a8-36a3aa7de72d nodeName:}" failed. No retries permitted until 2025-12-05 13:07:22.278854695 +0000 UTC m=+1039.410934376 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-49bzl" (UniqueName: "kubernetes.io/projected/be50610c-4538-4753-a4a8-36a3aa7de72d-kube-api-access-49bzl") pod "openstackclient" (UID: "be50610c-4538-4753-a4a8-36a3aa7de72d") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:master-0" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'master-0' and this object Dec 05 13:07:21.779432 master-0 kubenswrapper[29936]: I1205 13:07:21.779388 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be50610c-4538-4753-a4a8-36a3aa7de72d-openstack-config\") pod \"openstackclient\" (UID: \"be50610c-4538-4753-a4a8-36a3aa7de72d\") " pod="openstack/openstackclient" Dec 05 13:07:21.782381 master-0 kubenswrapper[29936]: I1205 13:07:21.782333 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be50610c-4538-4753-a4a8-36a3aa7de72d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"be50610c-4538-4753-a4a8-36a3aa7de72d\") " pod="openstack/openstackclient" Dec 05 13:07:21.784102 master-0 kubenswrapper[29936]: I1205 13:07:21.784042 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66cad757-8699-4951-bbdf-b556fd09d35c" (UID: "66cad757-8699-4951-bbdf-b556fd09d35c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:21.788164 master-0 kubenswrapper[29936]: I1205 13:07:21.788113 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be50610c-4538-4753-a4a8-36a3aa7de72d-openstack-config-secret\") pod \"openstackclient\" (UID: \"be50610c-4538-4753-a4a8-36a3aa7de72d\") " pod="openstack/openstackclient" Dec 05 13:07:21.793082 master-0 kubenswrapper[29936]: I1205 13:07:21.793018 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 05 13:07:21.794311 master-0 kubenswrapper[29936]: E1205 13:07:21.794229 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-49bzl], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="be50610c-4538-4753-a4a8-36a3aa7de72d" Dec 05 13:07:21.807302 master-0 kubenswrapper[29936]: I1205 13:07:21.807209 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 05 13:07:21.881269 master-0 kubenswrapper[29936]: I1205 13:07:21.880653 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66cad757-8699-4951-bbdf-b556fd09d35c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:21.970362 master-0 kubenswrapper[29936]: I1205 13:07:21.967233 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 05 13:07:21.970362 master-0 kubenswrapper[29936]: I1205 13:07:21.968658 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="66cad757-8699-4951-bbdf-b556fd09d35c" containerName="ironic-api" Dec 05 13:07:21.970362 master-0 kubenswrapper[29936]: I1205 13:07:21.970026 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 13:07:21.986507 master-0 kubenswrapper[29936]: I1205 13:07:21.986297 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603\") " pod="openstack/openstackclient" Dec 05 13:07:21.986507 master-0 kubenswrapper[29936]: I1205 13:07:21.986499 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxkrk\" (UniqueName: \"kubernetes.io/projected/6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603-kube-api-access-dxkrk\") pod \"openstackclient\" (UID: \"6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603\") " pod="openstack/openstackclient" Dec 05 13:07:21.986850 master-0 kubenswrapper[29936]: I1205 13:07:21.986641 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603-openstack-config-secret\") pod \"openstackclient\" (UID: \"6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603\") " pod="openstack/openstackclient" Dec 05 13:07:21.986850 master-0 kubenswrapper[29936]: I1205 13:07:21.986687 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603-openstack-config\") pod \"openstackclient\" (UID: \"6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603\") " pod="openstack/openstackclient" Dec 05 13:07:22.018608 master-0 kubenswrapper[29936]: I1205 13:07:22.018362 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 13:07:22.094680 master-0 kubenswrapper[29936]: I1205 13:07:22.090299 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603\") " pod="openstack/openstackclient" Dec 05 13:07:22.095235 master-0 kubenswrapper[29936]: I1205 13:07:22.094631 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxkrk\" (UniqueName: \"kubernetes.io/projected/6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603-kube-api-access-dxkrk\") pod \"openstackclient\" (UID: \"6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603\") " pod="openstack/openstackclient" Dec 05 13:07:22.096211 master-0 kubenswrapper[29936]: I1205 13:07:22.096144 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603-openstack-config-secret\") pod \"openstackclient\" (UID: \"6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603\") " pod="openstack/openstackclient" Dec 05 13:07:22.098494 master-0 kubenswrapper[29936]: I1205 13:07:22.098261 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603-openstack-config\") pod \"openstackclient\" (UID: \"6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603\") " pod="openstack/openstackclient" Dec 05 13:07:22.099572 master-0 kubenswrapper[29936]: I1205 13:07:22.099513 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603-openstack-config\") pod \"openstackclient\" (UID: \"6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603\") " pod="openstack/openstackclient" Dec 05 13:07:22.104315 master-0 kubenswrapper[29936]: I1205 13:07:22.103533 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603\") " pod="openstack/openstackclient" Dec 05 13:07:22.105093 master-0 kubenswrapper[29936]: I1205 13:07:22.105037 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603-openstack-config-secret\") pod \"openstackclient\" (UID: \"6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603\") " pod="openstack/openstackclient" Dec 05 13:07:22.120208 master-0 kubenswrapper[29936]: I1205 13:07:22.120135 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxkrk\" (UniqueName: \"kubernetes.io/projected/6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603-kube-api-access-dxkrk\") pod \"openstackclient\" (UID: \"6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603\") " pod="openstack/openstackclient" Dec 05 13:07:22.171282 master-0 kubenswrapper[29936]: I1205 13:07:22.171190 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-8czvf" event={"ID":"32628c1d-e798-427e-97ca-322d9af2971e","Type":"ContainerStarted","Data":"fa5cb022876d8c39f3204842e7dbe7418dc4264b63e3e7b5b0e7980b3162fdbc"} Dec 05 13:07:22.174634 master-0 kubenswrapper[29936]: I1205 13:07:22.174580 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 13:07:22.174984 master-0 kubenswrapper[29936]: I1205 13:07:22.174925 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-5456ffdd9c-4qjcn" Dec 05 13:07:22.175166 master-0 kubenswrapper[29936]: I1205 13:07:22.174937 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-5456ffdd9c-4qjcn" event={"ID":"66cad757-8699-4951-bbdf-b556fd09d35c","Type":"ContainerDied","Data":"cfe26fb6f1b7b0ab5b0ea5c536c112ec0836b3349e5b3129f4aaaaa8f951feac"} Dec 05 13:07:22.175166 master-0 kubenswrapper[29936]: I1205 13:07:22.175068 29936 scope.go:117] "RemoveContainer" containerID="2c8f48be2446be3c991b78efc0cda5c99c4afcd35cc8d791ea0a715b850048d3" Dec 05 13:07:22.183875 master-0 kubenswrapper[29936]: I1205 13:07:22.183210 29936 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="be50610c-4538-4753-a4a8-36a3aa7de72d" podUID="6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603" Dec 05 13:07:22.201405 master-0 kubenswrapper[29936]: I1205 13:07:22.201169 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:07:22.226201 master-0 kubenswrapper[29936]: I1205 13:07:22.226126 29936 scope.go:117] "RemoveContainer" containerID="448d5c5534bf38c56dd5b76a6f2b1b6e1b469dbe34505d31875849e2dc384c5f" Dec 05 13:07:22.289058 master-0 kubenswrapper[29936]: I1205 13:07:22.287594 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 13:07:22.329057 master-0 kubenswrapper[29936]: I1205 13:07:22.320145 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be50610c-4538-4753-a4a8-36a3aa7de72d-combined-ca-bundle\") pod \"be50610c-4538-4753-a4a8-36a3aa7de72d\" (UID: \"be50610c-4538-4753-a4a8-36a3aa7de72d\") " Dec 05 13:07:22.329057 master-0 kubenswrapper[29936]: I1205 13:07:22.320460 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be50610c-4538-4753-a4a8-36a3aa7de72d-openstack-config\") pod \"be50610c-4538-4753-a4a8-36a3aa7de72d\" (UID: \"be50610c-4538-4753-a4a8-36a3aa7de72d\") " Dec 05 13:07:22.329057 master-0 kubenswrapper[29936]: I1205 13:07:22.320617 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be50610c-4538-4753-a4a8-36a3aa7de72d-openstack-config-secret\") pod \"be50610c-4538-4753-a4a8-36a3aa7de72d\" (UID: \"be50610c-4538-4753-a4a8-36a3aa7de72d\") " Dec 05 13:07:22.329057 master-0 kubenswrapper[29936]: I1205 13:07:22.322597 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49bzl\" (UniqueName: \"kubernetes.io/projected/be50610c-4538-4753-a4a8-36a3aa7de72d-kube-api-access-49bzl\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:22.329057 master-0 kubenswrapper[29936]: I1205 13:07:22.329020 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be50610c-4538-4753-a4a8-36a3aa7de72d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "be50610c-4538-4753-a4a8-36a3aa7de72d" (UID: "be50610c-4538-4753-a4a8-36a3aa7de72d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:22.345094 master-0 kubenswrapper[29936]: I1205 13:07:22.344126 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be50610c-4538-4753-a4a8-36a3aa7de72d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be50610c-4538-4753-a4a8-36a3aa7de72d" (UID: "be50610c-4538-4753-a4a8-36a3aa7de72d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:22.357534 master-0 kubenswrapper[29936]: I1205 13:07:22.357438 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be50610c-4538-4753-a4a8-36a3aa7de72d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "be50610c-4538-4753-a4a8-36a3aa7de72d" (UID: "be50610c-4538-4753-a4a8-36a3aa7de72d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:22.358357 master-0 kubenswrapper[29936]: I1205 13:07:22.358072 29936 scope.go:117] "RemoveContainer" containerID="eda4e2a76db4cd9e0bce286d535bd746a0b461a78dd5cb560168d9615a4c194a" Dec 05 13:07:22.364288 master-0 kubenswrapper[29936]: I1205 13:07:22.364168 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 13:07:22.365855 master-0 kubenswrapper[29936]: I1205 13:07:22.365823 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-5456ffdd9c-4qjcn"] Dec 05 13:07:22.389499 master-0 kubenswrapper[29936]: I1205 13:07:22.389416 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-5456ffdd9c-4qjcn"] Dec 05 13:07:22.424572 master-0 kubenswrapper[29936]: I1205 13:07:22.424530 29936 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/be50610c-4538-4753-a4a8-36a3aa7de72d-openstack-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:22.424572 master-0 kubenswrapper[29936]: I1205 13:07:22.424570 29936 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/be50610c-4538-4753-a4a8-36a3aa7de72d-openstack-config-secret\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:22.424734 master-0 kubenswrapper[29936]: I1205 13:07:22.424582 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be50610c-4538-4753-a4a8-36a3aa7de72d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:22.983633 master-0 kubenswrapper[29936]: I1205 13:07:22.983568 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 13:07:23.228350 master-0 kubenswrapper[29936]: I1205 13:07:23.228214 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66cad757-8699-4951-bbdf-b556fd09d35c" path="/var/lib/kubelet/pods/66cad757-8699-4951-bbdf-b556fd09d35c/volumes" Dec 05 13:07:23.229169 master-0 kubenswrapper[29936]: I1205 13:07:23.229145 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be50610c-4538-4753-a4a8-36a3aa7de72d" path="/var/lib/kubelet/pods/be50610c-4538-4753-a4a8-36a3aa7de72d/volumes" Dec 05 13:07:23.229955 master-0 kubenswrapper[29936]: I1205 13:07:23.229924 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603","Type":"ContainerStarted","Data":"3588f40236b7c8564243d6498f095b588a894717e721bb525a8ee34166c58bd1"} Dec 05 13:07:23.235772 master-0 kubenswrapper[29936]: I1205 13:07:23.235668 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 05 13:07:23.272945 master-0 kubenswrapper[29936]: I1205 13:07:23.272857 29936 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="be50610c-4538-4753-a4a8-36a3aa7de72d" podUID="6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603" Dec 05 13:07:23.761207 master-0 kubenswrapper[29936]: I1205 13:07:23.760219 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-b46d8-scheduler-0" Dec 05 13:07:23.958218 master-0 kubenswrapper[29936]: I1205 13:07:23.955892 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-75c7c6bd54-dbbbs"] Dec 05 13:07:23.959385 master-0 kubenswrapper[29936]: I1205 13:07:23.958857 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:23.972313 master-0 kubenswrapper[29936]: I1205 13:07:23.966029 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 05 13:07:23.972313 master-0 kubenswrapper[29936]: I1205 13:07:23.966489 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 13:07:23.975144 master-0 kubenswrapper[29936]: I1205 13:07:23.973819 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-75c7c6bd54-dbbbs"] Dec 05 13:07:23.975144 master-0 kubenswrapper[29936]: I1205 13:07:23.973903 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 05 13:07:24.099800 master-0 kubenswrapper[29936]: I1205 13:07:24.091771 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv7zm\" (UniqueName: \"kubernetes.io/projected/d4214a65-f648-446c-9a3c-cd338e92e61f-kube-api-access-hv7zm\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.099800 master-0 kubenswrapper[29936]: I1205 13:07:24.091896 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4214a65-f648-446c-9a3c-cd338e92e61f-combined-ca-bundle\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.099800 master-0 kubenswrapper[29936]: I1205 13:07:24.091978 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4214a65-f648-446c-9a3c-cd338e92e61f-log-httpd\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.099800 master-0 kubenswrapper[29936]: I1205 13:07:24.092038 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4214a65-f648-446c-9a3c-cd338e92e61f-config-data\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.099800 master-0 kubenswrapper[29936]: I1205 13:07:24.092062 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4214a65-f648-446c-9a3c-cd338e92e61f-run-httpd\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.099800 master-0 kubenswrapper[29936]: I1205 13:07:24.092125 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4214a65-f648-446c-9a3c-cd338e92e61f-public-tls-certs\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.099800 master-0 kubenswrapper[29936]: I1205 13:07:24.092191 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d4214a65-f648-446c-9a3c-cd338e92e61f-internal-tls-certs\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.099800 master-0 kubenswrapper[29936]: I1205 13:07:24.092283 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d4214a65-f648-446c-9a3c-cd338e92e61f-etc-swift\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.217301 master-0 kubenswrapper[29936]: I1205 13:07:24.216364 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4214a65-f648-446c-9a3c-cd338e92e61f-run-httpd\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.217301 master-0 kubenswrapper[29936]: I1205 13:07:24.216498 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4214a65-f648-446c-9a3c-cd338e92e61f-public-tls-certs\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.217301 master-0 kubenswrapper[29936]: I1205 13:07:24.216547 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4214a65-f648-446c-9a3c-cd338e92e61f-internal-tls-certs\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.217301 master-0 kubenswrapper[29936]: I1205 13:07:24.216672 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d4214a65-f648-446c-9a3c-cd338e92e61f-etc-swift\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.217301 master-0 kubenswrapper[29936]: I1205 13:07:24.216722 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv7zm\" (UniqueName: \"kubernetes.io/projected/d4214a65-f648-446c-9a3c-cd338e92e61f-kube-api-access-hv7zm\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.226620 master-0 kubenswrapper[29936]: I1205 13:07:24.223266 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4214a65-f648-446c-9a3c-cd338e92e61f-combined-ca-bundle\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.226620 master-0 kubenswrapper[29936]: I1205 13:07:24.223607 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4214a65-f648-446c-9a3c-cd338e92e61f-log-httpd\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.226620 master-0 kubenswrapper[29936]: I1205 13:07:24.223821 29936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4214a65-f648-446c-9a3c-cd338e92e61f-config-data\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.226620 master-0 kubenswrapper[29936]: I1205 13:07:24.226475 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4214a65-f648-446c-9a3c-cd338e92e61f-run-httpd\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.226948 master-0 kubenswrapper[29936]: I1205 13:07:24.226695 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d4214a65-f648-446c-9a3c-cd338e92e61f-etc-swift\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.231596 master-0 kubenswrapper[29936]: I1205 13:07:24.227052 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4214a65-f648-446c-9a3c-cd338e92e61f-log-httpd\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.245687 master-0 kubenswrapper[29936]: I1205 13:07:24.239422 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4214a65-f648-446c-9a3c-cd338e92e61f-combined-ca-bundle\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.245687 master-0 kubenswrapper[29936]: I1205 13:07:24.244061 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4214a65-f648-446c-9a3c-cd338e92e61f-internal-tls-certs\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.257338 master-0 kubenswrapper[29936]: I1205 13:07:24.250986 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4214a65-f648-446c-9a3c-cd338e92e61f-config-data\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.257338 master-0 kubenswrapper[29936]: I1205 13:07:24.253054 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4214a65-f648-446c-9a3c-cd338e92e61f-public-tls-certs\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.320376 master-0 kubenswrapper[29936]: I1205 13:07:24.316277 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv7zm\" (UniqueName: \"kubernetes.io/projected/d4214a65-f648-446c-9a3c-cd338e92e61f-kube-api-access-hv7zm\") pod \"swift-proxy-75c7c6bd54-dbbbs\" (UID: \"d4214a65-f648-446c-9a3c-cd338e92e61f\") " pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:24.608100 master-0 kubenswrapper[29936]: I1205 13:07:24.596109 29936 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:25.694594 master-0 kubenswrapper[29936]: I1205 13:07:25.694460 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-75c7c6bd54-dbbbs"] Dec 05 13:07:26.378776 master-0 kubenswrapper[29936]: I1205 13:07:26.378652 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75c7c6bd54-dbbbs" event={"ID":"d4214a65-f648-446c-9a3c-cd338e92e61f","Type":"ContainerStarted","Data":"5f31a5e8242f9814e5494269f4ddb2aa62bf6a89a7ae3bc59be8f22143a06b7f"} Dec 05 13:07:26.378776 master-0 kubenswrapper[29936]: I1205 13:07:26.378722 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75c7c6bd54-dbbbs" event={"ID":"d4214a65-f648-446c-9a3c-cd338e92e61f","Type":"ContainerStarted","Data":"72951fb3cb10241c7b0814617e77c359bfec4cfcc5b840400a6d308b4c2964e4"} Dec 05 13:07:26.381075 master-0 kubenswrapper[29936]: I1205 13:07:26.381037 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-8czvf" event={"ID":"32628c1d-e798-427e-97ca-322d9af2971e","Type":"ContainerStarted","Data":"2210c31023e83097945b619fe81b1a41648b8e0f5c97d20c0d6fd95a1eb5d0a3"} Dec 05 13:07:26.649626 master-0 kubenswrapper[29936]: I1205 13:07:26.649511 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-db-sync-8czvf" podStartSLOduration=4.200877771 podStartE2EDuration="7.649475081s" podCreationTimestamp="2025-12-05 13:07:19 +0000 UTC" firstStartedPulling="2025-12-05 13:07:21.592844902 +0000 UTC m=+1038.724924583" lastFinishedPulling="2025-12-05 13:07:25.041442212 +0000 UTC m=+1042.173521893" observedRunningTime="2025-12-05 13:07:26.628111979 +0000 UTC m=+1043.760191660" watchObservedRunningTime="2025-12-05 13:07:26.649475081 +0000 UTC m=+1043.781554762" Dec 05 13:07:27.407848 master-0 kubenswrapper[29936]: I1205 13:07:27.407598 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-75c7c6bd54-dbbbs" event={"ID":"d4214a65-f648-446c-9a3c-cd338e92e61f","Type":"ContainerStarted","Data":"390cf58b65a8f2c91be67c02dd75fb7b4e0bbeb49a8375bcb27b27265badf51b"} Dec 05 13:07:27.407848 master-0 kubenswrapper[29936]: I1205 13:07:27.407715 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:27.407848 master-0 kubenswrapper[29936]: I1205 13:07:27.407749 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:27.463299 master-0 kubenswrapper[29936]: I1205 13:07:27.460095 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-75c7c6bd54-dbbbs" podStartSLOduration=4.4600563 podStartE2EDuration="4.4600563s" podCreationTimestamp="2025-12-05 13:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:07:27.436391449 +0000 UTC m=+1044.568471140" watchObservedRunningTime="2025-12-05 13:07:27.4600563 +0000 UTC m=+1044.592135991" Dec 05 13:07:28.471376 master-0 kubenswrapper[29936]: I1205 13:07:28.471215 29936 generic.go:334] "Generic (PLEG): container finished" podID="32628c1d-e798-427e-97ca-322d9af2971e" containerID="2210c31023e83097945b619fe81b1a41648b8e0f5c97d20c0d6fd95a1eb5d0a3" exitCode=0 Dec 05 13:07:28.472895 master-0 kubenswrapper[29936]: I1205 13:07:28.471390 29936 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-8czvf" event={"ID":"32628c1d-e798-427e-97ca-322d9af2971e","Type":"ContainerDied","Data":"2210c31023e83097945b619fe81b1a41648b8e0f5c97d20c0d6fd95a1eb5d0a3"} Dec 05 13:07:29.301216 master-0 kubenswrapper[29936]: I1205 13:07:29.300988 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6c565df557-xrffm" Dec 05 13:07:29.978973 master-0 kubenswrapper[29936]: I1205 13:07:29.978903 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:30.064697 master-0 kubenswrapper[29936]: I1205 13:07:30.064600 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/32628c1d-e798-427e-97ca-322d9af2971e-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"32628c1d-e798-427e-97ca-322d9af2971e\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " Dec 05 13:07:30.065427 master-0 kubenswrapper[29936]: I1205 13:07:30.064832 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7hdf\" (UniqueName: \"kubernetes.io/projected/32628c1d-e798-427e-97ca-322d9af2971e-kube-api-access-h7hdf\") pod \"32628c1d-e798-427e-97ca-322d9af2971e\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " Dec 05 13:07:30.065427 master-0 kubenswrapper[29936]: I1205 13:07:30.064917 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/32628c1d-e798-427e-97ca-322d9af2971e-var-lib-ironic\") pod \"32628c1d-e798-427e-97ca-322d9af2971e\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " Dec 05 13:07:30.065427 master-0 kubenswrapper[29936]: I1205 13:07:30.064964 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-config\") pod \"32628c1d-e798-427e-97ca-322d9af2971e\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " Dec 05 13:07:30.065427 master-0 kubenswrapper[29936]: I1205 13:07:30.065054 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/32628c1d-e798-427e-97ca-322d9af2971e-etc-podinfo\") pod \"32628c1d-e798-427e-97ca-322d9af2971e\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " Dec 05 13:07:30.065427 master-0 kubenswrapper[29936]: I1205 13:07:30.065109 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-scripts\") pod \"32628c1d-e798-427e-97ca-322d9af2971e\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " Dec 05 13:07:30.065427 master-0 kubenswrapper[29936]: I1205 13:07:30.065242 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-combined-ca-bundle\") pod \"32628c1d-e798-427e-97ca-322d9af2971e\" (UID: \"32628c1d-e798-427e-97ca-322d9af2971e\") " Dec 05 13:07:30.065427 master-0 kubenswrapper[29936]: I1205 13:07:30.065232 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32628c1d-e798-427e-97ca-322d9af2971e-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod 
"32628c1d-e798-427e-97ca-322d9af2971e" (UID: "32628c1d-e798-427e-97ca-322d9af2971e"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:07:30.066069 master-0 kubenswrapper[29936]: I1205 13:07:30.065536 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32628c1d-e798-427e-97ca-322d9af2971e-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "32628c1d-e798-427e-97ca-322d9af2971e" (UID: "32628c1d-e798-427e-97ca-322d9af2971e"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:07:30.066215 master-0 kubenswrapper[29936]: I1205 13:07:30.066163 29936 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/32628c1d-e798-427e-97ca-322d9af2971e-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:30.066282 master-0 kubenswrapper[29936]: I1205 13:07:30.066217 29936 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/32628c1d-e798-427e-97ca-322d9af2971e-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:30.070048 master-0 kubenswrapper[29936]: I1205 13:07:30.069944 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/32628c1d-e798-427e-97ca-322d9af2971e-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "32628c1d-e798-427e-97ca-322d9af2971e" (UID: "32628c1d-e798-427e-97ca-322d9af2971e"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 13:07:30.070635 master-0 kubenswrapper[29936]: I1205 13:07:30.070545 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-scripts" (OuterVolumeSpecName: "scripts") pod "32628c1d-e798-427e-97ca-322d9af2971e" (UID: "32628c1d-e798-427e-97ca-322d9af2971e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:30.070978 master-0 kubenswrapper[29936]: I1205 13:07:30.070920 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32628c1d-e798-427e-97ca-322d9af2971e-kube-api-access-h7hdf" (OuterVolumeSpecName: "kube-api-access-h7hdf") pod "32628c1d-e798-427e-97ca-322d9af2971e" (UID: "32628c1d-e798-427e-97ca-322d9af2971e"). InnerVolumeSpecName "kube-api-access-h7hdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:30.104955 master-0 kubenswrapper[29936]: I1205 13:07:30.104873 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32628c1d-e798-427e-97ca-322d9af2971e" (UID: "32628c1d-e798-427e-97ca-322d9af2971e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:30.110597 master-0 kubenswrapper[29936]: I1205 13:07:30.110436 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-config" (OuterVolumeSpecName: "config") pod "32628c1d-e798-427e-97ca-322d9af2971e" (UID: "32628c1d-e798-427e-97ca-322d9af2971e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:30.169566 master-0 kubenswrapper[29936]: I1205 13:07:30.169499 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:30.169566 master-0 kubenswrapper[29936]: I1205 13:07:30.169570 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7hdf\" (UniqueName: \"kubernetes.io/projected/32628c1d-e798-427e-97ca-322d9af2971e-kube-api-access-h7hdf\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:30.169566 master-0 kubenswrapper[29936]: I1205 13:07:30.169590 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:30.170030 master-0 kubenswrapper[29936]: I1205 13:07:30.169604 29936 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/32628c1d-e798-427e-97ca-322d9af2971e-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:30.170030 master-0 kubenswrapper[29936]: I1205 13:07:30.169653 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32628c1d-e798-427e-97ca-322d9af2971e-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:30.508834 master-0 kubenswrapper[29936]: I1205 13:07:30.508757 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-8czvf" event={"ID":"32628c1d-e798-427e-97ca-322d9af2971e","Type":"ContainerDied","Data":"fa5cb022876d8c39f3204842e7dbe7418dc4264b63e3e7b5b0e7980b3162fdbc"} Dec 05 13:07:30.508834 master-0 kubenswrapper[29936]: I1205 13:07:30.508827 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-sync-8czvf" Dec 05 13:07:30.510224 master-0 kubenswrapper[29936]: I1205 13:07:30.508829 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa5cb022876d8c39f3204842e7dbe7418dc4264b63e3e7b5b0e7980b3162fdbc" Dec 05 13:07:31.070610 master-0 kubenswrapper[29936]: I1205 13:07:31.069490 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5f5d648448-8m7zn"] Dec 05 13:07:31.070610 master-0 kubenswrapper[29936]: I1205 13:07:31.070466 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5f5d648448-8m7zn" podUID="8ee01710-7ad7-47e9-8268-09a33572ab6a" containerName="neutron-api" containerID="cri-o://9a3b05f6ff88b9c96448238538c8a34f6caf50d50918bd415887609dfb3fc194" gracePeriod=30 Dec 05 13:07:31.071528 master-0 kubenswrapper[29936]: I1205 13:07:31.070713 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5f5d648448-8m7zn" podUID="8ee01710-7ad7-47e9-8268-09a33572ab6a" containerName="neutron-httpd" containerID="cri-o://629a2e1c62ee9742cb533842e08303c5cee1a87130f73b0f0f822f74d7518e8d" gracePeriod=30 Dec 05 13:07:31.526678 master-0 kubenswrapper[29936]: I1205 13:07:31.526569 29936 generic.go:334] "Generic (PLEG): container finished" podID="8ee01710-7ad7-47e9-8268-09a33572ab6a" containerID="629a2e1c62ee9742cb533842e08303c5cee1a87130f73b0f0f822f74d7518e8d" exitCode=0 Dec 05 13:07:31.526678 master-0 kubenswrapper[29936]: I1205 13:07:31.526660 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f5d648448-8m7zn" event={"ID":"8ee01710-7ad7-47e9-8268-09a33572ab6a","Type":"ContainerDied","Data":"629a2e1c62ee9742cb533842e08303c5cee1a87130f73b0f0f822f74d7518e8d"} Dec 05 13:07:32.187376 master-0 kubenswrapper[29936]: I1205 13:07:32.187290 29936 scope.go:117] "RemoveContainer" containerID="0def71aab0e9502c799b2ec9bb2c3e775a167826b1f8d1c0f4ab4c73f4421b4c" Dec 05 13:07:32.900479 master-0 kubenswrapper[29936]: I1205 13:07:32.900371 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75dc9f44fc-hmphp"] Dec 05 13:07:32.902018 master-0 kubenswrapper[29936]: E1205 13:07:32.901113 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32628c1d-e798-427e-97ca-322d9af2971e" containerName="ironic-inspector-db-sync" Dec 05 13:07:32.902018 master-0 kubenswrapper[29936]: I1205 13:07:32.901137 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="32628c1d-e798-427e-97ca-322d9af2971e" containerName="ironic-inspector-db-sync" Dec 05 13:07:32.902018 master-0 kubenswrapper[29936]: I1205 13:07:32.901538 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="32628c1d-e798-427e-97ca-322d9af2971e" containerName="ironic-inspector-db-sync" Dec 05 13:07:32.902967 master-0 kubenswrapper[29936]: I1205 13:07:32.902923 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:32.931788 master-0 kubenswrapper[29936]: I1205 13:07:32.930562 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75dc9f44fc-hmphp"] Dec 05 13:07:33.009808 master-0 kubenswrapper[29936]: I1205 13:07:33.009705 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Dec 05 13:07:33.033342 master-0 kubenswrapper[29936]: I1205 13:07:33.032970 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-992bj\" (UniqueName: \"kubernetes.io/projected/6046b9e1-6a97-47a9-a88b-772270e4cdaf-kube-api-access-992bj\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.033342 master-0 kubenswrapper[29936]: I1205 13:07:33.033096 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-dns-swift-storage-0\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.035129 master-0 kubenswrapper[29936]: I1205 13:07:33.035067 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Dec 05 13:07:33.035311 master-0 kubenswrapper[29936]: I1205 13:07:33.035279 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Dec 05 13:07:33.035484 master-0 kubenswrapper[29936]: I1205 13:07:33.035451 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-config\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.035594 master-0 kubenswrapper[29936]: I1205 13:07:33.035566 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-ovsdbserver-nb\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.036284 master-0 kubenswrapper[29936]: I1205 13:07:33.035821 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-dns-svc\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.036284 master-0 kubenswrapper[29936]: I1205 13:07:33.035900 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-ovsdbserver-sb\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.038223 master-0 kubenswrapper[29936]: I1205 13:07:33.038153 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Dec 05 13:07:33.048382 master-0 
kubenswrapper[29936]: I1205 13:07:33.041478 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Dec 05 13:07:33.048382 master-0 kubenswrapper[29936]: I1205 13:07:33.041778 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Dec 05 13:07:33.144788 master-0 kubenswrapper[29936]: I1205 13:07:33.141525 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-dns-svc\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.145247 master-0 kubenswrapper[29936]: I1205 13:07:33.144935 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-ovsdbserver-sb\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.145247 master-0 kubenswrapper[29936]: I1205 13:07:33.145222 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-config\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.145388 master-0 kubenswrapper[29936]: I1205 13:07:33.145254 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/01d30809-19fa-4217-9b1b-e35e7504316e-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.145478 master-0 kubenswrapper[29936]: I1205 13:07:33.145457 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-scripts\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.145689 master-0 kubenswrapper[29936]: I1205 13:07:33.142771 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-dns-svc\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.146310 master-0 kubenswrapper[29936]: I1205 13:07:33.146250 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-992bj\" (UniqueName: \"kubernetes.io/projected/6046b9e1-6a97-47a9-a88b-772270e4cdaf-kube-api-access-992bj\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.146423 master-0 kubenswrapper[29936]: I1205 13:07:33.146384 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-dns-swift-storage-0\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " 
pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.146509 master-0 kubenswrapper[29936]: I1205 13:07:33.146480 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/01d30809-19fa-4217-9b1b-e35e7504316e-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.146715 master-0 kubenswrapper[29936]: I1205 13:07:33.146671 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.146769 master-0 kubenswrapper[29936]: I1205 13:07:33.146714 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-config\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.146809 master-0 kubenswrapper[29936]: I1205 13:07:33.146778 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/01d30809-19fa-4217-9b1b-e35e7504316e-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.146923 master-0 kubenswrapper[29936]: I1205 13:07:33.146900 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-ovsdbserver-nb\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.147086 master-0 kubenswrapper[29936]: I1205 13:07:33.147023 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-ovsdbserver-sb\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.148145 master-0 kubenswrapper[29936]: I1205 13:07:33.147970 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-dns-swift-storage-0\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.149554 master-0 kubenswrapper[29936]: I1205 13:07:33.148816 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntsw7\" (UniqueName: \"kubernetes.io/projected/01d30809-19fa-4217-9b1b-e35e7504316e-kube-api-access-ntsw7\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.149554 master-0 kubenswrapper[29936]: I1205 13:07:33.148957 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-ovsdbserver-nb\") pod 
\"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.155746 master-0 kubenswrapper[29936]: I1205 13:07:33.155501 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-config\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.174442 master-0 kubenswrapper[29936]: I1205 13:07:33.174369 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-992bj\" (UniqueName: \"kubernetes.io/projected/6046b9e1-6a97-47a9-a88b-772270e4cdaf-kube-api-access-992bj\") pod \"dnsmasq-dns-75dc9f44fc-hmphp\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.252376 master-0 kubenswrapper[29936]: I1205 13:07:33.252296 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/01d30809-19fa-4217-9b1b-e35e7504316e-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.252376 master-0 kubenswrapper[29936]: I1205 13:07:33.252385 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.253124 master-0 kubenswrapper[29936]: I1205 13:07:33.252412 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/01d30809-19fa-4217-9b1b-e35e7504316e-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.253124 master-0 kubenswrapper[29936]: I1205 13:07:33.252476 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntsw7\" (UniqueName: \"kubernetes.io/projected/01d30809-19fa-4217-9b1b-e35e7504316e-kube-api-access-ntsw7\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.253124 master-0 kubenswrapper[29936]: I1205 13:07:33.252554 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-config\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.253124 master-0 kubenswrapper[29936]: I1205 13:07:33.252579 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/01d30809-19fa-4217-9b1b-e35e7504316e-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.253124 master-0 kubenswrapper[29936]: I1205 13:07:33.252642 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-scripts\") pod \"ironic-inspector-0\" (UID: 
\"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.253414 master-0 kubenswrapper[29936]: I1205 13:07:33.253270 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/01d30809-19fa-4217-9b1b-e35e7504316e-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.256909 master-0 kubenswrapper[29936]: I1205 13:07:33.256836 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/01d30809-19fa-4217-9b1b-e35e7504316e-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.257951 master-0 kubenswrapper[29936]: I1205 13:07:33.257901 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.262070 master-0 kubenswrapper[29936]: I1205 13:07:33.261866 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-scripts\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.269971 master-0 kubenswrapper[29936]: I1205 13:07:33.269888 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/01d30809-19fa-4217-9b1b-e35e7504316e-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.271333 master-0 kubenswrapper[29936]: I1205 13:07:33.271271 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-config\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.276816 master-0 kubenswrapper[29936]: I1205 13:07:33.276744 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntsw7\" (UniqueName: \"kubernetes.io/projected/01d30809-19fa-4217-9b1b-e35e7504316e-kube-api-access-ntsw7\") pod \"ironic-inspector-0\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:33.299909 master-0 kubenswrapper[29936]: I1205 13:07:33.299134 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:33.367535 master-0 kubenswrapper[29936]: I1205 13:07:33.367430 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Dec 05 13:07:34.615441 master-0 kubenswrapper[29936]: I1205 13:07:34.615362 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:34.631472 master-0 kubenswrapper[29936]: I1205 13:07:34.631360 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-75c7c6bd54-dbbbs" Dec 05 13:07:37.035174 master-0 kubenswrapper[29936]: I1205 13:07:37.034406 29936 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod46decc62-ea7e-4ec1-ad45-7ff4812f77a5"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod46decc62-ea7e-4ec1-ad45-7ff4812f77a5] : Timed out while waiting for systemd to remove kubepods-besteffort-pod46decc62_ea7e_4ec1_ad45_7ff4812f77a5.slice" Dec 05 13:07:37.035174 master-0 kubenswrapper[29936]: E1205 13:07:37.034517 29936 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod46decc62-ea7e-4ec1-ad45-7ff4812f77a5] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod46decc62-ea7e-4ec1-ad45-7ff4812f77a5] : Timed out while waiting for systemd to remove kubepods-besteffort-pod46decc62_ea7e_4ec1_ad45_7ff4812f77a5.slice" pod="openstack/cinder-b46d8-backup-0" podUID="46decc62-ea7e-4ec1-ad45-7ff4812f77a5" Dec 05 13:07:37.425079 master-0 kubenswrapper[29936]: I1205 13:07:37.423079 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Dec 05 13:07:37.665661 master-0 kubenswrapper[29936]: I1205 13:07:37.665574 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:37.737607 master-0 kubenswrapper[29936]: I1205 13:07:37.737405 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b46d8-backup-0"] Dec 05 13:07:37.773075 master-0 kubenswrapper[29936]: I1205 13:07:37.772942 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b46d8-backup-0"] Dec 05 13:07:37.799215 master-0 kubenswrapper[29936]: I1205 13:07:37.799110 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b46d8-backup-0"] Dec 05 13:07:37.825364 master-0 kubenswrapper[29936]: I1205 13:07:37.824338 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:37.828151 master-0 kubenswrapper[29936]: I1205 13:07:37.828036 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-backup-0"] Dec 05 13:07:37.831493 master-0 kubenswrapper[29936]: I1205 13:07:37.830473 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-b46d8-backup-config-data" Dec 05 13:07:37.973202 master-0 kubenswrapper[29936]: I1205 13:07:37.970409 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8e3150-d45a-49f4-97d0-b45473ee80c5-config-data\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:37.973202 master-0 kubenswrapper[29936]: I1205 13:07:37.970544 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-etc-machine-id\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:37.973202 master-0 kubenswrapper[29936]: I1205 13:07:37.970725 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-var-locks-cinder\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:37.973202 master-0 kubenswrapper[29936]: I1205 13:07:37.970756 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-var-lib-cinder\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:37.973202 master-0 kubenswrapper[29936]: I1205 13:07:37.970997 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-sys\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:37.973202 master-0 kubenswrapper[29936]: I1205 13:07:37.971098 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-var-locks-brick\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:37.973202 master-0 kubenswrapper[29936]: I1205 13:07:37.971408 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh79m\" (UniqueName: \"kubernetes.io/projected/0e8e3150-d45a-49f4-97d0-b45473ee80c5-kube-api-access-nh79m\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:37.973202 master-0 kubenswrapper[29936]: I1205 13:07:37.971507 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e8e3150-d45a-49f4-97d0-b45473ee80c5-scripts\") pod 
\"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:37.973202 master-0 kubenswrapper[29936]: I1205 13:07:37.971558 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-dev\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:37.973202 master-0 kubenswrapper[29936]: I1205 13:07:37.971667 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8e3150-d45a-49f4-97d0-b45473ee80c5-combined-ca-bundle\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:37.973202 master-0 kubenswrapper[29936]: I1205 13:07:37.971794 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-run\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:37.973202 master-0 kubenswrapper[29936]: I1205 13:07:37.971835 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-lib-modules\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:37.973202 master-0 kubenswrapper[29936]: I1205 13:07:37.971867 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-etc-iscsi\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:37.973202 master-0 kubenswrapper[29936]: I1205 13:07:37.972235 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-etc-nvme\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:37.974826 master-0 kubenswrapper[29936]: I1205 13:07:37.974768 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e8e3150-d45a-49f4-97d0-b45473ee80c5-config-data-custom\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.083270 master-0 kubenswrapper[29936]: I1205 13:07:38.082687 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8e3150-d45a-49f4-97d0-b45473ee80c5-combined-ca-bundle\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.083270 master-0 kubenswrapper[29936]: I1205 13:07:38.082997 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-run\") pod 
\"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.083270 master-0 kubenswrapper[29936]: I1205 13:07:38.083141 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-lib-modules\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.083270 master-0 kubenswrapper[29936]: I1205 13:07:38.083139 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-run\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.083270 master-0 kubenswrapper[29936]: I1205 13:07:38.083246 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-etc-iscsi\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.083988 master-0 kubenswrapper[29936]: I1205 13:07:38.083173 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-etc-iscsi\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.083988 master-0 kubenswrapper[29936]: I1205 13:07:38.083345 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-lib-modules\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.083988 master-0 kubenswrapper[29936]: I1205 13:07:38.083564 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-etc-nvme\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.083988 master-0 kubenswrapper[29936]: I1205 13:07:38.083457 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-etc-nvme\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.083988 master-0 kubenswrapper[29936]: I1205 13:07:38.083826 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e8e3150-d45a-49f4-97d0-b45473ee80c5-config-data-custom\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.087190 master-0 kubenswrapper[29936]: I1205 13:07:38.084804 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8e3150-d45a-49f4-97d0-b45473ee80c5-config-data\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.087190 master-0 kubenswrapper[29936]: 
I1205 13:07:38.084924 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-etc-machine-id\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.087190 master-0 kubenswrapper[29936]: I1205 13:07:38.085066 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-etc-machine-id\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.087190 master-0 kubenswrapper[29936]: I1205 13:07:38.085088 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-var-locks-cinder\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.087190 master-0 kubenswrapper[29936]: I1205 13:07:38.085124 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-var-lib-cinder\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.087190 master-0 kubenswrapper[29936]: I1205 13:07:38.085158 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-sys\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.087190 master-0 kubenswrapper[29936]: I1205 13:07:38.085162 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-var-locks-cinder\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.087190 master-0 kubenswrapper[29936]: I1205 13:07:38.085232 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-var-locks-brick\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.087190 master-0 kubenswrapper[29936]: I1205 13:07:38.085242 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-var-lib-cinder\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.087190 master-0 kubenswrapper[29936]: I1205 13:07:38.085275 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-sys\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.087190 master-0 kubenswrapper[29936]: I1205 13:07:38.085310 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nh79m\" (UniqueName: \"kubernetes.io/projected/0e8e3150-d45a-49f4-97d0-b45473ee80c5-kube-api-access-nh79m\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.087190 master-0 kubenswrapper[29936]: I1205 13:07:38.085341 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-var-locks-brick\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.087190 master-0 kubenswrapper[29936]: I1205 13:07:38.085555 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e8e3150-d45a-49f4-97d0-b45473ee80c5-scripts\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.087190 master-0 kubenswrapper[29936]: I1205 13:07:38.085612 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-dev\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.090381 master-0 kubenswrapper[29936]: I1205 13:07:38.085722 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0e8e3150-d45a-49f4-97d0-b45473ee80c5-dev\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.090381 master-0 kubenswrapper[29936]: I1205 13:07:38.087400 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e8e3150-d45a-49f4-97d0-b45473ee80c5-combined-ca-bundle\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.090381 master-0 kubenswrapper[29936]: I1205 13:07:38.089226 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e8e3150-d45a-49f4-97d0-b45473ee80c5-config-data-custom\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.090381 master-0 kubenswrapper[29936]: I1205 13:07:38.089744 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e8e3150-d45a-49f4-97d0-b45473ee80c5-scripts\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.091094 master-0 kubenswrapper[29936]: I1205 13:07:38.091056 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e8e3150-d45a-49f4-97d0-b45473ee80c5-config-data\") pod \"cinder-b46d8-backup-0\" (UID: \"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.150316 master-0 kubenswrapper[29936]: I1205 13:07:38.150215 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh79m\" (UniqueName: \"kubernetes.io/projected/0e8e3150-d45a-49f4-97d0-b45473ee80c5-kube-api-access-nh79m\") pod \"cinder-b46d8-backup-0\" (UID: 
\"0e8e3150-d45a-49f4-97d0-b45473ee80c5\") " pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:38.164017 master-0 kubenswrapper[29936]: I1205 13:07:38.162490 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:39.207799 master-0 kubenswrapper[29936]: I1205 13:07:39.207686 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46decc62-ea7e-4ec1-ad45-7ff4812f77a5" path="/var/lib/kubelet/pods/46decc62-ea7e-4ec1-ad45-7ff4812f77a5/volumes" Dec 05 13:07:44.817363 master-0 kubenswrapper[29936]: I1205 13:07:44.817289 29936 generic.go:334] "Generic (PLEG): container finished" podID="8ee01710-7ad7-47e9-8268-09a33572ab6a" containerID="9a3b05f6ff88b9c96448238538c8a34f6caf50d50918bd415887609dfb3fc194" exitCode=0 Dec 05 13:07:44.818044 master-0 kubenswrapper[29936]: I1205 13:07:44.817381 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f5d648448-8m7zn" event={"ID":"8ee01710-7ad7-47e9-8268-09a33572ab6a","Type":"ContainerDied","Data":"9a3b05f6ff88b9c96448238538c8a34f6caf50d50918bd415887609dfb3fc194"} Dec 05 13:07:45.861230 master-0 kubenswrapper[29936]: I1205 13:07:45.854557 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" event={"ID":"1eb892f5-7ab8-4503-b7b8-1e233a1042bb","Type":"ContainerStarted","Data":"1143057ae2f4b982fc21bb48e85f3a7200c485d51b021577db1d39738961e094"} Dec 05 13:07:45.876203 master-0 kubenswrapper[29936]: I1205 13:07:45.868371 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6bb0a2f8-b8d2-42b7-9ebb-6c7d1bf16603","Type":"ContainerStarted","Data":"3c2eadbe193af9932a94c1bcc990e1437a4e0d43188a43c7fe30378805fa74be"} Dec 05 13:07:46.419208 master-0 kubenswrapper[29936]: I1205 13:07:46.418163 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75dc9f44fc-hmphp"] Dec 05 13:07:46.755265 master-0 kubenswrapper[29936]: I1205 13:07:46.746386 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-mh6s6"] Dec 05 13:07:46.755265 master-0 kubenswrapper[29936]: I1205 13:07:46.749355 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mh6s6" Dec 05 13:07:46.823212 master-0 kubenswrapper[29936]: W1205 13:07:46.817502 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01d30809_19fa_4217_9b1b_e35e7504316e.slice/crio-bf86107cf084d74a32e2df6ccc00eab7d1f5d5f4151d86d40fc69c97f64f5e32 WatchSource:0}: Error finding container bf86107cf084d74a32e2df6ccc00eab7d1f5d5f4151d86d40fc69c97f64f5e32: Status 404 returned error can't find the container with id bf86107cf084d74a32e2df6ccc00eab7d1f5d5f4151d86d40fc69c97f64f5e32 Dec 05 13:07:46.823212 master-0 kubenswrapper[29936]: I1205 13:07:46.818494 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mh6s6"] Dec 05 13:07:46.864222 master-0 kubenswrapper[29936]: I1205 13:07:46.852067 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Dec 05 13:07:46.923866 master-0 kubenswrapper[29936]: I1205 13:07:46.917536 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a056507-477d-4d77-9905-c7c6344e92ec-operator-scripts\") pod \"nova-api-db-create-mh6s6\" (UID: \"6a056507-477d-4d77-9905-c7c6344e92ec\") " pod="openstack/nova-api-db-create-mh6s6" Dec 05 13:07:46.923866 master-0 kubenswrapper[29936]: I1205 13:07:46.917696 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6hzs\" (UniqueName: \"kubernetes.io/projected/6a056507-477d-4d77-9905-c7c6344e92ec-kube-api-access-p6hzs\") pod \"nova-api-db-create-mh6s6\" (UID: \"6a056507-477d-4d77-9905-c7c6344e92ec\") " pod="openstack/nova-api-db-create-mh6s6" Dec 05 13:07:46.990215 master-0 kubenswrapper[29936]: I1205 13:07:46.979495 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"01d30809-19fa-4217-9b1b-e35e7504316e","Type":"ContainerStarted","Data":"bf86107cf084d74a32e2df6ccc00eab7d1f5d5f4151d86d40fc69c97f64f5e32"} Dec 05 13:07:47.014213 master-0 kubenswrapper[29936]: I1205 13:07:47.006105 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" event={"ID":"6046b9e1-6a97-47a9-a88b-772270e4cdaf","Type":"ContainerStarted","Data":"7a6134451c8a5ae282bf4f802f121960d92f7de6d7f390a2b9dd979fe745fc1a"} Dec 05 13:07:47.014213 master-0 kubenswrapper[29936]: I1205 13:07:47.006371 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" Dec 05 13:07:47.045279 master-0 kubenswrapper[29936]: I1205 13:07:47.021969 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6hzs\" (UniqueName: \"kubernetes.io/projected/6a056507-477d-4d77-9905-c7c6344e92ec-kube-api-access-p6hzs\") pod \"nova-api-db-create-mh6s6\" (UID: \"6a056507-477d-4d77-9905-c7c6344e92ec\") " pod="openstack/nova-api-db-create-mh6s6" Dec 05 13:07:47.045279 master-0 kubenswrapper[29936]: I1205 13:07:47.022151 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a056507-477d-4d77-9905-c7c6344e92ec-operator-scripts\") pod \"nova-api-db-create-mh6s6\" (UID: \"6a056507-477d-4d77-9905-c7c6344e92ec\") " pod="openstack/nova-api-db-create-mh6s6" Dec 05 13:07:47.045279 master-0 kubenswrapper[29936]: I1205 13:07:47.022987 29936 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a056507-477d-4d77-9905-c7c6344e92ec-operator-scripts\") pod \"nova-api-db-create-mh6s6\" (UID: \"6a056507-477d-4d77-9905-c7c6344e92ec\") " pod="openstack/nova-api-db-create-mh6s6" Dec 05 13:07:47.045279 master-0 kubenswrapper[29936]: I1205 13:07:47.025260 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-fkxvl"] Dec 05 13:07:47.045279 master-0 kubenswrapper[29936]: I1205 13:07:47.029129 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fkxvl" Dec 05 13:07:47.124831 master-0 kubenswrapper[29936]: I1205 13:07:47.090409 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6hzs\" (UniqueName: \"kubernetes.io/projected/6a056507-477d-4d77-9905-c7c6344e92ec-kube-api-access-p6hzs\") pod \"nova-api-db-create-mh6s6\" (UID: \"6a056507-477d-4d77-9905-c7c6344e92ec\") " pod="openstack/nova-api-db-create-mh6s6" Dec 05 13:07:47.149208 master-0 kubenswrapper[29936]: I1205 13:07:47.128316 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzd7z\" (UniqueName: \"kubernetes.io/projected/d934457f-86b7-4ccf-b0da-8625268d2a56-kube-api-access-mzd7z\") pod \"nova-cell0-db-create-fkxvl\" (UID: \"d934457f-86b7-4ccf-b0da-8625268d2a56\") " pod="openstack/nova-cell0-db-create-fkxvl" Dec 05 13:07:47.149208 master-0 kubenswrapper[29936]: I1205 13:07:47.147213 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d934457f-86b7-4ccf-b0da-8625268d2a56-operator-scripts\") pod \"nova-cell0-db-create-fkxvl\" (UID: \"d934457f-86b7-4ccf-b0da-8625268d2a56\") " pod="openstack/nova-cell0-db-create-fkxvl" Dec 05 13:07:47.173717 master-0 kubenswrapper[29936]: I1205 13:07:47.170237 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fkxvl"] Dec 05 13:07:47.256353 master-0 kubenswrapper[29936]: I1205 13:07:47.241354 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-dc7b-account-create-update-6mmc6"] Dec 05 13:07:47.256353 master-0 kubenswrapper[29936]: I1205 13:07:47.250569 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-dc7b-account-create-update-6mmc6" Dec 05 13:07:47.256353 master-0 kubenswrapper[29936]: I1205 13:07:47.253138 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mh6s6" Dec 05 13:07:47.260209 master-0 kubenswrapper[29936]: I1205 13:07:47.259168 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 05 13:07:47.264965 master-0 kubenswrapper[29936]: I1205 13:07:47.264815 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzd7z\" (UniqueName: \"kubernetes.io/projected/d934457f-86b7-4ccf-b0da-8625268d2a56-kube-api-access-mzd7z\") pod \"nova-cell0-db-create-fkxvl\" (UID: \"d934457f-86b7-4ccf-b0da-8625268d2a56\") " pod="openstack/nova-cell0-db-create-fkxvl" Dec 05 13:07:47.265140 master-0 kubenswrapper[29936]: I1205 13:07:47.265038 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d934457f-86b7-4ccf-b0da-8625268d2a56-operator-scripts\") pod \"nova-cell0-db-create-fkxvl\" (UID: \"d934457f-86b7-4ccf-b0da-8625268d2a56\") " pod="openstack/nova-cell0-db-create-fkxvl" Dec 05 13:07:47.285836 master-0 kubenswrapper[29936]: I1205 13:07:47.266747 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d934457f-86b7-4ccf-b0da-8625268d2a56-operator-scripts\") pod \"nova-cell0-db-create-fkxvl\" (UID: \"d934457f-86b7-4ccf-b0da-8625268d2a56\") " pod="openstack/nova-cell0-db-create-fkxvl" Dec 05 13:07:47.310083 master-0 kubenswrapper[29936]: I1205 13:07:47.308670 29936 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podefb9ef7b-31f5-4aa0-a5cf-da58084e3b3a"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podefb9ef7b-31f5-4aa0-a5cf-da58084e3b3a] : Timed out while waiting for systemd to remove kubepods-besteffort-podefb9ef7b_31f5_4aa0_a5cf_da58084e3b3a.slice" Dec 05 13:07:47.354767 master-0 kubenswrapper[29936]: I1205 13:07:47.353887 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-dc7b-account-create-update-6mmc6"] Dec 05 13:07:47.374821 master-0 kubenswrapper[29936]: I1205 13:07:47.373550 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8eb8d4a-1ed8-445b-a455-e79b6659317f-operator-scripts\") pod \"nova-api-dc7b-account-create-update-6mmc6\" (UID: \"a8eb8d4a-1ed8-445b-a455-e79b6659317f\") " pod="openstack/nova-api-dc7b-account-create-update-6mmc6" Dec 05 13:07:47.374821 master-0 kubenswrapper[29936]: I1205 13:07:47.374015 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwdhc\" (UniqueName: \"kubernetes.io/projected/a8eb8d4a-1ed8-445b-a455-e79b6659317f-kube-api-access-jwdhc\") pod \"nova-api-dc7b-account-create-update-6mmc6\" (UID: \"a8eb8d4a-1ed8-445b-a455-e79b6659317f\") " pod="openstack/nova-api-dc7b-account-create-update-6mmc6" Dec 05 13:07:47.428963 master-0 kubenswrapper[29936]: I1205 13:07:47.416204 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzd7z\" (UniqueName: \"kubernetes.io/projected/d934457f-86b7-4ccf-b0da-8625268d2a56-kube-api-access-mzd7z\") pod \"nova-cell0-db-create-fkxvl\" (UID: \"d934457f-86b7-4ccf-b0da-8625268d2a56\") " pod="openstack/nova-cell0-db-create-fkxvl" Dec 05 13:07:47.431458 master-0 kubenswrapper[29936]: I1205 13:07:47.430207 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-fkxvl" Dec 05 13:07:47.452980 master-0 kubenswrapper[29936]: I1205 13:07:47.445270 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-cw997"] Dec 05 13:07:47.456828 master-0 kubenswrapper[29936]: I1205 13:07:47.456064 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-cw997" Dec 05 13:07:47.713110 master-0 kubenswrapper[29936]: I1205 13:07:47.491372 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwdhc\" (UniqueName: \"kubernetes.io/projected/a8eb8d4a-1ed8-445b-a455-e79b6659317f-kube-api-access-jwdhc\") pod \"nova-api-dc7b-account-create-update-6mmc6\" (UID: \"a8eb8d4a-1ed8-445b-a455-e79b6659317f\") " pod="openstack/nova-api-dc7b-account-create-update-6mmc6" Dec 05 13:07:47.713110 master-0 kubenswrapper[29936]: I1205 13:07:47.491695 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jghht\" (UniqueName: \"kubernetes.io/projected/03ed197d-e0d9-4970-8d1e-f79fa7c70697-kube-api-access-jghht\") pod \"nova-cell1-db-create-cw997\" (UID: \"03ed197d-e0d9-4970-8d1e-f79fa7c70697\") " pod="openstack/nova-cell1-db-create-cw997" Dec 05 13:07:47.713110 master-0 kubenswrapper[29936]: I1205 13:07:47.492047 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8eb8d4a-1ed8-445b-a455-e79b6659317f-operator-scripts\") pod \"nova-api-dc7b-account-create-update-6mmc6\" (UID: \"a8eb8d4a-1ed8-445b-a455-e79b6659317f\") " pod="openstack/nova-api-dc7b-account-create-update-6mmc6" Dec 05 13:07:47.713110 master-0 kubenswrapper[29936]: I1205 13:07:47.492145 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03ed197d-e0d9-4970-8d1e-f79fa7c70697-operator-scripts\") pod \"nova-cell1-db-create-cw997\" (UID: \"03ed197d-e0d9-4970-8d1e-f79fa7c70697\") " pod="openstack/nova-cell1-db-create-cw997" Dec 05 13:07:47.713110 master-0 kubenswrapper[29936]: I1205 13:07:47.502295 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8eb8d4a-1ed8-445b-a455-e79b6659317f-operator-scripts\") pod \"nova-api-dc7b-account-create-update-6mmc6\" (UID: \"a8eb8d4a-1ed8-445b-a455-e79b6659317f\") " pod="openstack/nova-api-dc7b-account-create-update-6mmc6" Dec 05 13:07:47.713110 master-0 kubenswrapper[29936]: I1205 13:07:47.555959 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwdhc\" (UniqueName: \"kubernetes.io/projected/a8eb8d4a-1ed8-445b-a455-e79b6659317f-kube-api-access-jwdhc\") pod \"nova-api-dc7b-account-create-update-6mmc6\" (UID: \"a8eb8d4a-1ed8-445b-a455-e79b6659317f\") " pod="openstack/nova-api-dc7b-account-create-update-6mmc6" Dec 05 13:07:47.713110 master-0 kubenswrapper[29936]: I1205 13:07:47.566626 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-cw997"] Dec 05 13:07:47.713110 master-0 kubenswrapper[29936]: I1205 13:07:47.570351 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.848771938 podStartE2EDuration="26.570330666s" podCreationTimestamp="2025-12-05 13:07:21 +0000 UTC" firstStartedPulling="2025-12-05 
13:07:22.999886961 +0000 UTC m=+1040.131966642" lastFinishedPulling="2025-12-05 13:07:44.721445689 +0000 UTC m=+1061.853525370" observedRunningTime="2025-12-05 13:07:47.120233984 +0000 UTC m=+1064.252313665" watchObservedRunningTime="2025-12-05 13:07:47.570330666 +0000 UTC m=+1064.702410367" Dec 05 13:07:47.713110 master-0 kubenswrapper[29936]: I1205 13:07:47.603471 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jghht\" (UniqueName: \"kubernetes.io/projected/03ed197d-e0d9-4970-8d1e-f79fa7c70697-kube-api-access-jghht\") pod \"nova-cell1-db-create-cw997\" (UID: \"03ed197d-e0d9-4970-8d1e-f79fa7c70697\") " pod="openstack/nova-cell1-db-create-cw997" Dec 05 13:07:47.713110 master-0 kubenswrapper[29936]: I1205 13:07:47.603894 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03ed197d-e0d9-4970-8d1e-f79fa7c70697-operator-scripts\") pod \"nova-cell1-db-create-cw997\" (UID: \"03ed197d-e0d9-4970-8d1e-f79fa7c70697\") " pod="openstack/nova-cell1-db-create-cw997" Dec 05 13:07:47.713110 master-0 kubenswrapper[29936]: I1205 13:07:47.605058 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03ed197d-e0d9-4970-8d1e-f79fa7c70697-operator-scripts\") pod \"nova-cell1-db-create-cw997\" (UID: \"03ed197d-e0d9-4970-8d1e-f79fa7c70697\") " pod="openstack/nova-cell1-db-create-cw997" Dec 05 13:07:47.713110 master-0 kubenswrapper[29936]: I1205 13:07:47.627917 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-dc7b-account-create-update-6mmc6" Dec 05 13:07:47.713110 master-0 kubenswrapper[29936]: I1205 13:07:47.632932 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jghht\" (UniqueName: \"kubernetes.io/projected/03ed197d-e0d9-4970-8d1e-f79fa7c70697-kube-api-access-jghht\") pod \"nova-cell1-db-create-cw997\" (UID: \"03ed197d-e0d9-4970-8d1e-f79fa7c70697\") " pod="openstack/nova-cell1-db-create-cw997" Dec 05 13:07:47.841700 master-0 kubenswrapper[29936]: I1205 13:07:47.831640 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b154-account-create-update-9nrbh"] Dec 05 13:07:47.849204 master-0 kubenswrapper[29936]: I1205 13:07:47.844981 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b154-account-create-update-9nrbh" Dec 05 13:07:47.870214 master-0 kubenswrapper[29936]: I1205 13:07:47.865547 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 05 13:07:47.923512 master-0 kubenswrapper[29936]: I1205 13:07:47.917267 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b154-account-create-update-9nrbh"] Dec 05 13:07:48.191906 master-0 kubenswrapper[29936]: I1205 13:07:48.191754 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60288a03-cb34-45c0-a727-0d822e01d9e8-operator-scripts\") pod \"nova-cell0-b154-account-create-update-9nrbh\" (UID: \"60288a03-cb34-45c0-a727-0d822e01d9e8\") " pod="openstack/nova-cell0-b154-account-create-update-9nrbh" Dec 05 13:07:48.193106 master-0 kubenswrapper[29936]: I1205 13:07:48.193069 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dhzr\" (UniqueName: \"kubernetes.io/projected/60288a03-cb34-45c0-a727-0d822e01d9e8-kube-api-access-5dhzr\") pod \"nova-cell0-b154-account-create-update-9nrbh\" (UID: \"60288a03-cb34-45c0-a727-0d822e01d9e8\") " pod="openstack/nova-cell0-b154-account-create-update-9nrbh" Dec 05 13:07:48.243050 master-0 kubenswrapper[29936]: I1205 13:07:48.242972 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b46d8-backup-0"] Dec 05 13:07:48.256289 master-0 kubenswrapper[29936]: I1205 13:07:48.256157 29936 generic.go:334] "Generic (PLEG): container finished" podID="6046b9e1-6a97-47a9-a88b-772270e4cdaf" containerID="72a75b024f9b496d5f1a701d70ada1f8176955f620482d82708f8692efa5dc7f" exitCode=0 Dec 05 13:07:48.256493 master-0 kubenswrapper[29936]: I1205 13:07:48.256437 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" event={"ID":"6046b9e1-6a97-47a9-a88b-772270e4cdaf","Type":"ContainerDied","Data":"72a75b024f9b496d5f1a701d70ada1f8176955f620482d82708f8692efa5dc7f"} Dec 05 13:07:48.305008 master-0 kubenswrapper[29936]: I1205 13:07:48.304907 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f5d648448-8m7zn" event={"ID":"8ee01710-7ad7-47e9-8268-09a33572ab6a","Type":"ContainerDied","Data":"fd0a6ba0af535b40632b7f9ecd4f3c5083e72c0c4d90447a211980846ab4ad55"} Dec 05 13:07:48.305008 master-0 kubenswrapper[29936]: I1205 13:07:48.304985 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd0a6ba0af535b40632b7f9ecd4f3c5083e72c0c4d90447a211980846ab4ad55" Dec 05 13:07:48.330363 master-0 kubenswrapper[29936]: I1205 13:07:48.319759 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dhzr\" (UniqueName: \"kubernetes.io/projected/60288a03-cb34-45c0-a727-0d822e01d9e8-kube-api-access-5dhzr\") pod \"nova-cell0-b154-account-create-update-9nrbh\" (UID: \"60288a03-cb34-45c0-a727-0d822e01d9e8\") " pod="openstack/nova-cell0-b154-account-create-update-9nrbh" Dec 05 13:07:48.331615 master-0 kubenswrapper[29936]: I1205 13:07:48.331103 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60288a03-cb34-45c0-a727-0d822e01d9e8-operator-scripts\") pod \"nova-cell0-b154-account-create-update-9nrbh\" (UID: \"60288a03-cb34-45c0-a727-0d822e01d9e8\") " 
pod="openstack/nova-cell0-b154-account-create-update-9nrbh" Dec 05 13:07:48.332504 master-0 kubenswrapper[29936]: I1205 13:07:48.332473 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60288a03-cb34-45c0-a727-0d822e01d9e8-operator-scripts\") pod \"nova-cell0-b154-account-create-update-9nrbh\" (UID: \"60288a03-cb34-45c0-a727-0d822e01d9e8\") " pod="openstack/nova-cell0-b154-account-create-update-9nrbh" Dec 05 13:07:48.436102 master-0 kubenswrapper[29936]: I1205 13:07:48.436011 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-0961-account-create-update-krxs2"] Dec 05 13:07:48.446232 master-0 kubenswrapper[29936]: I1205 13:07:48.446088 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"cc2221d6-014e-4bd4-962b-24512ebf84e8","Type":"ContainerStarted","Data":"b8dd7e13c98ceb50d8e28003045fed8a9665f1673eae8fbe1b1c32c4e32f07b5"} Dec 05 13:07:48.452388 master-0 kubenswrapper[29936]: I1205 13:07:48.451657 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0961-account-create-update-krxs2" Dec 05 13:07:48.461796 master-0 kubenswrapper[29936]: I1205 13:07:48.460694 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0961-account-create-update-krxs2"] Dec 05 13:07:48.461796 master-0 kubenswrapper[29936]: I1205 13:07:48.460767 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-backup-0" event={"ID":"0e8e3150-d45a-49f4-97d0-b45473ee80c5","Type":"ContainerStarted","Data":"3cf85ddd662f98d0a6f17bf54f71b521c7191e4f8f29ad2f005684b6cc41e4e7"} Dec 05 13:07:48.463090 master-0 kubenswrapper[29936]: I1205 13:07:48.463046 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dhzr\" (UniqueName: \"kubernetes.io/projected/60288a03-cb34-45c0-a727-0d822e01d9e8-kube-api-access-5dhzr\") pod \"nova-cell0-b154-account-create-update-9nrbh\" (UID: \"60288a03-cb34-45c0-a727-0d822e01d9e8\") " pod="openstack/nova-cell0-b154-account-create-update-9nrbh" Dec 05 13:07:48.481108 master-0 kubenswrapper[29936]: I1205 13:07:48.481058 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 05 13:07:48.482769 master-0 kubenswrapper[29936]: I1205 13:07:48.482584 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-cw997" Dec 05 13:07:48.495849 master-0 kubenswrapper[29936]: W1205 13:07:48.494250 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a056507_477d_4d77_9905_c7c6344e92ec.slice/crio-4aa2d067e8b6fd9504893e1a730dc4ccb017d44a02c83ca3e9e555d0819745e1 WatchSource:0}: Error finding container 4aa2d067e8b6fd9504893e1a730dc4ccb017d44a02c83ca3e9e555d0819745e1: Status 404 returned error can't find the container with id 4aa2d067e8b6fd9504893e1a730dc4ccb017d44a02c83ca3e9e555d0819745e1 Dec 05 13:07:48.508810 master-0 kubenswrapper[29936]: I1205 13:07:48.507934 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b154-account-create-update-9nrbh" Dec 05 13:07:48.535538 master-0 kubenswrapper[29936]: I1205 13:07:48.534279 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:07:48.549265 master-0 kubenswrapper[29936]: I1205 13:07:48.549197 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-8458c7d7db-8c5lp" Dec 05 13:07:48.576214 master-0 kubenswrapper[29936]: I1205 13:07:48.576088 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-combined-ca-bundle\") pod \"8ee01710-7ad7-47e9-8268-09a33572ab6a\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " Dec 05 13:07:48.576502 master-0 kubenswrapper[29936]: I1205 13:07:48.576276 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztlch\" (UniqueName: \"kubernetes.io/projected/8ee01710-7ad7-47e9-8268-09a33572ab6a-kube-api-access-ztlch\") pod \"8ee01710-7ad7-47e9-8268-09a33572ab6a\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " Dec 05 13:07:48.576502 master-0 kubenswrapper[29936]: I1205 13:07:48.576422 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-config\") pod \"8ee01710-7ad7-47e9-8268-09a33572ab6a\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " Dec 05 13:07:48.576651 master-0 kubenswrapper[29936]: I1205 13:07:48.576621 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-ovndb-tls-certs\") pod \"8ee01710-7ad7-47e9-8268-09a33572ab6a\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " Dec 05 13:07:48.576793 master-0 kubenswrapper[29936]: I1205 13:07:48.576763 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-httpd-config\") pod \"8ee01710-7ad7-47e9-8268-09a33572ab6a\" (UID: \"8ee01710-7ad7-47e9-8268-09a33572ab6a\") " Dec 05 13:07:48.577758 master-0 kubenswrapper[29936]: I1205 13:07:48.577715 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/060bcbc7-c502-431d-8a7d-1f566a91f953-operator-scripts\") pod \"nova-cell1-0961-account-create-update-krxs2\" (UID: \"060bcbc7-c502-431d-8a7d-1f566a91f953\") " pod="openstack/nova-cell1-0961-account-create-update-krxs2" Dec 05 13:07:48.578232 master-0 kubenswrapper[29936]: I1205 13:07:48.578164 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xr9f\" (UniqueName: \"kubernetes.io/projected/060bcbc7-c502-431d-8a7d-1f566a91f953-kube-api-access-2xr9f\") pod \"nova-cell1-0961-account-create-update-krxs2\" (UID: \"060bcbc7-c502-431d-8a7d-1f566a91f953\") " pod="openstack/nova-cell1-0961-account-create-update-krxs2" Dec 05 13:07:48.621814 master-0 kubenswrapper[29936]: I1205 13:07:48.620518 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ee01710-7ad7-47e9-8268-09a33572ab6a-kube-api-access-ztlch" (OuterVolumeSpecName: "kube-api-access-ztlch") pod "8ee01710-7ad7-47e9-8268-09a33572ab6a" (UID: "8ee01710-7ad7-47e9-8268-09a33572ab6a"). InnerVolumeSpecName "kube-api-access-ztlch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:48.639471 master-0 kubenswrapper[29936]: I1205 13:07:48.639212 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8ee01710-7ad7-47e9-8268-09a33572ab6a" (UID: "8ee01710-7ad7-47e9-8268-09a33572ab6a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:48.696525 master-0 kubenswrapper[29936]: I1205 13:07:48.689130 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/060bcbc7-c502-431d-8a7d-1f566a91f953-operator-scripts\") pod \"nova-cell1-0961-account-create-update-krxs2\" (UID: \"060bcbc7-c502-431d-8a7d-1f566a91f953\") " pod="openstack/nova-cell1-0961-account-create-update-krxs2" Dec 05 13:07:48.696525 master-0 kubenswrapper[29936]: I1205 13:07:48.692101 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/060bcbc7-c502-431d-8a7d-1f566a91f953-operator-scripts\") pod \"nova-cell1-0961-account-create-update-krxs2\" (UID: \"060bcbc7-c502-431d-8a7d-1f566a91f953\") " pod="openstack/nova-cell1-0961-account-create-update-krxs2" Dec 05 13:07:48.696525 master-0 kubenswrapper[29936]: I1205 13:07:48.693858 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xr9f\" (UniqueName: \"kubernetes.io/projected/060bcbc7-c502-431d-8a7d-1f566a91f953-kube-api-access-2xr9f\") pod \"nova-cell1-0961-account-create-update-krxs2\" (UID: \"060bcbc7-c502-431d-8a7d-1f566a91f953\") " pod="openstack/nova-cell1-0961-account-create-update-krxs2" Dec 05 13:07:48.696525 master-0 kubenswrapper[29936]: I1205 13:07:48.694792 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mh6s6"] Dec 05 13:07:48.697006 master-0 kubenswrapper[29936]: I1205 13:07:48.696887 29936 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-httpd-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:48.697006 master-0 kubenswrapper[29936]: I1205 13:07:48.696934 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztlch\" (UniqueName: \"kubernetes.io/projected/8ee01710-7ad7-47e9-8268-09a33572ab6a-kube-api-access-ztlch\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:48.735065 master-0 kubenswrapper[29936]: I1205 13:07:48.733921 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xr9f\" (UniqueName: \"kubernetes.io/projected/060bcbc7-c502-431d-8a7d-1f566a91f953-kube-api-access-2xr9f\") pod \"nova-cell1-0961-account-create-update-krxs2\" (UID: \"060bcbc7-c502-431d-8a7d-1f566a91f953\") " pod="openstack/nova-cell1-0961-account-create-update-krxs2" Dec 05 13:07:48.786650 master-0 kubenswrapper[29936]: I1205 13:07:48.786596 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-dc7b-account-create-update-6mmc6"] Dec 05 13:07:48.794708 master-0 kubenswrapper[29936]: I1205 13:07:48.793644 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-config" (OuterVolumeSpecName: "config") pod "8ee01710-7ad7-47e9-8268-09a33572ab6a" (UID: "8ee01710-7ad7-47e9-8268-09a33572ab6a"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:48.799132 master-0 kubenswrapper[29936]: I1205 13:07:48.799052 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:48.835612 master-0 kubenswrapper[29936]: I1205 13:07:48.835269 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ee01710-7ad7-47e9-8268-09a33572ab6a" (UID: "8ee01710-7ad7-47e9-8268-09a33572ab6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:48.835932 master-0 kubenswrapper[29936]: I1205 13:07:48.835720 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0961-account-create-update-krxs2" Dec 05 13:07:48.871396 master-0 kubenswrapper[29936]: W1205 13:07:48.859127 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8eb8d4a_1ed8_445b_a455_e79b6659317f.slice/crio-bd4d0fe856bbf56a8c0ce0bf9feea954af1f889df7b673cf7869ec130cee9169 WatchSource:0}: Error finding container bd4d0fe856bbf56a8c0ce0bf9feea954af1f889df7b673cf7869ec130cee9169: Status 404 returned error can't find the container with id bd4d0fe856bbf56a8c0ce0bf9feea954af1f889df7b673cf7869ec130cee9169 Dec 05 13:07:48.880607 master-0 kubenswrapper[29936]: I1205 13:07:48.880526 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8ee01710-7ad7-47e9-8268-09a33572ab6a" (UID: "8ee01710-7ad7-47e9-8268-09a33572ab6a"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:48.911012 master-0 kubenswrapper[29936]: I1205 13:07:48.910895 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:48.911012 master-0 kubenswrapper[29936]: I1205 13:07:48.910975 29936 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ee01710-7ad7-47e9-8268-09a33572ab6a-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:49.021007 master-0 kubenswrapper[29936]: I1205 13:07:49.020818 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fkxvl"] Dec 05 13:07:49.352314 master-0 kubenswrapper[29936]: I1205 13:07:49.352250 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-cw997"] Dec 05 13:07:49.374739 master-0 kubenswrapper[29936]: W1205 13:07:49.373998 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03ed197d_e0d9_4970_8d1e_f79fa7c70697.slice/crio-0afb94c085487213e666e6188423baf3bcebcd3bac5826b68d461d8820363951 WatchSource:0}: Error finding container 0afb94c085487213e666e6188423baf3bcebcd3bac5826b68d461d8820363951: Status 404 returned error can't find the container with id 0afb94c085487213e666e6188423baf3bcebcd3bac5826b68d461d8820363951 Dec 05 13:07:49.428471 master-0 kubenswrapper[29936]: I1205 13:07:49.428152 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vt8w4"] Dec 05 13:07:49.429387 master-0 kubenswrapper[29936]: E1205 13:07:49.429354 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee01710-7ad7-47e9-8268-09a33572ab6a" containerName="neutron-httpd" Dec 05 13:07:49.429387 master-0 kubenswrapper[29936]: I1205 13:07:49.429387 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee01710-7ad7-47e9-8268-09a33572ab6a" containerName="neutron-httpd" Dec 05 13:07:49.429513 master-0 kubenswrapper[29936]: E1205 13:07:49.429435 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee01710-7ad7-47e9-8268-09a33572ab6a" containerName="neutron-api" Dec 05 13:07:49.429513 master-0 kubenswrapper[29936]: I1205 13:07:49.429446 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee01710-7ad7-47e9-8268-09a33572ab6a" containerName="neutron-api" Dec 05 13:07:49.442908 master-0 kubenswrapper[29936]: I1205 13:07:49.442573 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ee01710-7ad7-47e9-8268-09a33572ab6a" containerName="neutron-httpd" Dec 05 13:07:49.442908 master-0 kubenswrapper[29936]: I1205 13:07:49.442691 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ee01710-7ad7-47e9-8268-09a33572ab6a" containerName="neutron-api" Dec 05 13:07:49.457653 master-0 kubenswrapper[29936]: I1205 13:07:49.449148 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:07:49.482741 master-0 kubenswrapper[29936]: I1205 13:07:49.482653 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vt8w4"] Dec 05 13:07:49.542950 master-0 kubenswrapper[29936]: I1205 13:07:49.542877 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mh6s6" event={"ID":"6a056507-477d-4d77-9905-c7c6344e92ec","Type":"ContainerStarted","Data":"4aa2d067e8b6fd9504893e1a730dc4ccb017d44a02c83ca3e9e555d0819745e1"} Dec 05 13:07:49.545776 master-0 kubenswrapper[29936]: I1205 13:07:49.545740 29936 generic.go:334] "Generic (PLEG): container finished" podID="01d30809-19fa-4217-9b1b-e35e7504316e" containerID="9d6204ffa9c9da109cce9a10c42e3791e0ef6e7d747178a39d437fb322e9776e" exitCode=0 Dec 05 13:07:49.545856 master-0 kubenswrapper[29936]: I1205 13:07:49.545821 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"01d30809-19fa-4217-9b1b-e35e7504316e","Type":"ContainerDied","Data":"9d6204ffa9c9da109cce9a10c42e3791e0ef6e7d747178a39d437fb322e9776e"} Dec 05 13:07:49.556394 master-0 kubenswrapper[29936]: I1205 13:07:49.556256 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-backup-0" event={"ID":"0e8e3150-d45a-49f4-97d0-b45473ee80c5","Type":"ContainerStarted","Data":"df33963217959aa608a1776f1473121e3424c936c4793334643fa33003f9c19b"} Dec 05 13:07:49.563018 master-0 kubenswrapper[29936]: I1205 13:07:49.562928 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cw997" event={"ID":"03ed197d-e0d9-4970-8d1e-f79fa7c70697","Type":"ContainerStarted","Data":"0afb94c085487213e666e6188423baf3bcebcd3bac5826b68d461d8820363951"} Dec 05 13:07:49.571868 master-0 kubenswrapper[29936]: I1205 13:07:49.571451 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl4c7\" (UniqueName: \"kubernetes.io/projected/df7431e9-8625-453c-82d8-af5e79106c65-kube-api-access-fl4c7\") pod \"redhat-marketplace-vt8w4\" (UID: \"df7431e9-8625-453c-82d8-af5e79106c65\") " pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:07:49.571868 master-0 kubenswrapper[29936]: I1205 13:07:49.571613 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df7431e9-8625-453c-82d8-af5e79106c65-catalog-content\") pod \"redhat-marketplace-vt8w4\" (UID: \"df7431e9-8625-453c-82d8-af5e79106c65\") " pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:07:49.571868 master-0 kubenswrapper[29936]: I1205 13:07:49.571640 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df7431e9-8625-453c-82d8-af5e79106c65-utilities\") pod \"redhat-marketplace-vt8w4\" (UID: \"df7431e9-8625-453c-82d8-af5e79106c65\") " pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:07:49.579257 master-0 kubenswrapper[29936]: I1205 13:07:49.575938 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fkxvl" event={"ID":"d934457f-86b7-4ccf-b0da-8625268d2a56","Type":"ContainerStarted","Data":"24d6dd1f2edf2bd3725bcb39a29fe04ccba3cfc3bfe93127d27d12d76feb7686"} Dec 05 13:07:49.593618 master-0 kubenswrapper[29936]: I1205 13:07:49.593132 29936 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/neutron-5f5d648448-8m7zn" Dec 05 13:07:49.594663 master-0 kubenswrapper[29936]: I1205 13:07:49.594625 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-dc7b-account-create-update-6mmc6" event={"ID":"a8eb8d4a-1ed8-445b-a455-e79b6659317f","Type":"ContainerStarted","Data":"bd4d0fe856bbf56a8c0ce0bf9feea954af1f889df7b673cf7869ec130cee9169"} Dec 05 13:07:49.646718 master-0 kubenswrapper[29936]: I1205 13:07:49.644419 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gbdmx"] Dec 05 13:07:49.648202 master-0 kubenswrapper[29936]: I1205 13:07:49.648032 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:07:49.695799 master-0 kubenswrapper[29936]: I1205 13:07:49.695693 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl4c7\" (UniqueName: \"kubernetes.io/projected/df7431e9-8625-453c-82d8-af5e79106c65-kube-api-access-fl4c7\") pod \"redhat-marketplace-vt8w4\" (UID: \"df7431e9-8625-453c-82d8-af5e79106c65\") " pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:07:49.696137 master-0 kubenswrapper[29936]: I1205 13:07:49.696005 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df7431e9-8625-453c-82d8-af5e79106c65-catalog-content\") pod \"redhat-marketplace-vt8w4\" (UID: \"df7431e9-8625-453c-82d8-af5e79106c65\") " pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:07:49.696137 master-0 kubenswrapper[29936]: I1205 13:07:49.696029 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df7431e9-8625-453c-82d8-af5e79106c65-utilities\") pod \"redhat-marketplace-vt8w4\" (UID: \"df7431e9-8625-453c-82d8-af5e79106c65\") " pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:07:49.696901 master-0 kubenswrapper[29936]: I1205 13:07:49.696856 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df7431e9-8625-453c-82d8-af5e79106c65-utilities\") pod \"redhat-marketplace-vt8w4\" (UID: \"df7431e9-8625-453c-82d8-af5e79106c65\") " pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:07:49.698277 master-0 kubenswrapper[29936]: I1205 13:07:49.698207 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gbdmx"] Dec 05 13:07:49.709543 master-0 kubenswrapper[29936]: I1205 13:07:49.709470 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df7431e9-8625-453c-82d8-af5e79106c65-catalog-content\") pod \"redhat-marketplace-vt8w4\" (UID: \"df7431e9-8625-453c-82d8-af5e79106c65\") " pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:07:49.725367 master-0 kubenswrapper[29936]: I1205 13:07:49.724143 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b154-account-create-update-9nrbh"] Dec 05 13:07:49.742072 master-0 kubenswrapper[29936]: I1205 13:07:49.741200 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl4c7\" (UniqueName: \"kubernetes.io/projected/df7431e9-8625-453c-82d8-af5e79106c65-kube-api-access-fl4c7\") pod \"redhat-marketplace-vt8w4\" (UID: 
\"df7431e9-8625-453c-82d8-af5e79106c65\") " pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:07:49.787090 master-0 kubenswrapper[29936]: I1205 13:07:49.786989 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5f5d648448-8m7zn"] Dec 05 13:07:49.797506 master-0 kubenswrapper[29936]: I1205 13:07:49.792835 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:07:49.803865 master-0 kubenswrapper[29936]: I1205 13:07:49.802474 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swrwj\" (UniqueName: \"kubernetes.io/projected/04779889-6879-4759-8928-4d0b0ed2eae2-kube-api-access-swrwj\") pod \"redhat-operators-gbdmx\" (UID: \"04779889-6879-4759-8928-4d0b0ed2eae2\") " pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:07:49.803865 master-0 kubenswrapper[29936]: I1205 13:07:49.802599 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04779889-6879-4759-8928-4d0b0ed2eae2-catalog-content\") pod \"redhat-operators-gbdmx\" (UID: \"04779889-6879-4759-8928-4d0b0ed2eae2\") " pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:07:49.803865 master-0 kubenswrapper[29936]: I1205 13:07:49.802643 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04779889-6879-4759-8928-4d0b0ed2eae2-utilities\") pod \"redhat-operators-gbdmx\" (UID: \"04779889-6879-4759-8928-4d0b0ed2eae2\") " pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:07:49.809585 master-0 kubenswrapper[29936]: I1205 13:07:49.809451 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5f5d648448-8m7zn"] Dec 05 13:07:49.939963 master-0 kubenswrapper[29936]: I1205 13:07:49.938834 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swrwj\" (UniqueName: \"kubernetes.io/projected/04779889-6879-4759-8928-4d0b0ed2eae2-kube-api-access-swrwj\") pod \"redhat-operators-gbdmx\" (UID: \"04779889-6879-4759-8928-4d0b0ed2eae2\") " pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:07:49.940851 master-0 kubenswrapper[29936]: I1205 13:07:49.940058 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04779889-6879-4759-8928-4d0b0ed2eae2-catalog-content\") pod \"redhat-operators-gbdmx\" (UID: \"04779889-6879-4759-8928-4d0b0ed2eae2\") " pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:07:49.940851 master-0 kubenswrapper[29936]: I1205 13:07:49.940157 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04779889-6879-4759-8928-4d0b0ed2eae2-utilities\") pod \"redhat-operators-gbdmx\" (UID: \"04779889-6879-4759-8928-4d0b0ed2eae2\") " pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:07:49.941562 master-0 kubenswrapper[29936]: I1205 13:07:49.941452 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04779889-6879-4759-8928-4d0b0ed2eae2-utilities\") pod \"redhat-operators-gbdmx\" (UID: \"04779889-6879-4759-8928-4d0b0ed2eae2\") " pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:07:49.942962 master-0 
kubenswrapper[29936]: I1205 13:07:49.942322 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04779889-6879-4759-8928-4d0b0ed2eae2-catalog-content\") pod \"redhat-operators-gbdmx\" (UID: \"04779889-6879-4759-8928-4d0b0ed2eae2\") " pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:07:49.959408 master-0 kubenswrapper[29936]: I1205 13:07:49.959305 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0961-account-create-update-krxs2"] Dec 05 13:07:49.976638 master-0 kubenswrapper[29936]: I1205 13:07:49.976504 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swrwj\" (UniqueName: \"kubernetes.io/projected/04779889-6879-4759-8928-4d0b0ed2eae2-kube-api-access-swrwj\") pod \"redhat-operators-gbdmx\" (UID: \"04779889-6879-4759-8928-4d0b0ed2eae2\") " pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:07:50.030795 master-0 kubenswrapper[29936]: W1205 13:07:50.030738 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod060bcbc7_c502_431d_8a7d_1f566a91f953.slice/crio-759a5df3f1ef7ebaad2c8ac2751ad2ed0b13b69e11d40aa455756bf61e509db7 WatchSource:0}: Error finding container 759a5df3f1ef7ebaad2c8ac2751ad2ed0b13b69e11d40aa455756bf61e509db7: Status 404 returned error can't find the container with id 759a5df3f1ef7ebaad2c8ac2751ad2ed0b13b69e11d40aa455756bf61e509db7 Dec 05 13:07:50.624670 master-0 kubenswrapper[29936]: I1205 13:07:50.624584 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0961-account-create-update-krxs2" event={"ID":"060bcbc7-c502-431d-8a7d-1f566a91f953","Type":"ContainerStarted","Data":"759a5df3f1ef7ebaad2c8ac2751ad2ed0b13b69e11d40aa455756bf61e509db7"} Dec 05 13:07:50.656916 master-0 kubenswrapper[29936]: I1205 13:07:50.656541 29936 generic.go:334] "Generic (PLEG): container finished" podID="d934457f-86b7-4ccf-b0da-8625268d2a56" containerID="e9fd626327750923d02fea0a7f9307cf165b248f43281931e05023f17de4dfed" exitCode=0 Dec 05 13:07:50.656916 master-0 kubenswrapper[29936]: I1205 13:07:50.656710 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fkxvl" event={"ID":"d934457f-86b7-4ccf-b0da-8625268d2a56","Type":"ContainerDied","Data":"e9fd626327750923d02fea0a7f9307cf165b248f43281931e05023f17de4dfed"} Dec 05 13:07:50.701102 master-0 kubenswrapper[29936]: I1205 13:07:50.701007 29936 generic.go:334] "Generic (PLEG): container finished" podID="a8eb8d4a-1ed8-445b-a455-e79b6659317f" containerID="e7c38d0df0df18cd7211f2a820ebdcdadb9f5f3b626e516a401a0a7252c4cd9e" exitCode=0 Dec 05 13:07:50.701559 master-0 kubenswrapper[29936]: I1205 13:07:50.701157 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-dc7b-account-create-update-6mmc6" event={"ID":"a8eb8d4a-1ed8-445b-a455-e79b6659317f","Type":"ContainerDied","Data":"e7c38d0df0df18cd7211f2a820ebdcdadb9f5f3b626e516a401a0a7252c4cd9e"} Dec 05 13:07:50.725059 master-0 kubenswrapper[29936]: I1205 13:07:50.723747 29936 generic.go:334] "Generic (PLEG): container finished" podID="6a056507-477d-4d77-9905-c7c6344e92ec" containerID="ab0b4a7481da1a51a4001eaa1507a74ccee97a6a809b0bf807d5a3d863b56523" exitCode=0 Dec 05 13:07:50.725059 master-0 kubenswrapper[29936]: I1205 13:07:50.723851 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mh6s6" 
event={"ID":"6a056507-477d-4d77-9905-c7c6344e92ec","Type":"ContainerDied","Data":"ab0b4a7481da1a51a4001eaa1507a74ccee97a6a809b0bf807d5a3d863b56523"} Dec 05 13:07:50.728963 master-0 kubenswrapper[29936]: I1205 13:07:50.727120 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"01d30809-19fa-4217-9b1b-e35e7504316e","Type":"ContainerDied","Data":"bf86107cf084d74a32e2df6ccc00eab7d1f5d5f4151d86d40fc69c97f64f5e32"} Dec 05 13:07:50.728963 master-0 kubenswrapper[29936]: I1205 13:07:50.727241 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf86107cf084d74a32e2df6ccc00eab7d1f5d5f4151d86d40fc69c97f64f5e32" Dec 05 13:07:50.735613 master-0 kubenswrapper[29936]: I1205 13:07:50.734415 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b154-account-create-update-9nrbh" event={"ID":"60288a03-cb34-45c0-a727-0d822e01d9e8","Type":"ContainerStarted","Data":"4000bbb7789065388fe3e20fce0cc5e280ffb8873a06a294c22865368909273b"} Dec 05 13:07:50.745229 master-0 kubenswrapper[29936]: I1205 13:07:50.742364 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b46d8-backup-0" event={"ID":"0e8e3150-d45a-49f4-97d0-b45473ee80c5","Type":"ContainerStarted","Data":"60412ccb943459f08fff6f9e39e82ef874e056c50166f6629e982e25fc8defc3"} Dec 05 13:07:50.753716 master-0 kubenswrapper[29936]: I1205 13:07:50.751438 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:07:50.789304 master-0 kubenswrapper[29936]: I1205 13:07:50.784797 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" event={"ID":"6046b9e1-6a97-47a9-a88b-772270e4cdaf","Type":"ContainerStarted","Data":"378b55a682e4564b915ba4bef2cf9a5345a60e9b2084e132467dc39d8de40969"} Dec 05 13:07:50.789304 master-0 kubenswrapper[29936]: I1205 13:07:50.788074 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:50.839882 master-0 kubenswrapper[29936]: I1205 13:07:50.839815 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Dec 05 13:07:50.842641 master-0 kubenswrapper[29936]: I1205 13:07:50.842574 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vt8w4"] Dec 05 13:07:50.869378 master-0 kubenswrapper[29936]: I1205 13:07:50.869261 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-b46d8-backup-0" podStartSLOduration=13.869235381 podStartE2EDuration="13.869235381s" podCreationTimestamp="2025-12-05 13:07:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:07:50.810109815 +0000 UTC m=+1067.942189496" watchObservedRunningTime="2025-12-05 13:07:50.869235381 +0000 UTC m=+1068.001315062" Dec 05 13:07:50.892848 master-0 kubenswrapper[29936]: I1205 13:07:50.892694 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-combined-ca-bundle\") pod \"01d30809-19fa-4217-9b1b-e35e7504316e\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " Dec 05 13:07:50.893360 master-0 kubenswrapper[29936]: I1205 13:07:50.893007 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/01d30809-19fa-4217-9b1b-e35e7504316e-var-lib-ironic\") pod \"01d30809-19fa-4217-9b1b-e35e7504316e\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " Dec 05 13:07:50.893360 master-0 kubenswrapper[29936]: I1205 13:07:50.893084 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-scripts\") pod \"01d30809-19fa-4217-9b1b-e35e7504316e\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " Dec 05 13:07:50.897266 master-0 kubenswrapper[29936]: I1205 13:07:50.893691 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/01d30809-19fa-4217-9b1b-e35e7504316e-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"01d30809-19fa-4217-9b1b-e35e7504316e\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " Dec 05 13:07:50.897266 master-0 kubenswrapper[29936]: I1205 13:07:50.893765 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/01d30809-19fa-4217-9b1b-e35e7504316e-etc-podinfo\") pod \"01d30809-19fa-4217-9b1b-e35e7504316e\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " Dec 05 13:07:50.897266 master-0 kubenswrapper[29936]: I1205 13:07:50.894047 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntsw7\" (UniqueName: \"kubernetes.io/projected/01d30809-19fa-4217-9b1b-e35e7504316e-kube-api-access-ntsw7\") pod \"01d30809-19fa-4217-9b1b-e35e7504316e\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " Dec 05 13:07:50.897266 master-0 kubenswrapper[29936]: I1205 13:07:50.894167 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-config\") pod \"01d30809-19fa-4217-9b1b-e35e7504316e\" (UID: \"01d30809-19fa-4217-9b1b-e35e7504316e\") " Dec 05 13:07:50.897266 master-0 kubenswrapper[29936]: I1205 13:07:50.894667 29936 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01d30809-19fa-4217-9b1b-e35e7504316e-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "01d30809-19fa-4217-9b1b-e35e7504316e" (UID: "01d30809-19fa-4217-9b1b-e35e7504316e"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:07:50.897266 master-0 kubenswrapper[29936]: I1205 13:07:50.895730 29936 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/01d30809-19fa-4217-9b1b-e35e7504316e-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:50.897266 master-0 kubenswrapper[29936]: I1205 13:07:50.896290 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01d30809-19fa-4217-9b1b-e35e7504316e-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "01d30809-19fa-4217-9b1b-e35e7504316e" (UID: "01d30809-19fa-4217-9b1b-e35e7504316e"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:07:50.901956 master-0 kubenswrapper[29936]: I1205 13:07:50.901782 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/01d30809-19fa-4217-9b1b-e35e7504316e-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "01d30809-19fa-4217-9b1b-e35e7504316e" (UID: "01d30809-19fa-4217-9b1b-e35e7504316e"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 13:07:50.904393 master-0 kubenswrapper[29936]: I1205 13:07:50.904018 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-config" (OuterVolumeSpecName: "config") pod "01d30809-19fa-4217-9b1b-e35e7504316e" (UID: "01d30809-19fa-4217-9b1b-e35e7504316e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:50.905348 master-0 kubenswrapper[29936]: I1205 13:07:50.905281 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d30809-19fa-4217-9b1b-e35e7504316e-kube-api-access-ntsw7" (OuterVolumeSpecName: "kube-api-access-ntsw7") pod "01d30809-19fa-4217-9b1b-e35e7504316e" (UID: "01d30809-19fa-4217-9b1b-e35e7504316e"). InnerVolumeSpecName "kube-api-access-ntsw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:50.917634 master-0 kubenswrapper[29936]: I1205 13:07:50.917506 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-scripts" (OuterVolumeSpecName: "scripts") pod "01d30809-19fa-4217-9b1b-e35e7504316e" (UID: "01d30809-19fa-4217-9b1b-e35e7504316e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:50.936298 master-0 kubenswrapper[29936]: I1205 13:07:50.930463 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" podStartSLOduration=18.930433261 podStartE2EDuration="18.930433261s" podCreationTimestamp="2025-12-05 13:07:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:07:50.843832256 +0000 UTC m=+1067.975911947" watchObservedRunningTime="2025-12-05 13:07:50.930433261 +0000 UTC m=+1068.062512942" Dec 05 13:07:50.982884 master-0 kubenswrapper[29936]: I1205 13:07:50.980905 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01d30809-19fa-4217-9b1b-e35e7504316e" (UID: "01d30809-19fa-4217-9b1b-e35e7504316e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:51.016294 master-0 kubenswrapper[29936]: I1205 13:07:51.003487 29936 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/01d30809-19fa-4217-9b1b-e35e7504316e-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:51.016294 master-0 kubenswrapper[29936]: I1205 13:07:51.003548 29936 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/01d30809-19fa-4217-9b1b-e35e7504316e-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:51.016294 master-0 kubenswrapper[29936]: I1205 13:07:51.003564 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntsw7\" (UniqueName: \"kubernetes.io/projected/01d30809-19fa-4217-9b1b-e35e7504316e-kube-api-access-ntsw7\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:51.016294 master-0 kubenswrapper[29936]: I1205 13:07:51.003577 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:51.016294 master-0 kubenswrapper[29936]: I1205 13:07:51.003590 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:51.016294 master-0 kubenswrapper[29936]: I1205 13:07:51.003599 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01d30809-19fa-4217-9b1b-e35e7504316e-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:51.235171 master-0 kubenswrapper[29936]: I1205 13:07:51.235025 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ee01710-7ad7-47e9-8268-09a33572ab6a" path="/var/lib/kubelet/pods/8ee01710-7ad7-47e9-8268-09a33572ab6a/volumes" Dec 05 13:07:51.424895 master-0 kubenswrapper[29936]: I1205 13:07:51.424605 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:07:51.425160 master-0 kubenswrapper[29936]: I1205 13:07:51.424954 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b46d8-default-external-api-0" podUID="8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" 
containerName="glance-log" containerID="cri-o://32763bbc7419a7e1ea06531624d0b3477a23d9b540fc3fbe12ca7af9e3c4701b" gracePeriod=30 Dec 05 13:07:51.425584 master-0 kubenswrapper[29936]: I1205 13:07:51.425493 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b46d8-default-external-api-0" podUID="8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" containerName="glance-httpd" containerID="cri-o://3d7fd03fb9af5943d7ca0daceaacc49e1f67c092ce546e5fe5caa7835b1d52b4" gracePeriod=30 Dec 05 13:07:51.480504 master-0 kubenswrapper[29936]: I1205 13:07:51.479563 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gbdmx"] Dec 05 13:07:51.809244 master-0 kubenswrapper[29936]: I1205 13:07:51.807520 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gbdmx" event={"ID":"04779889-6879-4759-8928-4d0b0ed2eae2","Type":"ContainerStarted","Data":"5f9136aadda3bfbe68d096a3d99b2dc3af0af20d0e830746d9e01ce84421b0b7"} Dec 05 13:07:51.817202 master-0 kubenswrapper[29936]: I1205 13:07:51.815656 29936 generic.go:334] "Generic (PLEG): container finished" podID="03ed197d-e0d9-4970-8d1e-f79fa7c70697" containerID="d85dbcc812d584e3a0a4ecf780f8f03295299680afe6ef9c172753a77e815f75" exitCode=0 Dec 05 13:07:51.817202 master-0 kubenswrapper[29936]: I1205 13:07:51.815829 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cw997" event={"ID":"03ed197d-e0d9-4970-8d1e-f79fa7c70697","Type":"ContainerDied","Data":"d85dbcc812d584e3a0a4ecf780f8f03295299680afe6ef9c172753a77e815f75"} Dec 05 13:07:51.822206 master-0 kubenswrapper[29936]: I1205 13:07:51.819577 29936 generic.go:334] "Generic (PLEG): container finished" podID="8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" containerID="32763bbc7419a7e1ea06531624d0b3477a23d9b540fc3fbe12ca7af9e3c4701b" exitCode=143 Dec 05 13:07:51.822206 master-0 kubenswrapper[29936]: I1205 13:07:51.819680 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-external-api-0" event={"ID":"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f","Type":"ContainerDied","Data":"32763bbc7419a7e1ea06531624d0b3477a23d9b540fc3fbe12ca7af9e3c4701b"} Dec 05 13:07:51.822206 master-0 kubenswrapper[29936]: I1205 13:07:51.821941 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0961-account-create-update-krxs2" event={"ID":"060bcbc7-c502-431d-8a7d-1f566a91f953","Type":"ContainerStarted","Data":"e86e7007a55ed024279e5d4b913060ea7f3399573acd4d84012481b69b00aef8"} Dec 05 13:07:51.847203 master-0 kubenswrapper[29936]: I1205 13:07:51.843498 29936 generic.go:334] "Generic (PLEG): container finished" podID="df7431e9-8625-453c-82d8-af5e79106c65" containerID="c52eed55412468e9aa9d8e1b60e274a8fd502a24537c226d30a722785012443e" exitCode=0 Dec 05 13:07:51.847203 master-0 kubenswrapper[29936]: I1205 13:07:51.843582 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vt8w4" event={"ID":"df7431e9-8625-453c-82d8-af5e79106c65","Type":"ContainerDied","Data":"c52eed55412468e9aa9d8e1b60e274a8fd502a24537c226d30a722785012443e"} Dec 05 13:07:51.847203 master-0 kubenswrapper[29936]: I1205 13:07:51.843618 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vt8w4" event={"ID":"df7431e9-8625-453c-82d8-af5e79106c65","Type":"ContainerStarted","Data":"7e186d7cf14d57ec154600822a40e6ee8d79f5d0ce721be347ca5d5344b6c241"} Dec 05 13:07:51.852202 master-0 
kubenswrapper[29936]: I1205 13:07:51.849489 29936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 13:07:51.857282 master-0 kubenswrapper[29936]: I1205 13:07:51.854466 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b154-account-create-update-9nrbh" event={"ID":"60288a03-cb34-45c0-a727-0d822e01d9e8","Type":"ContainerStarted","Data":"85250c2847b34fb5877f76749df89d7236a0c4712b66a06086954f5d107a490f"} Dec 05 13:07:51.857282 master-0 kubenswrapper[29936]: I1205 13:07:51.855725 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Dec 05 13:07:51.967220 master-0 kubenswrapper[29936]: I1205 13:07:51.966469 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-0961-account-create-update-krxs2" podStartSLOduration=4.96644087 podStartE2EDuration="4.96644087s" podCreationTimestamp="2025-12-05 13:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:07:51.887700027 +0000 UTC m=+1069.019779708" watchObservedRunningTime="2025-12-05 13:07:51.96644087 +0000 UTC m=+1069.098520551" Dec 05 13:07:52.271294 master-0 kubenswrapper[29936]: I1205 13:07:52.268706 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Dec 05 13:07:52.307213 master-0 kubenswrapper[29936]: I1205 13:07:52.302595 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-0"] Dec 05 13:07:52.331218 master-0 kubenswrapper[29936]: I1205 13:07:52.325648 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Dec 05 13:07:52.331218 master-0 kubenswrapper[29936]: E1205 13:07:52.327066 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d30809-19fa-4217-9b1b-e35e7504316e" containerName="ironic-python-agent-init" Dec 05 13:07:52.331218 master-0 kubenswrapper[29936]: I1205 13:07:52.327097 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d30809-19fa-4217-9b1b-e35e7504316e" containerName="ironic-python-agent-init" Dec 05 13:07:52.331218 master-0 kubenswrapper[29936]: I1205 13:07:52.327614 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d30809-19fa-4217-9b1b-e35e7504316e" containerName="ironic-python-agent-init" Dec 05 13:07:52.340217 master-0 kubenswrapper[29936]: I1205 13:07:52.334517 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Dec 05 13:07:52.354212 master-0 kubenswrapper[29936]: I1205 13:07:52.345423 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Dec 05 13:07:52.354212 master-0 kubenswrapper[29936]: I1205 13:07:52.345610 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Dec 05 13:07:52.354212 master-0 kubenswrapper[29936]: I1205 13:07:52.345854 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-internal-svc" Dec 05 13:07:52.354212 master-0 kubenswrapper[29936]: I1205 13:07:52.345980 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-public-svc" Dec 05 13:07:52.354212 master-0 kubenswrapper[29936]: I1205 13:07:52.346203 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Dec 05 13:07:52.354212 master-0 kubenswrapper[29936]: I1205 13:07:52.347821 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Dec 05 13:07:52.416580 master-0 kubenswrapper[29936]: I1205 13:07:52.416496 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af1fc46b-7167-4446-9587-7ef591b4e661-config\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.419402 master-0 kubenswrapper[29936]: I1205 13:07:52.419343 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1fc46b-7167-4446-9587-7ef591b4e661-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.419532 master-0 kubenswrapper[29936]: I1205 13:07:52.419427 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1fc46b-7167-4446-9587-7ef591b4e661-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.419694 master-0 kubenswrapper[29936]: I1205 13:07:52.419662 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/af1fc46b-7167-4446-9587-7ef591b4e661-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.419765 master-0 kubenswrapper[29936]: I1205 13:07:52.419703 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af1fc46b-7167-4446-9587-7ef591b4e661-scripts\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.419765 master-0 kubenswrapper[29936]: I1205 13:07:52.419746 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/af1fc46b-7167-4446-9587-7ef591b4e661-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " 
pod="openstack/ironic-inspector-0" Dec 05 13:07:52.419968 master-0 kubenswrapper[29936]: I1205 13:07:52.419934 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/af1fc46b-7167-4446-9587-7ef591b4e661-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.420029 master-0 kubenswrapper[29936]: I1205 13:07:52.420003 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1fc46b-7167-4446-9587-7ef591b4e661-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.420094 master-0 kubenswrapper[29936]: I1205 13:07:52.420075 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msst9\" (UniqueName: \"kubernetes.io/projected/af1fc46b-7167-4446-9587-7ef591b4e661-kube-api-access-msst9\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.525802 master-0 kubenswrapper[29936]: I1205 13:07:52.523271 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/af1fc46b-7167-4446-9587-7ef591b4e661-config\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.525802 master-0 kubenswrapper[29936]: I1205 13:07:52.523630 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1fc46b-7167-4446-9587-7ef591b4e661-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.525802 master-0 kubenswrapper[29936]: I1205 13:07:52.523659 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1fc46b-7167-4446-9587-7ef591b4e661-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.525802 master-0 kubenswrapper[29936]: I1205 13:07:52.523790 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/af1fc46b-7167-4446-9587-7ef591b4e661-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.525802 master-0 kubenswrapper[29936]: I1205 13:07:52.523815 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af1fc46b-7167-4446-9587-7ef591b4e661-scripts\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.525802 master-0 kubenswrapper[29936]: I1205 13:07:52.523849 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/af1fc46b-7167-4446-9587-7ef591b4e661-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " 
pod="openstack/ironic-inspector-0" Dec 05 13:07:52.525802 master-0 kubenswrapper[29936]: I1205 13:07:52.523972 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/af1fc46b-7167-4446-9587-7ef591b4e661-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.525802 master-0 kubenswrapper[29936]: I1205 13:07:52.524081 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1fc46b-7167-4446-9587-7ef591b4e661-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.525802 master-0 kubenswrapper[29936]: I1205 13:07:52.524126 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msst9\" (UniqueName: \"kubernetes.io/projected/af1fc46b-7167-4446-9587-7ef591b4e661-kube-api-access-msst9\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.525802 master-0 kubenswrapper[29936]: I1205 13:07:52.525514 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/af1fc46b-7167-4446-9587-7ef591b4e661-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.534526 master-0 kubenswrapper[29936]: I1205 13:07:52.534359 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1fc46b-7167-4446-9587-7ef591b4e661-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.535784 master-0 kubenswrapper[29936]: I1205 13:07:52.535713 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/af1fc46b-7167-4446-9587-7ef591b4e661-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.536135 master-0 kubenswrapper[29936]: I1205 13:07:52.536100 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/af1fc46b-7167-4446-9587-7ef591b4e661-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.536453 master-0 kubenswrapper[29936]: I1205 13:07:52.536397 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/af1fc46b-7167-4446-9587-7ef591b4e661-config\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.537709 master-0 kubenswrapper[29936]: I1205 13:07:52.537669 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/af1fc46b-7167-4446-9587-7ef591b4e661-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.540229 
master-0 kubenswrapper[29936]: I1205 13:07:52.540171 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af1fc46b-7167-4446-9587-7ef591b4e661-scripts\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.542010 master-0 kubenswrapper[29936]: I1205 13:07:52.541851 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1fc46b-7167-4446-9587-7ef591b4e661-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.547992 master-0 kubenswrapper[29936]: I1205 13:07:52.544803 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msst9\" (UniqueName: \"kubernetes.io/projected/af1fc46b-7167-4446-9587-7ef591b4e661-kube-api-access-msst9\") pod \"ironic-inspector-0\" (UID: \"af1fc46b-7167-4446-9587-7ef591b4e661\") " pod="openstack/ironic-inspector-0" Dec 05 13:07:52.645303 master-0 kubenswrapper[29936]: I1205 13:07:52.645215 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mh6s6" Dec 05 13:07:52.702565 master-0 kubenswrapper[29936]: I1205 13:07:52.702442 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Dec 05 13:07:52.730114 master-0 kubenswrapper[29936]: I1205 13:07:52.729484 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a056507-477d-4d77-9905-c7c6344e92ec-operator-scripts\") pod \"6a056507-477d-4d77-9905-c7c6344e92ec\" (UID: \"6a056507-477d-4d77-9905-c7c6344e92ec\") " Dec 05 13:07:52.730624 master-0 kubenswrapper[29936]: I1205 13:07:52.730343 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6hzs\" (UniqueName: \"kubernetes.io/projected/6a056507-477d-4d77-9905-c7c6344e92ec-kube-api-access-p6hzs\") pod \"6a056507-477d-4d77-9905-c7c6344e92ec\" (UID: \"6a056507-477d-4d77-9905-c7c6344e92ec\") " Dec 05 13:07:52.731847 master-0 kubenswrapper[29936]: I1205 13:07:52.731668 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a056507-477d-4d77-9905-c7c6344e92ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6a056507-477d-4d77-9905-c7c6344e92ec" (UID: "6a056507-477d-4d77-9905-c7c6344e92ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:52.733423 master-0 kubenswrapper[29936]: I1205 13:07:52.733332 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a056507-477d-4d77-9905-c7c6344e92ec-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:52.739658 master-0 kubenswrapper[29936]: I1205 13:07:52.739549 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a056507-477d-4d77-9905-c7c6344e92ec-kube-api-access-p6hzs" (OuterVolumeSpecName: "kube-api-access-p6hzs") pod "6a056507-477d-4d77-9905-c7c6344e92ec" (UID: "6a056507-477d-4d77-9905-c7c6344e92ec"). InnerVolumeSpecName "kube-api-access-p6hzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:52.838561 master-0 kubenswrapper[29936]: I1205 13:07:52.838475 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6hzs\" (UniqueName: \"kubernetes.io/projected/6a056507-477d-4d77-9905-c7c6344e92ec-kube-api-access-p6hzs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:52.872046 master-0 kubenswrapper[29936]: I1205 13:07:52.871984 29936 generic.go:334] "Generic (PLEG): container finished" podID="060bcbc7-c502-431d-8a7d-1f566a91f953" containerID="e86e7007a55ed024279e5d4b913060ea7f3399573acd4d84012481b69b00aef8" exitCode=0 Dec 05 13:07:52.872192 master-0 kubenswrapper[29936]: I1205 13:07:52.872110 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0961-account-create-update-krxs2" event={"ID":"060bcbc7-c502-431d-8a7d-1f566a91f953","Type":"ContainerDied","Data":"e86e7007a55ed024279e5d4b913060ea7f3399573acd4d84012481b69b00aef8"} Dec 05 13:07:52.880412 master-0 kubenswrapper[29936]: I1205 13:07:52.877120 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fkxvl" event={"ID":"d934457f-86b7-4ccf-b0da-8625268d2a56","Type":"ContainerDied","Data":"24d6dd1f2edf2bd3725bcb39a29fe04ccba3cfc3bfe93127d27d12d76feb7686"} Dec 05 13:07:52.880412 master-0 kubenswrapper[29936]: I1205 13:07:52.877244 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24d6dd1f2edf2bd3725bcb39a29fe04ccba3cfc3bfe93127d27d12d76feb7686" Dec 05 13:07:52.880412 master-0 kubenswrapper[29936]: I1205 13:07:52.879304 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-dc7b-account-create-update-6mmc6" event={"ID":"a8eb8d4a-1ed8-445b-a455-e79b6659317f","Type":"ContainerDied","Data":"bd4d0fe856bbf56a8c0ce0bf9feea954af1f889df7b673cf7869ec130cee9169"} Dec 05 13:07:52.880412 master-0 kubenswrapper[29936]: I1205 13:07:52.879360 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd4d0fe856bbf56a8c0ce0bf9feea954af1f889df7b673cf7869ec130cee9169" Dec 05 13:07:52.884270 master-0 kubenswrapper[29936]: I1205 13:07:52.881667 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mh6s6" event={"ID":"6a056507-477d-4d77-9905-c7c6344e92ec","Type":"ContainerDied","Data":"4aa2d067e8b6fd9504893e1a730dc4ccb017d44a02c83ca3e9e555d0819745e1"} Dec 05 13:07:52.884270 master-0 kubenswrapper[29936]: I1205 13:07:52.881696 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4aa2d067e8b6fd9504893e1a730dc4ccb017d44a02c83ca3e9e555d0819745e1" Dec 05 13:07:52.884270 master-0 kubenswrapper[29936]: I1205 13:07:52.881775 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mh6s6" Dec 05 13:07:52.886125 master-0 kubenswrapper[29936]: I1205 13:07:52.885939 29936 generic.go:334] "Generic (PLEG): container finished" podID="60288a03-cb34-45c0-a727-0d822e01d9e8" containerID="85250c2847b34fb5877f76749df89d7236a0c4712b66a06086954f5d107a490f" exitCode=0 Dec 05 13:07:52.886125 master-0 kubenswrapper[29936]: I1205 13:07:52.886033 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b154-account-create-update-9nrbh" event={"ID":"60288a03-cb34-45c0-a727-0d822e01d9e8","Type":"ContainerDied","Data":"85250c2847b34fb5877f76749df89d7236a0c4712b66a06086954f5d107a490f"} Dec 05 13:07:52.887741 master-0 kubenswrapper[29936]: I1205 13:07:52.887701 29936 generic.go:334] "Generic (PLEG): container finished" podID="04779889-6879-4759-8928-4d0b0ed2eae2" containerID="e0b3151d38a97d8df9eb9d93526631002933582acba759042db83ae719d71acb" exitCode=0 Dec 05 13:07:52.889236 master-0 kubenswrapper[29936]: I1205 13:07:52.888334 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gbdmx" event={"ID":"04779889-6879-4759-8928-4d0b0ed2eae2","Type":"ContainerDied","Data":"e0b3151d38a97d8df9eb9d93526631002933582acba759042db83ae719d71acb"} Dec 05 13:07:52.900059 master-0 kubenswrapper[29936]: I1205 13:07:52.899991 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-dc7b-account-create-update-6mmc6" Dec 05 13:07:52.914129 master-0 kubenswrapper[29936]: I1205 13:07:52.914020 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fkxvl" Dec 05 13:07:52.940540 master-0 kubenswrapper[29936]: I1205 13:07:52.940199 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8eb8d4a-1ed8-445b-a455-e79b6659317f-operator-scripts\") pod \"a8eb8d4a-1ed8-445b-a455-e79b6659317f\" (UID: \"a8eb8d4a-1ed8-445b-a455-e79b6659317f\") " Dec 05 13:07:52.941974 master-0 kubenswrapper[29936]: I1205 13:07:52.941885 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwdhc\" (UniqueName: \"kubernetes.io/projected/a8eb8d4a-1ed8-445b-a455-e79b6659317f-kube-api-access-jwdhc\") pod \"a8eb8d4a-1ed8-445b-a455-e79b6659317f\" (UID: \"a8eb8d4a-1ed8-445b-a455-e79b6659317f\") " Dec 05 13:07:52.944645 master-0 kubenswrapper[29936]: I1205 13:07:52.941921 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8eb8d4a-1ed8-445b-a455-e79b6659317f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a8eb8d4a-1ed8-445b-a455-e79b6659317f" (UID: "a8eb8d4a-1ed8-445b-a455-e79b6659317f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:52.949759 master-0 kubenswrapper[29936]: I1205 13:07:52.949679 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8eb8d4a-1ed8-445b-a455-e79b6659317f-kube-api-access-jwdhc" (OuterVolumeSpecName: "kube-api-access-jwdhc") pod "a8eb8d4a-1ed8-445b-a455-e79b6659317f" (UID: "a8eb8d4a-1ed8-445b-a455-e79b6659317f"). InnerVolumeSpecName "kube-api-access-jwdhc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:53.052849 master-0 kubenswrapper[29936]: I1205 13:07:53.052659 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d934457f-86b7-4ccf-b0da-8625268d2a56-operator-scripts\") pod \"d934457f-86b7-4ccf-b0da-8625268d2a56\" (UID: \"d934457f-86b7-4ccf-b0da-8625268d2a56\") " Dec 05 13:07:53.053192 master-0 kubenswrapper[29936]: I1205 13:07:53.053130 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzd7z\" (UniqueName: \"kubernetes.io/projected/d934457f-86b7-4ccf-b0da-8625268d2a56-kube-api-access-mzd7z\") pod \"d934457f-86b7-4ccf-b0da-8625268d2a56\" (UID: \"d934457f-86b7-4ccf-b0da-8625268d2a56\") " Dec 05 13:07:53.054431 master-0 kubenswrapper[29936]: I1205 13:07:53.053956 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d934457f-86b7-4ccf-b0da-8625268d2a56-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d934457f-86b7-4ccf-b0da-8625268d2a56" (UID: "d934457f-86b7-4ccf-b0da-8625268d2a56"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:53.056081 master-0 kubenswrapper[29936]: I1205 13:07:53.056032 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8eb8d4a-1ed8-445b-a455-e79b6659317f-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:53.056154 master-0 kubenswrapper[29936]: I1205 13:07:53.056089 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwdhc\" (UniqueName: \"kubernetes.io/projected/a8eb8d4a-1ed8-445b-a455-e79b6659317f-kube-api-access-jwdhc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:53.056154 master-0 kubenswrapper[29936]: I1205 13:07:53.056103 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d934457f-86b7-4ccf-b0da-8625268d2a56-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:53.079695 master-0 kubenswrapper[29936]: I1205 13:07:53.079613 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d934457f-86b7-4ccf-b0da-8625268d2a56-kube-api-access-mzd7z" (OuterVolumeSpecName: "kube-api-access-mzd7z") pod "d934457f-86b7-4ccf-b0da-8625268d2a56" (UID: "d934457f-86b7-4ccf-b0da-8625268d2a56"). InnerVolumeSpecName "kube-api-access-mzd7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:53.173081 master-0 kubenswrapper[29936]: I1205 13:07:53.172916 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:53.174690 master-0 kubenswrapper[29936]: I1205 13:07:53.174642 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzd7z\" (UniqueName: \"kubernetes.io/projected/d934457f-86b7-4ccf-b0da-8625268d2a56-kube-api-access-mzd7z\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:53.297393 master-0 kubenswrapper[29936]: I1205 13:07:53.296783 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01d30809-19fa-4217-9b1b-e35e7504316e" path="/var/lib/kubelet/pods/01d30809-19fa-4217-9b1b-e35e7504316e/volumes" Dec 05 13:07:53.378529 master-0 kubenswrapper[29936]: I1205 13:07:53.378465 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:07:53.378921 master-0 kubenswrapper[29936]: I1205 13:07:53.378870 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b46d8-default-internal-api-0" podUID="5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" containerName="glance-log" containerID="cri-o://91537211166e171704ba5738cc011df13401c26aa877dfe384da4a0d9bc16e8f" gracePeriod=30 Dec 05 13:07:53.397558 master-0 kubenswrapper[29936]: I1205 13:07:53.391137 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-b46d8-default-internal-api-0" podUID="5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" containerName="glance-httpd" containerID="cri-o://583c77d7821dfe955594ede88ab11361e9b1c678ebf2612190f5915df17f6217" gracePeriod=30 Dec 05 13:07:53.449226 master-0 kubenswrapper[29936]: I1205 13:07:53.437781 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Dec 05 13:07:53.542412 master-0 kubenswrapper[29936]: I1205 13:07:53.542342 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-cw997" Dec 05 13:07:53.627222 master-0 kubenswrapper[29936]: I1205 13:07:53.627139 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03ed197d-e0d9-4970-8d1e-f79fa7c70697-operator-scripts\") pod \"03ed197d-e0d9-4970-8d1e-f79fa7c70697\" (UID: \"03ed197d-e0d9-4970-8d1e-f79fa7c70697\") " Dec 05 13:07:53.627582 master-0 kubenswrapper[29936]: I1205 13:07:53.627550 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jghht\" (UniqueName: \"kubernetes.io/projected/03ed197d-e0d9-4970-8d1e-f79fa7c70697-kube-api-access-jghht\") pod \"03ed197d-e0d9-4970-8d1e-f79fa7c70697\" (UID: \"03ed197d-e0d9-4970-8d1e-f79fa7c70697\") " Dec 05 13:07:53.629071 master-0 kubenswrapper[29936]: I1205 13:07:53.628999 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03ed197d-e0d9-4970-8d1e-f79fa7c70697-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03ed197d-e0d9-4970-8d1e-f79fa7c70697" (UID: "03ed197d-e0d9-4970-8d1e-f79fa7c70697"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:53.632951 master-0 kubenswrapper[29936]: I1205 13:07:53.632758 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03ed197d-e0d9-4970-8d1e-f79fa7c70697-kube-api-access-jghht" (OuterVolumeSpecName: "kube-api-access-jghht") pod "03ed197d-e0d9-4970-8d1e-f79fa7c70697" (UID: "03ed197d-e0d9-4970-8d1e-f79fa7c70697"). InnerVolumeSpecName "kube-api-access-jghht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:53.731601 master-0 kubenswrapper[29936]: I1205 13:07:53.731512 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jghht\" (UniqueName: \"kubernetes.io/projected/03ed197d-e0d9-4970-8d1e-f79fa7c70697-kube-api-access-jghht\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:53.731601 master-0 kubenswrapper[29936]: I1205 13:07:53.731572 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03ed197d-e0d9-4970-8d1e-f79fa7c70697-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:53.744316 master-0 kubenswrapper[29936]: I1205 13:07:53.743511 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b154-account-create-update-9nrbh" Dec 05 13:07:53.746834 master-0 kubenswrapper[29936]: I1205 13:07:53.745920 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-b46d8-backup-0" Dec 05 13:07:53.839071 master-0 kubenswrapper[29936]: I1205 13:07:53.838979 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dhzr\" (UniqueName: \"kubernetes.io/projected/60288a03-cb34-45c0-a727-0d822e01d9e8-kube-api-access-5dhzr\") pod \"60288a03-cb34-45c0-a727-0d822e01d9e8\" (UID: \"60288a03-cb34-45c0-a727-0d822e01d9e8\") " Dec 05 13:07:53.839934 master-0 kubenswrapper[29936]: I1205 13:07:53.839158 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60288a03-cb34-45c0-a727-0d822e01d9e8-operator-scripts\") pod \"60288a03-cb34-45c0-a727-0d822e01d9e8\" (UID: \"60288a03-cb34-45c0-a727-0d822e01d9e8\") " Dec 05 13:07:53.841525 master-0 kubenswrapper[29936]: I1205 13:07:53.841357 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60288a03-cb34-45c0-a727-0d822e01d9e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60288a03-cb34-45c0-a727-0d822e01d9e8" (UID: "60288a03-cb34-45c0-a727-0d822e01d9e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:53.880506 master-0 kubenswrapper[29936]: I1205 13:07:53.879493 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60288a03-cb34-45c0-a727-0d822e01d9e8-kube-api-access-5dhzr" (OuterVolumeSpecName: "kube-api-access-5dhzr") pod "60288a03-cb34-45c0-a727-0d822e01d9e8" (UID: "60288a03-cb34-45c0-a727-0d822e01d9e8"). InnerVolumeSpecName "kube-api-access-5dhzr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:53.914352 master-0 kubenswrapper[29936]: I1205 13:07:53.914264 29936 generic.go:334] "Generic (PLEG): container finished" podID="5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" containerID="91537211166e171704ba5738cc011df13401c26aa877dfe384da4a0d9bc16e8f" exitCode=143 Dec 05 13:07:53.915088 master-0 kubenswrapper[29936]: I1205 13:07:53.915037 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-internal-api-0" event={"ID":"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3","Type":"ContainerDied","Data":"91537211166e171704ba5738cc011df13401c26aa877dfe384da4a0d9bc16e8f"} Dec 05 13:07:53.918303 master-0 kubenswrapper[29936]: I1205 13:07:53.918260 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cw997" event={"ID":"03ed197d-e0d9-4970-8d1e-f79fa7c70697","Type":"ContainerDied","Data":"0afb94c085487213e666e6188423baf3bcebcd3bac5826b68d461d8820363951"} Dec 05 13:07:53.918303 master-0 kubenswrapper[29936]: I1205 13:07:53.918294 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0afb94c085487213e666e6188423baf3bcebcd3bac5826b68d461d8820363951" Dec 05 13:07:53.918448 master-0 kubenswrapper[29936]: I1205 13:07:53.918365 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-cw997" Dec 05 13:07:53.945937 master-0 kubenswrapper[29936]: I1205 13:07:53.945878 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dhzr\" (UniqueName: \"kubernetes.io/projected/60288a03-cb34-45c0-a727-0d822e01d9e8-kube-api-access-5dhzr\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:53.946253 master-0 kubenswrapper[29936]: I1205 13:07:53.946240 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60288a03-cb34-45c0-a727-0d822e01d9e8-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:53.952366 master-0 kubenswrapper[29936]: I1205 13:07:53.950496 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"af1fc46b-7167-4446-9587-7ef591b4e661","Type":"ContainerStarted","Data":"a6cf5c98225e6961897e00d15d33f000760a20a6452cb2a3296b537d4759ab56"} Dec 05 13:07:53.980857 master-0 kubenswrapper[29936]: I1205 13:07:53.980785 29936 generic.go:334] "Generic (PLEG): container finished" podID="df7431e9-8625-453c-82d8-af5e79106c65" containerID="3e174302d928304084899d6919b27c77b1c1e2c681d314e39c76d2834a936eb0" exitCode=0 Dec 05 13:07:53.981145 master-0 kubenswrapper[29936]: I1205 13:07:53.980896 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vt8w4" event={"ID":"df7431e9-8625-453c-82d8-af5e79106c65","Type":"ContainerDied","Data":"3e174302d928304084899d6919b27c77b1c1e2c681d314e39c76d2834a936eb0"} Dec 05 13:07:53.988584 master-0 kubenswrapper[29936]: I1205 13:07:53.988539 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fkxvl" Dec 05 13:07:53.988584 master-0 kubenswrapper[29936]: I1205 13:07:53.988553 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-dc7b-account-create-update-6mmc6" Dec 05 13:07:53.988747 master-0 kubenswrapper[29936]: I1205 13:07:53.988577 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b154-account-create-update-9nrbh" event={"ID":"60288a03-cb34-45c0-a727-0d822e01d9e8","Type":"ContainerDied","Data":"4000bbb7789065388fe3e20fce0cc5e280ffb8873a06a294c22865368909273b"} Dec 05 13:07:53.988747 master-0 kubenswrapper[29936]: I1205 13:07:53.988624 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4000bbb7789065388fe3e20fce0cc5e280ffb8873a06a294c22865368909273b" Dec 05 13:07:53.988747 master-0 kubenswrapper[29936]: I1205 13:07:53.988648 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b154-account-create-update-9nrbh" Dec 05 13:07:54.481799 master-0 kubenswrapper[29936]: E1205 13:07:54.475668 29936 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60288a03_cb34_45c0_a727_0d822e01d9e8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60288a03_cb34_45c0_a727_0d822e01d9e8.slice/crio-4000bbb7789065388fe3e20fce0cc5e280ffb8873a06a294c22865368909273b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf1fc46b_7167_4446_9587_7ef591b4e661.slice/crio-conmon-b2948d3af31040a5f9b39bd451a29b5002a8a7877921e10759c0b27acec542e0.scope\": RecentStats: unable to find data in memory cache]" Dec 05 13:07:54.562836 master-0 kubenswrapper[29936]: I1205 13:07:54.562743 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0961-account-create-update-krxs2" Dec 05 13:07:54.683608 master-0 kubenswrapper[29936]: I1205 13:07:54.676268 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xr9f\" (UniqueName: \"kubernetes.io/projected/060bcbc7-c502-431d-8a7d-1f566a91f953-kube-api-access-2xr9f\") pod \"060bcbc7-c502-431d-8a7d-1f566a91f953\" (UID: \"060bcbc7-c502-431d-8a7d-1f566a91f953\") " Dec 05 13:07:54.683608 master-0 kubenswrapper[29936]: I1205 13:07:54.676671 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/060bcbc7-c502-431d-8a7d-1f566a91f953-operator-scripts\") pod \"060bcbc7-c502-431d-8a7d-1f566a91f953\" (UID: \"060bcbc7-c502-431d-8a7d-1f566a91f953\") " Dec 05 13:07:54.683608 master-0 kubenswrapper[29936]: I1205 13:07:54.677729 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/060bcbc7-c502-431d-8a7d-1f566a91f953-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "060bcbc7-c502-431d-8a7d-1f566a91f953" (UID: "060bcbc7-c502-431d-8a7d-1f566a91f953"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:54.683608 master-0 kubenswrapper[29936]: I1205 13:07:54.680413 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/060bcbc7-c502-431d-8a7d-1f566a91f953-kube-api-access-2xr9f" (OuterVolumeSpecName: "kube-api-access-2xr9f") pod "060bcbc7-c502-431d-8a7d-1f566a91f953" (UID: "060bcbc7-c502-431d-8a7d-1f566a91f953"). 
InnerVolumeSpecName "kube-api-access-2xr9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:54.780050 master-0 kubenswrapper[29936]: I1205 13:07:54.779926 29936 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/060bcbc7-c502-431d-8a7d-1f566a91f953-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:54.780050 master-0 kubenswrapper[29936]: I1205 13:07:54.780020 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xr9f\" (UniqueName: \"kubernetes.io/projected/060bcbc7-c502-431d-8a7d-1f566a91f953-kube-api-access-2xr9f\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:55.004272 master-0 kubenswrapper[29936]: I1205 13:07:55.004186 29936 generic.go:334] "Generic (PLEG): container finished" podID="af1fc46b-7167-4446-9587-7ef591b4e661" containerID="b2948d3af31040a5f9b39bd451a29b5002a8a7877921e10759c0b27acec542e0" exitCode=0 Dec 05 13:07:55.004983 master-0 kubenswrapper[29936]: I1205 13:07:55.004910 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"af1fc46b-7167-4446-9587-7ef591b4e661","Type":"ContainerDied","Data":"b2948d3af31040a5f9b39bd451a29b5002a8a7877921e10759c0b27acec542e0"} Dec 05 13:07:55.011877 master-0 kubenswrapper[29936]: I1205 13:07:55.011809 29936 generic.go:334] "Generic (PLEG): container finished" podID="cc2221d6-014e-4bd4-962b-24512ebf84e8" containerID="b8dd7e13c98ceb50d8e28003045fed8a9665f1673eae8fbe1b1c32c4e32f07b5" exitCode=0 Dec 05 13:07:55.011947 master-0 kubenswrapper[29936]: I1205 13:07:55.011887 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"cc2221d6-014e-4bd4-962b-24512ebf84e8","Type":"ContainerDied","Data":"b8dd7e13c98ceb50d8e28003045fed8a9665f1673eae8fbe1b1c32c4e32f07b5"} Dec 05 13:07:55.017395 master-0 kubenswrapper[29936]: I1205 13:07:55.017312 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gbdmx" event={"ID":"04779889-6879-4759-8928-4d0b0ed2eae2","Type":"ContainerStarted","Data":"f827acfbf32c26bca61492c0f9bf9a4be131282cd1ba603aff56047a3be9bb4f"} Dec 05 13:07:55.019819 master-0 kubenswrapper[29936]: I1205 13:07:55.019768 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0961-account-create-update-krxs2" event={"ID":"060bcbc7-c502-431d-8a7d-1f566a91f953","Type":"ContainerDied","Data":"759a5df3f1ef7ebaad2c8ac2751ad2ed0b13b69e11d40aa455756bf61e509db7"} Dec 05 13:07:55.019871 master-0 kubenswrapper[29936]: I1205 13:07:55.019820 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="759a5df3f1ef7ebaad2c8ac2751ad2ed0b13b69e11d40aa455756bf61e509db7" Dec 05 13:07:55.019910 master-0 kubenswrapper[29936]: I1205 13:07:55.019883 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0961-account-create-update-krxs2" Dec 05 13:07:56.044903 master-0 kubenswrapper[29936]: I1205 13:07:56.044809 29936 generic.go:334] "Generic (PLEG): container finished" podID="04779889-6879-4759-8928-4d0b0ed2eae2" containerID="f827acfbf32c26bca61492c0f9bf9a4be131282cd1ba603aff56047a3be9bb4f" exitCode=0 Dec 05 13:07:56.045646 master-0 kubenswrapper[29936]: I1205 13:07:56.044986 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gbdmx" event={"ID":"04779889-6879-4759-8928-4d0b0ed2eae2","Type":"ContainerDied","Data":"f827acfbf32c26bca61492c0f9bf9a4be131282cd1ba603aff56047a3be9bb4f"} Dec 05 13:07:56.052309 master-0 kubenswrapper[29936]: I1205 13:07:56.052272 29936 generic.go:334] "Generic (PLEG): container finished" podID="8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" containerID="3d7fd03fb9af5943d7ca0daceaacc49e1f67c092ce546e5fe5caa7835b1d52b4" exitCode=0 Dec 05 13:07:56.052391 master-0 kubenswrapper[29936]: I1205 13:07:56.052311 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-external-api-0" event={"ID":"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f","Type":"ContainerDied","Data":"3d7fd03fb9af5943d7ca0daceaacc49e1f67c092ce546e5fe5caa7835b1d52b4"} Dec 05 13:07:57.076696 master-0 kubenswrapper[29936]: I1205 13:07:57.076616 29936 generic.go:334] "Generic (PLEG): container finished" podID="5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" containerID="583c77d7821dfe955594ede88ab11361e9b1c678ebf2612190f5915df17f6217" exitCode=0 Dec 05 13:07:57.077305 master-0 kubenswrapper[29936]: I1205 13:07:57.076701 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-internal-api-0" event={"ID":"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3","Type":"ContainerDied","Data":"583c77d7821dfe955594ede88ab11361e9b1c678ebf2612190f5915df17f6217"} Dec 05 13:07:57.549482 master-0 kubenswrapper[29936]: I1205 13:07:57.549422 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:57.675431 master-0 kubenswrapper[29936]: I1205 13:07:57.665552 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-public-tls-certs\") pod \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " Dec 05 13:07:57.675431 master-0 kubenswrapper[29936]: I1205 13:07:57.665735 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-combined-ca-bundle\") pod \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " Dec 05 13:07:57.675431 master-0 kubenswrapper[29936]: I1205 13:07:57.665778 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-httpd-run\") pod \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " Dec 05 13:07:57.675431 master-0 kubenswrapper[29936]: I1205 13:07:57.666095 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " Dec 05 13:07:57.675431 master-0 kubenswrapper[29936]: I1205 13:07:57.666191 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-scripts\") pod \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " Dec 05 13:07:57.675431 master-0 kubenswrapper[29936]: I1205 13:07:57.666230 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-logs\") pod \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " Dec 05 13:07:57.675431 master-0 kubenswrapper[29936]: I1205 13:07:57.666315 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fw2z\" (UniqueName: \"kubernetes.io/projected/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-kube-api-access-5fw2z\") pod \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " Dec 05 13:07:57.675431 master-0 kubenswrapper[29936]: I1205 13:07:57.666578 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-config-data\") pod \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\" (UID: \"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f\") " Dec 05 13:07:57.677644 master-0 kubenswrapper[29936]: I1205 13:07:57.677160 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" (UID: "8fce82a4-d12a-4773-bcc6-37cfc2a46b3f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:07:57.678343 master-0 kubenswrapper[29936]: I1205 13:07:57.678249 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-logs" (OuterVolumeSpecName: "logs") pod "8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" (UID: "8fce82a4-d12a-4773-bcc6-37cfc2a46b3f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:07:57.682440 master-0 kubenswrapper[29936]: I1205 13:07:57.682369 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-kube-api-access-5fw2z" (OuterVolumeSpecName: "kube-api-access-5fw2z") pod "8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" (UID: "8fce82a4-d12a-4773-bcc6-37cfc2a46b3f"). InnerVolumeSpecName "kube-api-access-5fw2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:57.685056 master-0 kubenswrapper[29936]: I1205 13:07:57.683319 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-scripts" (OuterVolumeSpecName: "scripts") pod "8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" (UID: "8fce82a4-d12a-4773-bcc6-37cfc2a46b3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:57.700183 master-0 kubenswrapper[29936]: I1205 13:07:57.700109 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c" (OuterVolumeSpecName: "glance") pod "8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" (UID: "8fce82a4-d12a-4773-bcc6-37cfc2a46b3f"). InnerVolumeSpecName "pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 13:07:57.726403 master-0 kubenswrapper[29936]: I1205 13:07:57.725785 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" (UID: "8fce82a4-d12a-4773-bcc6-37cfc2a46b3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:57.741606 master-0 kubenswrapper[29936]: I1205 13:07:57.741526 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-config-data" (OuterVolumeSpecName: "config-data") pod "8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" (UID: "8fce82a4-d12a-4773-bcc6-37cfc2a46b3f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:57.770840 master-0 kubenswrapper[29936]: I1205 13:07:57.770530 29936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:57.770840 master-0 kubenswrapper[29936]: I1205 13:07:57.770607 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:57.770840 master-0 kubenswrapper[29936]: I1205 13:07:57.770621 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fw2z\" (UniqueName: \"kubernetes.io/projected/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-kube-api-access-5fw2z\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:57.770840 master-0 kubenswrapper[29936]: I1205 13:07:57.770642 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:57.770840 master-0 kubenswrapper[29936]: I1205 13:07:57.770654 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:57.770840 master-0 kubenswrapper[29936]: I1205 13:07:57.770663 29936 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-httpd-run\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:57.770840 master-0 kubenswrapper[29936]: I1205 13:07:57.770719 29936 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") on node \"master-0\" " Dec 05 13:07:57.798617 master-0 kubenswrapper[29936]: I1205 13:07:57.788507 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" (UID: "8fce82a4-d12a-4773-bcc6-37cfc2a46b3f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:57.845985 master-0 kubenswrapper[29936]: I1205 13:07:57.845929 29936 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 05 13:07:57.846279 master-0 kubenswrapper[29936]: I1205 13:07:57.846237 29936 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe" (UniqueName: "kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c") on node "master-0" Dec 05 13:07:57.879258 master-0 kubenswrapper[29936]: I1205 13:07:57.879196 29936 reconciler_common.go:293] "Volume detached for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:57.880577 master-0 kubenswrapper[29936]: I1205 13:07:57.880500 29936 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:57.904653 master-0 kubenswrapper[29936]: I1205 13:07:57.904162 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:57.982109 master-0 kubenswrapper[29936]: I1205 13:07:57.982048 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") pod \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " Dec 05 13:07:57.982109 master-0 kubenswrapper[29936]: I1205 13:07:57.982114 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwvk2\" (UniqueName: \"kubernetes.io/projected/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-kube-api-access-bwvk2\") pod \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " Dec 05 13:07:57.982404 master-0 kubenswrapper[29936]: I1205 13:07:57.982334 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-combined-ca-bundle\") pod \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " Dec 05 13:07:57.982454 master-0 kubenswrapper[29936]: I1205 13:07:57.982412 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-logs\") pod \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " Dec 05 13:07:57.982454 master-0 kubenswrapper[29936]: I1205 13:07:57.982431 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-config-data\") pod \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " Dec 05 13:07:57.982596 master-0 kubenswrapper[29936]: I1205 13:07:57.982574 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-internal-tls-certs\") pod \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " Dec 05 13:07:57.982697 master-0 kubenswrapper[29936]: I1205 13:07:57.982669 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-scripts\") pod 
\"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " Dec 05 13:07:57.982814 master-0 kubenswrapper[29936]: I1205 13:07:57.982792 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-httpd-run\") pod \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\" (UID: \"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3\") " Dec 05 13:07:57.991972 master-0 kubenswrapper[29936]: I1205 13:07:57.985824 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-logs" (OuterVolumeSpecName: "logs") pod "5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" (UID: "5ef1ef52-6a31-4620-bda8-d76e5eaa97f3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:07:57.991972 master-0 kubenswrapper[29936]: I1205 13:07:57.990372 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" (UID: "5ef1ef52-6a31-4620-bda8-d76e5eaa97f3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:07:57.992815 master-0 kubenswrapper[29936]: I1205 13:07:57.992738 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-kube-api-access-bwvk2" (OuterVolumeSpecName: "kube-api-access-bwvk2") pod "5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" (UID: "5ef1ef52-6a31-4620-bda8-d76e5eaa97f3"). InnerVolumeSpecName "kube-api-access-bwvk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:57.993570 master-0 kubenswrapper[29936]: I1205 13:07:57.993508 29936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:57.993570 master-0 kubenswrapper[29936]: I1205 13:07:57.993569 29936 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-httpd-run\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:57.993570 master-0 kubenswrapper[29936]: I1205 13:07:57.993584 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwvk2\" (UniqueName: \"kubernetes.io/projected/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-kube-api-access-bwvk2\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:57.998122 master-0 kubenswrapper[29936]: I1205 13:07:57.998039 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-scripts" (OuterVolumeSpecName: "scripts") pod "5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" (UID: "5ef1ef52-6a31-4620-bda8-d76e5eaa97f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:58.009799 master-0 kubenswrapper[29936]: I1205 13:07:58.009117 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f" (OuterVolumeSpecName: "glance") pod "5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" (UID: "5ef1ef52-6a31-4620-bda8-d76e5eaa97f3"). InnerVolumeSpecName "pvc-0d3529a7-2405-43a7-8986-74c66fb23772". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 13:07:58.080235 master-0 kubenswrapper[29936]: I1205 13:07:58.080131 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" (UID: "5ef1ef52-6a31-4620-bda8-d76e5eaa97f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:58.087065 master-0 kubenswrapper[29936]: I1205 13:07:58.087005 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qjcct"] Dec 05 13:07:58.088336 master-0 kubenswrapper[29936]: E1205 13:07:58.088105 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ed197d-e0d9-4970-8d1e-f79fa7c70697" containerName="mariadb-database-create" Dec 05 13:07:58.088336 master-0 kubenswrapper[29936]: I1205 13:07:58.088337 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ed197d-e0d9-4970-8d1e-f79fa7c70697" containerName="mariadb-database-create" Dec 05 13:07:58.088444 master-0 kubenswrapper[29936]: E1205 13:07:58.088409 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8eb8d4a-1ed8-445b-a455-e79b6659317f" containerName="mariadb-account-create-update" Dec 05 13:07:58.088444 master-0 kubenswrapper[29936]: I1205 13:07:58.088418 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8eb8d4a-1ed8-445b-a455-e79b6659317f" containerName="mariadb-account-create-update" Dec 05 13:07:58.088519 master-0 kubenswrapper[29936]: E1205 13:07:58.088453 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" containerName="glance-log" Dec 05 13:07:58.088519 master-0 kubenswrapper[29936]: I1205 13:07:58.088462 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" containerName="glance-log" Dec 05 13:07:58.088519 master-0 kubenswrapper[29936]: E1205 13:07:58.088476 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" containerName="glance-httpd" Dec 05 13:07:58.088519 master-0 kubenswrapper[29936]: I1205 13:07:58.088484 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" containerName="glance-httpd" Dec 05 13:07:58.088519 master-0 kubenswrapper[29936]: E1205 13:07:58.088502 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" containerName="glance-httpd" Dec 05 13:07:58.088519 master-0 kubenswrapper[29936]: I1205 13:07:58.088513 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" containerName="glance-httpd" Dec 05 13:07:58.088702 master-0 kubenswrapper[29936]: E1205 13:07:58.088537 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" containerName="glance-log" Dec 05 13:07:58.088702 master-0 kubenswrapper[29936]: I1205 13:07:58.088546 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" containerName="glance-log" Dec 05 13:07:58.088702 master-0 kubenswrapper[29936]: E1205 13:07:58.088565 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a056507-477d-4d77-9905-c7c6344e92ec" containerName="mariadb-database-create" Dec 05 13:07:58.088702 master-0 kubenswrapper[29936]: I1205 
13:07:58.088573 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a056507-477d-4d77-9905-c7c6344e92ec" containerName="mariadb-database-create" Dec 05 13:07:58.088702 master-0 kubenswrapper[29936]: E1205 13:07:58.088587 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d934457f-86b7-4ccf-b0da-8625268d2a56" containerName="mariadb-database-create" Dec 05 13:07:58.088702 master-0 kubenswrapper[29936]: I1205 13:07:58.088598 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="d934457f-86b7-4ccf-b0da-8625268d2a56" containerName="mariadb-database-create" Dec 05 13:07:58.088702 master-0 kubenswrapper[29936]: E1205 13:07:58.088615 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60288a03-cb34-45c0-a727-0d822e01d9e8" containerName="mariadb-account-create-update" Dec 05 13:07:58.088702 master-0 kubenswrapper[29936]: I1205 13:07:58.088622 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="60288a03-cb34-45c0-a727-0d822e01d9e8" containerName="mariadb-account-create-update" Dec 05 13:07:58.088702 master-0 kubenswrapper[29936]: E1205 13:07:58.088635 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="060bcbc7-c502-431d-8a7d-1f566a91f953" containerName="mariadb-account-create-update" Dec 05 13:07:58.088702 master-0 kubenswrapper[29936]: I1205 13:07:58.088643 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="060bcbc7-c502-431d-8a7d-1f566a91f953" containerName="mariadb-account-create-update" Dec 05 13:07:58.091850 master-0 kubenswrapper[29936]: I1205 13:07:58.091795 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8eb8d4a-1ed8-445b-a455-e79b6659317f" containerName="mariadb-account-create-update" Dec 05 13:07:58.091850 master-0 kubenswrapper[29936]: I1205 13:07:58.091841 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="03ed197d-e0d9-4970-8d1e-f79fa7c70697" containerName="mariadb-database-create" Dec 05 13:07:58.091850 master-0 kubenswrapper[29936]: I1205 13:07:58.091859 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" containerName="glance-log" Dec 05 13:07:58.092087 master-0 kubenswrapper[29936]: I1205 13:07:58.091869 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" containerName="glance-httpd" Dec 05 13:07:58.092087 master-0 kubenswrapper[29936]: I1205 13:07:58.091891 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" containerName="glance-log" Dec 05 13:07:58.092087 master-0 kubenswrapper[29936]: I1205 13:07:58.091905 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="60288a03-cb34-45c0-a727-0d822e01d9e8" containerName="mariadb-account-create-update" Dec 05 13:07:58.092087 master-0 kubenswrapper[29936]: I1205 13:07:58.091919 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="d934457f-86b7-4ccf-b0da-8625268d2a56" containerName="mariadb-database-create" Dec 05 13:07:58.092087 master-0 kubenswrapper[29936]: I1205 13:07:58.091936 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a056507-477d-4d77-9905-c7c6344e92ec" containerName="mariadb-database-create" Dec 05 13:07:58.092087 master-0 kubenswrapper[29936]: I1205 13:07:58.091948 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" containerName="glance-httpd" Dec 05 13:07:58.092087 master-0 kubenswrapper[29936]: I1205 
13:07:58.091971 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="060bcbc7-c502-431d-8a7d-1f566a91f953" containerName="mariadb-account-create-update" Dec 05 13:07:58.093149 master-0 kubenswrapper[29936]: I1205 13:07:58.093122 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qjcct" Dec 05 13:07:58.102656 master-0 kubenswrapper[29936]: I1205 13:07:58.096976 29936 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") on node \"master-0\" " Dec 05 13:07:58.102656 master-0 kubenswrapper[29936]: I1205 13:07:58.097029 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:58.102656 master-0 kubenswrapper[29936]: I1205 13:07:58.097040 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:58.104104 master-0 kubenswrapper[29936]: I1205 13:07:58.104016 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 13:07:58.104994 master-0 kubenswrapper[29936]: I1205 13:07:58.104905 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 05 13:07:58.145102 master-0 kubenswrapper[29936]: I1205 13:07:58.145046 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qjcct"] Dec 05 13:07:58.171281 master-0 kubenswrapper[29936]: I1205 13:07:58.168750 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-config-data" (OuterVolumeSpecName: "config-data") pod "5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" (UID: "5ef1ef52-6a31-4620-bda8-d76e5eaa97f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:58.186072 master-0 kubenswrapper[29936]: I1205 13:07:58.185913 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-external-api-0" event={"ID":"8fce82a4-d12a-4773-bcc6-37cfc2a46b3f","Type":"ContainerDied","Data":"c385119d00ef28c6515cf4da5d767b06553bd05d74e71885802d9f10e26714e3"} Dec 05 13:07:58.186072 master-0 kubenswrapper[29936]: I1205 13:07:58.185993 29936 scope.go:117] "RemoveContainer" containerID="3d7fd03fb9af5943d7ca0daceaacc49e1f67c092ce546e5fe5caa7835b1d52b4" Dec 05 13:07:58.186483 master-0 kubenswrapper[29936]: I1205 13:07:58.186123 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.186619 master-0 kubenswrapper[29936]: I1205 13:07:58.186587 29936 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
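The entries above all follow the same journald + klog shape: a journal timestamp, the host, the `kubenswrapper[PID]` tag, then a klog header (severity, time, PID, `file.go:line]`) and a structured message. A minimal, illustrative parser for lines of exactly this shape is sketched below; the regular expression and field names are inferred from this capture only and are not a kubelet or journald API.

```go
// Illustrative only: pull the journal timestamp, host, PID, klog severity,
// source location and message out of a kubenswrapper line like the ones above.
// The layout is inferred from this log capture, not from any documented format.
package main

import (
	"fmt"
	"regexp"
)

var kubeletLine = regexp.MustCompile(
	`^(\w{3} \d{2} [\d:.]+) (\S+) kubenswrapper\[(\d+)\]: ([IWE])\d{4} [\d:.]+\s+\d+ (\S+\.go:\d+)\] (.*)$`)

func main() {
	line := `Dec 05 13:07:58.093149 master-0 kubenswrapper[29936]: I1205 13:07:58.093122 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qjcct"`
	m := kubeletLine.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("no match")
		return
	}
	fmt.Printf("time=%s host=%s pid=%s sev=%s src=%s\nmsg=%s\n",
		m[1], m[2], m[3], m[4], m[5], m[6])
}
```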
Dec 05 13:07:58.187020 master-0 kubenswrapper[29936]: I1205 13:07:58.186976 29936 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0d3529a7-2405-43a7-8986-74c66fb23772" (UniqueName: "kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f") on node "master-0" Dec 05 13:07:58.194873 master-0 kubenswrapper[29936]: I1205 13:07:58.194817 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vt8w4" event={"ID":"df7431e9-8625-453c-82d8-af5e79106c65","Type":"ContainerStarted","Data":"fbfd9884d3057456a6e65f13c28bdcbca9243bbfccaeeb7d0a09ca3928cfb03c"} Dec 05 13:07:58.199459 master-0 kubenswrapper[29936]: I1205 13:07:58.199416 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" (UID: "5ef1ef52-6a31-4620-bda8-d76e5eaa97f3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:07:58.201658 master-0 kubenswrapper[29936]: I1205 13:07:58.201613 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-scripts\") pod \"nova-cell0-conductor-db-sync-qjcct\" (UID: \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\") " pod="openstack/nova-cell0-conductor-db-sync-qjcct" Dec 05 13:07:58.201741 master-0 kubenswrapper[29936]: I1205 13:07:58.201718 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8lfp\" (UniqueName: \"kubernetes.io/projected/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-kube-api-access-p8lfp\") pod \"nova-cell0-conductor-db-sync-qjcct\" (UID: \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\") " pod="openstack/nova-cell0-conductor-db-sync-qjcct" Dec 05 13:07:58.208461 master-0 kubenswrapper[29936]: I1205 13:07:58.207963 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gbdmx" event={"ID":"04779889-6879-4759-8928-4d0b0ed2eae2","Type":"ContainerStarted","Data":"7cb6964247abf6862572dbe8c94b91a535418fa5d4c3ca1017aafe535e03176f"} Dec 05 13:07:58.209184 master-0 kubenswrapper[29936]: I1205 13:07:58.209146 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qjcct\" (UID: \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\") " pod="openstack/nova-cell0-conductor-db-sync-qjcct" Dec 05 13:07:58.216813 master-0 kubenswrapper[29936]: I1205 13:07:58.216732 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-config-data\") pod \"nova-cell0-conductor-db-sync-qjcct\" (UID: \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\") " pod="openstack/nova-cell0-conductor-db-sync-qjcct" Dec 05 13:07:58.217209 master-0 kubenswrapper[29936]: I1205 13:07:58.217164 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:58.237461 master-0 kubenswrapper[29936]: I1205 13:07:58.237401 29936 reconciler_common.go:293] "Volume detached 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:58.237461 master-0 kubenswrapper[29936]: I1205 13:07:58.237461 29936 reconciler_common.go:293] "Volume detached for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:58.238114 master-0 kubenswrapper[29936]: I1205 13:07:58.238059 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vt8w4" podStartSLOduration=4.160516322 podStartE2EDuration="9.23804204s" podCreationTimestamp="2025-12-05 13:07:49 +0000 UTC" firstStartedPulling="2025-12-05 13:07:51.849360598 +0000 UTC m=+1068.981440279" lastFinishedPulling="2025-12-05 13:07:56.926886316 +0000 UTC m=+1074.058965997" observedRunningTime="2025-12-05 13:07:58.22448566 +0000 UTC m=+1075.356565351" watchObservedRunningTime="2025-12-05 13:07:58.23804204 +0000 UTC m=+1075.370121721" Dec 05 13:07:58.267466 master-0 kubenswrapper[29936]: I1205 13:07:58.265163 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-internal-api-0" event={"ID":"5ef1ef52-6a31-4620-bda8-d76e5eaa97f3","Type":"ContainerDied","Data":"971f893e7f84c3e087cd861b6d7f2c0ffab437d179d0b089f41cb191a87ecc89"} Dec 05 13:07:58.267466 master-0 kubenswrapper[29936]: I1205 13:07:58.265400 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:58.303591 master-0 kubenswrapper[29936]: I1205 13:07:58.303517 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:07:58.335849 master-0 kubenswrapper[29936]: I1205 13:07:58.335753 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:07:58.343249 master-0 kubenswrapper[29936]: I1205 13:07:58.343161 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-config-data\") pod \"nova-cell0-conductor-db-sync-qjcct\" (UID: \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\") " pod="openstack/nova-cell0-conductor-db-sync-qjcct" Dec 05 13:07:58.343502 master-0 kubenswrapper[29936]: I1205 13:07:58.343313 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-scripts\") pod \"nova-cell0-conductor-db-sync-qjcct\" (UID: \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\") " pod="openstack/nova-cell0-conductor-db-sync-qjcct" Dec 05 13:07:58.343591 master-0 kubenswrapper[29936]: I1205 13:07:58.343566 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8lfp\" (UniqueName: \"kubernetes.io/projected/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-kube-api-access-p8lfp\") pod \"nova-cell0-conductor-db-sync-qjcct\" (UID: \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\") " pod="openstack/nova-cell0-conductor-db-sync-qjcct" Dec 05 13:07:58.343951 master-0 kubenswrapper[29936]: I1205 13:07:58.343929 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-combined-ca-bundle\") pod 
\"nova-cell0-conductor-db-sync-qjcct\" (UID: \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\") " pod="openstack/nova-cell0-conductor-db-sync-qjcct" Dec 05 13:07:58.352131 master-0 kubenswrapper[29936]: I1205 13:07:58.348731 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qjcct\" (UID: \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\") " pod="openstack/nova-cell0-conductor-db-sync-qjcct" Dec 05 13:07:58.354867 master-0 kubenswrapper[29936]: I1205 13:07:58.354815 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-config-data\") pod \"nova-cell0-conductor-db-sync-qjcct\" (UID: \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\") " pod="openstack/nova-cell0-conductor-db-sync-qjcct" Dec 05 13:07:58.365900 master-0 kubenswrapper[29936]: I1205 13:07:58.365695 29936 scope.go:117] "RemoveContainer" containerID="32763bbc7419a7e1ea06531624d0b3477a23d9b540fc3fbe12ca7af9e3c4701b" Dec 05 13:07:58.376284 master-0 kubenswrapper[29936]: I1205 13:07:58.372571 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:07:58.376284 master-0 kubenswrapper[29936]: I1205 13:07:58.372720 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-scripts\") pod \"nova-cell0-conductor-db-sync-qjcct\" (UID: \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\") " pod="openstack/nova-cell0-conductor-db-sync-qjcct" Dec 05 13:07:58.377829 master-0 kubenswrapper[29936]: I1205 13:07:58.377775 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8lfp\" (UniqueName: \"kubernetes.io/projected/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-kube-api-access-p8lfp\") pod \"nova-cell0-conductor-db-sync-qjcct\" (UID: \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\") " pod="openstack/nova-cell0-conductor-db-sync-qjcct" Dec 05 13:07:58.414535 master-0 kubenswrapper[29936]: I1205 13:07:58.413141 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:07:58.423064 master-0 kubenswrapper[29936]: I1205 13:07:58.422956 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.427326 master-0 kubenswrapper[29936]: I1205 13:07:58.427034 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gbdmx" podStartSLOduration=5.211997779 podStartE2EDuration="9.427001579s" podCreationTimestamp="2025-12-05 13:07:49 +0000 UTC" firstStartedPulling="2025-12-05 13:07:52.890426547 +0000 UTC m=+1070.022506228" lastFinishedPulling="2025-12-05 13:07:57.105430347 +0000 UTC m=+1074.237510028" observedRunningTime="2025-12-05 13:07:58.296997382 +0000 UTC m=+1075.429077083" watchObservedRunningTime="2025-12-05 13:07:58.427001579 +0000 UTC m=+1075.559081260" Dec 05 13:07:58.443379 master-0 kubenswrapper[29936]: I1205 13:07:58.438558 29936 scope.go:117] "RemoveContainer" containerID="583c77d7821dfe955594ede88ab11361e9b1c678ebf2612190f5915df17f6217" Dec 05 13:07:58.443379 master-0 kubenswrapper[29936]: I1205 13:07:58.439546 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 13:07:58.443379 master-0 kubenswrapper[29936]: I1205 13:07:58.439945 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b46d8-default-external-config-data" Dec 05 13:07:58.443379 master-0 kubenswrapper[29936]: I1205 13:07:58.442374 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 13:07:58.507475 master-0 kubenswrapper[29936]: I1205 13:07:58.501077 29936 scope.go:117] "RemoveContainer" containerID="91537211166e171704ba5738cc011df13401c26aa877dfe384da4a0d9bc16e8f" Dec 05 13:07:58.529142 master-0 kubenswrapper[29936]: I1205 13:07:58.521000 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qjcct" Dec 05 13:07:58.561673 master-0 kubenswrapper[29936]: I1205 13:07:58.560821 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-public-tls-certs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.561673 master-0 kubenswrapper[29936]: I1205 13:07:58.560977 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.561673 master-0 kubenswrapper[29936]: I1205 13:07:58.561060 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-combined-ca-bundle\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.561673 master-0 kubenswrapper[29936]: I1205 13:07:58.561276 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-config-data\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.561673 master-0 kubenswrapper[29936]: I1205 13:07:58.561312 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-httpd-run\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.561673 master-0 kubenswrapper[29936]: I1205 13:07:58.561345 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-logs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.561673 master-0 kubenswrapper[29936]: I1205 13:07:58.561392 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsxx6\" (UniqueName: \"kubernetes.io/projected/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-kube-api-access-tsxx6\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.561673 master-0 kubenswrapper[29936]: I1205 13:07:58.561469 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-scripts\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 
13:07:58.578535 master-0 kubenswrapper[29936]: I1205 13:07:58.578443 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:07:58.612432 master-0 kubenswrapper[29936]: I1205 13:07:58.612358 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:07:58.637904 master-0 kubenswrapper[29936]: I1205 13:07:58.637782 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65b88b76d9-2zcht"] Dec 05 13:07:58.638738 master-0 kubenswrapper[29936]: I1205 13:07:58.638210 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" podUID="d702847e-681f-49cd-8d28-029cae3b4bf5" containerName="dnsmasq-dns" containerID="cri-o://fd92afb878bf74c89dca66d4fb5262c276ca5ba46225700da9eae52d70d2b55f" gracePeriod=10 Dec 05 13:07:58.670081 master-0 kubenswrapper[29936]: I1205 13:07:58.669976 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-logs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.672107 master-0 kubenswrapper[29936]: I1205 13:07:58.670144 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsxx6\" (UniqueName: \"kubernetes.io/projected/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-kube-api-access-tsxx6\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.672107 master-0 kubenswrapper[29936]: I1205 13:07:58.671030 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-logs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.672107 master-0 kubenswrapper[29936]: I1205 13:07:58.671104 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-scripts\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.672107 master-0 kubenswrapper[29936]: I1205 13:07:58.671281 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-public-tls-certs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.672107 master-0 kubenswrapper[29936]: I1205 13:07:58.671562 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.674996 master-0 kubenswrapper[29936]: I1205 13:07:58.673935 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-combined-ca-bundle\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.674996 master-0 kubenswrapper[29936]: I1205 13:07:58.674322 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-config-data\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.674996 master-0 kubenswrapper[29936]: I1205 13:07:58.674361 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-httpd-run\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.674996 master-0 kubenswrapper[29936]: I1205 13:07:58.674912 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-httpd-run\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.680683 master-0 kubenswrapper[29936]: I1205 13:07:58.680607 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-config-data\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.686136 master-0 kubenswrapper[29936]: I1205 13:07:58.683558 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-public-tls-certs\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.686136 master-0 kubenswrapper[29936]: I1205 13:07:58.684288 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-scripts\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.686136 master-0 kubenswrapper[29936]: I1205 13:07:58.684636 29936 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 05 13:07:58.686136 master-0 kubenswrapper[29936]: I1205 13:07:58.684711 29936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/49c247ccb75d82d7aa2d53a927c5c3fb8512a27de6927aa154d2e3366fa1652b/globalmount\"" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.688705 master-0 kubenswrapper[29936]: I1205 13:07:58.688511 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-combined-ca-bundle\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.700776 master-0 kubenswrapper[29936]: I1205 13:07:58.700682 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:07:58.701152 master-0 kubenswrapper[29936]: I1205 13:07:58.701109 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsxx6\" (UniqueName: \"kubernetes.io/projected/7b2eb1a0-8caa-4c08-8008-05620fe8f5fa-kube-api-access-tsxx6\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:58.741726 master-0 kubenswrapper[29936]: I1205 13:07:58.741653 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:07:58.747853 master-0 kubenswrapper[29936]: I1205 13:07:58.747583 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:58.765549 master-0 kubenswrapper[29936]: I1205 13:07:58.762776 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 13:07:58.765549 master-0 kubenswrapper[29936]: I1205 13:07:58.763083 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-b46d8-default-internal-config-data" Dec 05 13:07:58.772983 master-0 kubenswrapper[29936]: I1205 13:07:58.772343 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:07:58.899239 master-0 kubenswrapper[29936]: I1205 13:07:58.895899 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5fa0c51-8a6d-4d82-b035-3612f0acf729-logs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:58.899239 master-0 kubenswrapper[29936]: I1205 13:07:58.895992 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5fa0c51-8a6d-4d82-b035-3612f0acf729-config-data\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:58.899239 master-0 kubenswrapper[29936]: I1205 13:07:58.896050 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5fa0c51-8a6d-4d82-b035-3612f0acf729-scripts\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:58.899239 master-0 kubenswrapper[29936]: I1205 13:07:58.896107 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5fa0c51-8a6d-4d82-b035-3612f0acf729-combined-ca-bundle\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:58.899239 master-0 kubenswrapper[29936]: I1205 13:07:58.896153 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5fa0c51-8a6d-4d82-b035-3612f0acf729-internal-tls-certs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:58.903750 master-0 kubenswrapper[29936]: I1205 13:07:58.903682 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4ft5\" (UniqueName: \"kubernetes.io/projected/d5fa0c51-8a6d-4d82-b035-3612f0acf729-kube-api-access-m4ft5\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:58.903938 master-0 kubenswrapper[29936]: I1205 13:07:58.903894 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") 
pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:58.913395 master-0 kubenswrapper[29936]: I1205 13:07:58.904036 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5fa0c51-8a6d-4d82-b035-3612f0acf729-httpd-run\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.025341 master-0 kubenswrapper[29936]: I1205 13:07:59.006751 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5fa0c51-8a6d-4d82-b035-3612f0acf729-internal-tls-certs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.025341 master-0 kubenswrapper[29936]: I1205 13:07:59.006865 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4ft5\" (UniqueName: \"kubernetes.io/projected/d5fa0c51-8a6d-4d82-b035-3612f0acf729-kube-api-access-m4ft5\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.025341 master-0 kubenswrapper[29936]: I1205 13:07:59.006913 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.025341 master-0 kubenswrapper[29936]: I1205 13:07:59.006961 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5fa0c51-8a6d-4d82-b035-3612f0acf729-httpd-run\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.025341 master-0 kubenswrapper[29936]: I1205 13:07:59.007069 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5fa0c51-8a6d-4d82-b035-3612f0acf729-logs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.025341 master-0 kubenswrapper[29936]: I1205 13:07:59.007099 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5fa0c51-8a6d-4d82-b035-3612f0acf729-config-data\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.025341 master-0 kubenswrapper[29936]: I1205 13:07:59.007150 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5fa0c51-8a6d-4d82-b035-3612f0acf729-scripts\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.025341 master-0 kubenswrapper[29936]: I1205 13:07:59.007233 29936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5fa0c51-8a6d-4d82-b035-3612f0acf729-combined-ca-bundle\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.025341 master-0 kubenswrapper[29936]: I1205 13:07:59.008782 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5fa0c51-8a6d-4d82-b035-3612f0acf729-httpd-run\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.025341 master-0 kubenswrapper[29936]: I1205 13:07:59.018903 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5fa0c51-8a6d-4d82-b035-3612f0acf729-internal-tls-certs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.025341 master-0 kubenswrapper[29936]: I1205 13:07:59.023655 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5fa0c51-8a6d-4d82-b035-3612f0acf729-config-data\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.025341 master-0 kubenswrapper[29936]: I1205 13:07:59.023957 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5fa0c51-8a6d-4d82-b035-3612f0acf729-logs\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.025341 master-0 kubenswrapper[29936]: I1205 13:07:59.024824 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5fa0c51-8a6d-4d82-b035-3612f0acf729-combined-ca-bundle\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.026215 master-0 kubenswrapper[29936]: I1205 13:07:59.025941 29936 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
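The csi_attacher.go messages above ("STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice..." / "Skipping UnmountDevice...") record kubelet skipping the device-level stage/unstage step for the topolvm.io volumes because the driver does not advertise that CSI node capability, and moving straight on to the per-pod SetUp/TearDown. The snippet below is a simplified, stdlib-only model of that decision under those assumptions; the type and function names are illustrative and it is not kubelet or CSI driver source.

```go
// Simplified model (not kubelet source) of the decision logged above by
// csi_attacher.go: if a CSI driver does not advertise the STAGE_UNSTAGE_VOLUME
// node capability, kubelet skips MountDevice/UnmountDevice for its volumes and
// only performs the per-pod publish/unpublish step. Names here are illustrative.
package main

import "fmt"

type nodeCapabilities map[string]bool

// needsStaging reports whether device-level stage/unstage calls should run.
func needsStaging(caps nodeCapabilities) bool {
	return caps["STAGE_UNSTAGE_VOLUME"]
}

func main() {
	// As captured in this log, the topolvm.io driver does not advertise the
	// capability, hence the "Skipping MountDevice..." messages.
	topolvm := nodeCapabilities{}
	if !needsStaging(topolvm) {
		fmt.Println("STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...")
	}
}
```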
Dec 05 13:07:59.026215 master-0 kubenswrapper[29936]: I1205 13:07:59.025966 29936 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/c11f3d51f30df7daf7a2bb71b828158f184983aebcc191306ab2ed71e7a567d1/globalmount\"" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.036394 master-0 kubenswrapper[29936]: I1205 13:07:59.031125 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5fa0c51-8a6d-4d82-b035-3612f0acf729-scripts\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.071628 master-0 kubenswrapper[29936]: I1205 13:07:59.062080 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4ft5\" (UniqueName: \"kubernetes.io/projected/d5fa0c51-8a6d-4d82-b035-3612f0acf729-kube-api-access-m4ft5\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:07:59.275308 master-0 kubenswrapper[29936]: I1205 13:07:59.275180 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ef1ef52-6a31-4620-bda8-d76e5eaa97f3" path="/var/lib/kubelet/pods/5ef1ef52-6a31-4620-bda8-d76e5eaa97f3/volumes" Dec 05 13:07:59.277449 master-0 kubenswrapper[29936]: I1205 13:07:59.277406 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fce82a4-d12a-4773-bcc6-37cfc2a46b3f" path="/var/lib/kubelet/pods/8fce82a4-d12a-4773-bcc6-37cfc2a46b3f/volumes" Dec 05 13:07:59.349019 master-0 kubenswrapper[29936]: I1205 13:07:59.348956 29936 generic.go:334] "Generic (PLEG): container finished" podID="d702847e-681f-49cd-8d28-029cae3b4bf5" containerID="fd92afb878bf74c89dca66d4fb5262c276ca5ba46225700da9eae52d70d2b55f" exitCode=0 Dec 05 13:07:59.358415 master-0 kubenswrapper[29936]: I1205 13:07:59.349609 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" event={"ID":"d702847e-681f-49cd-8d28-029cae3b4bf5","Type":"ContainerDied","Data":"fd92afb878bf74c89dca66d4fb5262c276ca5ba46225700da9eae52d70d2b55f"} Dec 05 13:07:59.387435 master-0 kubenswrapper[29936]: I1205 13:07:59.378902 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qjcct"] Dec 05 13:07:59.396499 master-0 kubenswrapper[29936]: W1205 13:07:59.396402 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e234efb_4fe5_4c70_a992_4bfb95d2fc4c.slice/crio-c5c3c9d99b9cb382b7cfc3707f42652bb49007f2ba63ec5ebb467382d83956b0 WatchSource:0}: Error finding container c5c3c9d99b9cb382b7cfc3707f42652bb49007f2ba63ec5ebb467382d83956b0: Status 404 returned error can't find the container with id c5c3c9d99b9cb382b7cfc3707f42652bb49007f2ba63ec5ebb467382d83956b0 Dec 05 13:07:59.551045 master-0 kubenswrapper[29936]: I1205 13:07:59.550911 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:07:59.568882 master-0 kubenswrapper[29936]: I1205 13:07:59.568792 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mv4r\" (UniqueName: \"kubernetes.io/projected/d702847e-681f-49cd-8d28-029cae3b4bf5-kube-api-access-9mv4r\") pod \"d702847e-681f-49cd-8d28-029cae3b4bf5\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " Dec 05 13:07:59.569585 master-0 kubenswrapper[29936]: I1205 13:07:59.569547 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-ovsdbserver-nb\") pod \"d702847e-681f-49cd-8d28-029cae3b4bf5\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " Dec 05 13:07:59.569650 master-0 kubenswrapper[29936]: I1205 13:07:59.569619 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-config\") pod \"d702847e-681f-49cd-8d28-029cae3b4bf5\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " Dec 05 13:07:59.571473 master-0 kubenswrapper[29936]: I1205 13:07:59.571430 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-dns-svc\") pod \"d702847e-681f-49cd-8d28-029cae3b4bf5\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " Dec 05 13:07:59.571538 master-0 kubenswrapper[29936]: I1205 13:07:59.571502 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-dns-swift-storage-0\") pod \"d702847e-681f-49cd-8d28-029cae3b4bf5\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " Dec 05 13:07:59.571686 master-0 kubenswrapper[29936]: I1205 13:07:59.571649 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-ovsdbserver-sb\") pod \"d702847e-681f-49cd-8d28-029cae3b4bf5\" (UID: \"d702847e-681f-49cd-8d28-029cae3b4bf5\") " Dec 05 13:07:59.584236 master-0 kubenswrapper[29936]: I1205 13:07:59.584122 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d702847e-681f-49cd-8d28-029cae3b4bf5-kube-api-access-9mv4r" (OuterVolumeSpecName: "kube-api-access-9mv4r") pod "d702847e-681f-49cd-8d28-029cae3b4bf5" (UID: "d702847e-681f-49cd-8d28-029cae3b4bf5"). InnerVolumeSpecName "kube-api-access-9mv4r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:07:59.631604 master-0 kubenswrapper[29936]: I1205 13:07:59.631540 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-01e7e80a-75bb-4db7-8f54-38e48a37b9fe\" (UniqueName: \"kubernetes.io/csi/topolvm.io^dfda101a-9e62-4735-944c-c4776fa9490c\") pod \"glance-b46d8-default-external-api-0\" (UID: \"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa\") " pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:59.676992 master-0 kubenswrapper[29936]: I1205 13:07:59.676869 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mv4r\" (UniqueName: \"kubernetes.io/projected/d702847e-681f-49cd-8d28-029cae3b4bf5-kube-api-access-9mv4r\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:59.677783 master-0 kubenswrapper[29936]: I1205 13:07:59.677707 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d702847e-681f-49cd-8d28-029cae3b4bf5" (UID: "d702847e-681f-49cd-8d28-029cae3b4bf5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:59.715217 master-0 kubenswrapper[29936]: I1205 13:07:59.691137 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:07:59.715217 master-0 kubenswrapper[29936]: I1205 13:07:59.697752 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d702847e-681f-49cd-8d28-029cae3b4bf5" (UID: "d702847e-681f-49cd-8d28-029cae3b4bf5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:59.715217 master-0 kubenswrapper[29936]: I1205 13:07:59.697774 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d702847e-681f-49cd-8d28-029cae3b4bf5" (UID: "d702847e-681f-49cd-8d28-029cae3b4bf5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:59.715217 master-0 kubenswrapper[29936]: I1205 13:07:59.712720 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-config" (OuterVolumeSpecName: "config") pod "d702847e-681f-49cd-8d28-029cae3b4bf5" (UID: "d702847e-681f-49cd-8d28-029cae3b4bf5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:59.733321 master-0 kubenswrapper[29936]: I1205 13:07:59.728347 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d702847e-681f-49cd-8d28-029cae3b4bf5" (UID: "d702847e-681f-49cd-8d28-029cae3b4bf5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:07:59.785931 master-0 kubenswrapper[29936]: I1205 13:07:59.785148 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:59.785931 master-0 kubenswrapper[29936]: I1205 13:07:59.785240 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:59.785931 master-0 kubenswrapper[29936]: I1205 13:07:59.785256 29936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:59.785931 master-0 kubenswrapper[29936]: I1205 13:07:59.785271 29936 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:59.785931 master-0 kubenswrapper[29936]: I1205 13:07:59.785293 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d702847e-681f-49cd-8d28-029cae3b4bf5-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:07:59.794295 master-0 kubenswrapper[29936]: I1205 13:07:59.794225 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:07:59.794295 master-0 kubenswrapper[29936]: I1205 13:07:59.794290 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:07:59.872585 master-0 kubenswrapper[29936]: I1205 13:07:59.872515 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:08:00.387714 master-0 kubenswrapper[29936]: I1205 13:08:00.387611 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qjcct" event={"ID":"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c","Type":"ContainerStarted","Data":"c5c3c9d99b9cb382b7cfc3707f42652bb49007f2ba63ec5ebb467382d83956b0"} Dec 05 13:08:00.391431 master-0 kubenswrapper[29936]: I1205 13:08:00.391371 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" Dec 05 13:08:00.392793 master-0 kubenswrapper[29936]: I1205 13:08:00.392721 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65b88b76d9-2zcht" event={"ID":"d702847e-681f-49cd-8d28-029cae3b4bf5","Type":"ContainerDied","Data":"f8f812c9ba0805d00d6121d8150974665587b0fb45fd0ae13ab0b45bcf77f873"} Dec 05 13:08:00.392909 master-0 kubenswrapper[29936]: I1205 13:08:00.392808 29936 scope.go:117] "RemoveContainer" containerID="fd92afb878bf74c89dca66d4fb5262c276ca5ba46225700da9eae52d70d2b55f" Dec 05 13:08:00.426858 master-0 kubenswrapper[29936]: I1205 13:08:00.426774 29936 scope.go:117] "RemoveContainer" containerID="c7b2025fb91d0655b2bb65da82b4d5b5f98b53899cf8e140a02d94f7d2fccabc" Dec 05 13:08:00.500299 master-0 kubenswrapper[29936]: I1205 13:08:00.500220 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b46d8-default-external-api-0"] Dec 05 13:08:00.528808 master-0 kubenswrapper[29936]: I1205 13:08:00.528520 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0d3529a7-2405-43a7-8986-74c66fb23772\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d1cb910c-9f97-4053-a169-7a3b9b7d4e4f\") pod \"glance-b46d8-default-internal-api-0\" (UID: \"d5fa0c51-8a6d-4d82-b035-3612f0acf729\") " pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:08:00.573958 master-0 kubenswrapper[29936]: I1205 13:08:00.573713 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65b88b76d9-2zcht"] Dec 05 13:08:00.593065 master-0 kubenswrapper[29936]: I1205 13:08:00.592939 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65b88b76d9-2zcht"] Dec 05 13:08:00.688220 master-0 kubenswrapper[29936]: I1205 13:08:00.681151 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:08:00.753578 master-0 kubenswrapper[29936]: I1205 13:08:00.753507 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:08:00.753578 master-0 kubenswrapper[29936]: I1205 13:08:00.753585 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:08:01.214133 master-0 kubenswrapper[29936]: I1205 13:08:01.214048 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d702847e-681f-49cd-8d28-029cae3b4bf5" path="/var/lib/kubelet/pods/d702847e-681f-49cd-8d28-029cae3b4bf5/volumes" Dec 05 13:08:01.412618 master-0 kubenswrapper[29936]: I1205 13:08:01.412311 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-external-api-0" event={"ID":"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa","Type":"ContainerStarted","Data":"304bea9a97ad076ab536a85df5f49717dbf1f2056a70284bac0c00257021e8c5"} Dec 05 13:08:01.827013 master-0 kubenswrapper[29936]: I1205 13:08:01.826922 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gbdmx" podUID="04779889-6879-4759-8928-4d0b0ed2eae2" containerName="registry-server" probeResult="failure" output=< Dec 05 13:08:01.827013 master-0 kubenswrapper[29936]: timeout: failed to connect service ":50051" within 1s Dec 05 13:08:01.827013 master-0 kubenswrapper[29936]: > Dec 05 13:08:06.484812 master-0 kubenswrapper[29936]: I1205 13:08:06.484732 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-external-api-0" event={"ID":"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa","Type":"ContainerStarted","Data":"d88daeeffb74884fbfe3d2e6856af37f47776f61f262cd88b93613a1ab36f513"} Dec 05 13:08:08.634212 master-0 kubenswrapper[29936]: I1205 13:08:08.623118 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b46d8-default-internal-api-0"] Dec 05 13:08:09.850233 master-0 kubenswrapper[29936]: I1205 13:08:09.850130 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:08:09.936816 master-0 kubenswrapper[29936]: I1205 13:08:09.936725 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vt8w4"] Dec 05 13:08:10.586602 master-0 kubenswrapper[29936]: I1205 13:08:10.576807 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vt8w4" podUID="df7431e9-8625-453c-82d8-af5e79106c65" containerName="registry-server" containerID="cri-o://fbfd9884d3057456a6e65f13c28bdcbca9243bbfccaeeb7d0a09ca3928cfb03c" gracePeriod=2 Dec 05 13:08:10.812839 master-0 kubenswrapper[29936]: I1205 13:08:10.812745 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:08:10.860221 master-0 kubenswrapper[29936]: W1205 13:08:10.860107 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5fa0c51_8a6d_4d82_b035_3612f0acf729.slice/crio-0a2d198d87c0f1ed730710caa6eaaa36f7cf5e8c109f3c637c4861a386cf5a02 WatchSource:0}: Error finding container 0a2d198d87c0f1ed730710caa6eaaa36f7cf5e8c109f3c637c4861a386cf5a02: Status 404 returned error can't find the container with id 
0a2d198d87c0f1ed730710caa6eaaa36f7cf5e8c109f3c637c4861a386cf5a02 Dec 05 13:08:10.887773 master-0 kubenswrapper[29936]: I1205 13:08:10.887701 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:08:11.525759 master-0 kubenswrapper[29936]: I1205 13:08:11.525669 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gbdmx"] Dec 05 13:08:11.596759 master-0 kubenswrapper[29936]: I1205 13:08:11.596633 29936 generic.go:334] "Generic (PLEG): container finished" podID="df7431e9-8625-453c-82d8-af5e79106c65" containerID="fbfd9884d3057456a6e65f13c28bdcbca9243bbfccaeeb7d0a09ca3928cfb03c" exitCode=0 Dec 05 13:08:11.596759 master-0 kubenswrapper[29936]: I1205 13:08:11.596736 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vt8w4" event={"ID":"df7431e9-8625-453c-82d8-af5e79106c65","Type":"ContainerDied","Data":"fbfd9884d3057456a6e65f13c28bdcbca9243bbfccaeeb7d0a09ca3928cfb03c"} Dec 05 13:08:11.598874 master-0 kubenswrapper[29936]: I1205 13:08:11.598836 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-internal-api-0" event={"ID":"d5fa0c51-8a6d-4d82-b035-3612f0acf729","Type":"ContainerStarted","Data":"0a2d198d87c0f1ed730710caa6eaaa36f7cf5e8c109f3c637c4861a386cf5a02"} Dec 05 13:08:12.616667 master-0 kubenswrapper[29936]: I1205 13:08:12.616516 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gbdmx" podUID="04779889-6879-4759-8928-4d0b0ed2eae2" containerName="registry-server" containerID="cri-o://7cb6964247abf6862572dbe8c94b91a535418fa5d4c3ca1017aafe535e03176f" gracePeriod=2 Dec 05 13:08:13.638925 master-0 kubenswrapper[29936]: I1205 13:08:13.638825 29936 generic.go:334] "Generic (PLEG): container finished" podID="04779889-6879-4759-8928-4d0b0ed2eae2" containerID="7cb6964247abf6862572dbe8c94b91a535418fa5d4c3ca1017aafe535e03176f" exitCode=0 Dec 05 13:08:13.638925 master-0 kubenswrapper[29936]: I1205 13:08:13.638915 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gbdmx" event={"ID":"04779889-6879-4759-8928-4d0b0ed2eae2","Type":"ContainerDied","Data":"7cb6964247abf6862572dbe8c94b91a535418fa5d4c3ca1017aafe535e03176f"} Dec 05 13:08:14.797548 master-0 kubenswrapper[29936]: I1205 13:08:14.797487 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:08:14.859282 master-0 kubenswrapper[29936]: I1205 13:08:14.859143 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl4c7\" (UniqueName: \"kubernetes.io/projected/df7431e9-8625-453c-82d8-af5e79106c65-kube-api-access-fl4c7\") pod \"df7431e9-8625-453c-82d8-af5e79106c65\" (UID: \"df7431e9-8625-453c-82d8-af5e79106c65\") " Dec 05 13:08:14.859753 master-0 kubenswrapper[29936]: I1205 13:08:14.859341 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df7431e9-8625-453c-82d8-af5e79106c65-catalog-content\") pod \"df7431e9-8625-453c-82d8-af5e79106c65\" (UID: \"df7431e9-8625-453c-82d8-af5e79106c65\") " Dec 05 13:08:14.859824 master-0 kubenswrapper[29936]: I1205 13:08:14.859770 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df7431e9-8625-453c-82d8-af5e79106c65-utilities\") pod \"df7431e9-8625-453c-82d8-af5e79106c65\" (UID: \"df7431e9-8625-453c-82d8-af5e79106c65\") " Dec 05 13:08:14.861291 master-0 kubenswrapper[29936]: I1205 13:08:14.861250 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df7431e9-8625-453c-82d8-af5e79106c65-utilities" (OuterVolumeSpecName: "utilities") pod "df7431e9-8625-453c-82d8-af5e79106c65" (UID: "df7431e9-8625-453c-82d8-af5e79106c65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:08:14.885015 master-0 kubenswrapper[29936]: I1205 13:08:14.884917 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df7431e9-8625-453c-82d8-af5e79106c65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df7431e9-8625-453c-82d8-af5e79106c65" (UID: "df7431e9-8625-453c-82d8-af5e79106c65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:08:14.885428 master-0 kubenswrapper[29936]: I1205 13:08:14.885048 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df7431e9-8625-453c-82d8-af5e79106c65-kube-api-access-fl4c7" (OuterVolumeSpecName: "kube-api-access-fl4c7") pod "df7431e9-8625-453c-82d8-af5e79106c65" (UID: "df7431e9-8625-453c-82d8-af5e79106c65"). InnerVolumeSpecName "kube-api-access-fl4c7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:08:14.963860 master-0 kubenswrapper[29936]: I1205 13:08:14.963769 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df7431e9-8625-453c-82d8-af5e79106c65-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 13:08:14.963860 master-0 kubenswrapper[29936]: I1205 13:08:14.963847 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl4c7\" (UniqueName: \"kubernetes.io/projected/df7431e9-8625-453c-82d8-af5e79106c65-kube-api-access-fl4c7\") on node \"master-0\" DevicePath \"\"" Dec 05 13:08:14.963860 master-0 kubenswrapper[29936]: I1205 13:08:14.963862 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df7431e9-8625-453c-82d8-af5e79106c65-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 13:08:15.240629 master-0 kubenswrapper[29936]: I1205 13:08:15.240584 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:08:15.274609 master-0 kubenswrapper[29936]: I1205 13:08:15.274502 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04779889-6879-4759-8928-4d0b0ed2eae2-utilities\") pod \"04779889-6879-4759-8928-4d0b0ed2eae2\" (UID: \"04779889-6879-4759-8928-4d0b0ed2eae2\") " Dec 05 13:08:15.274772 master-0 kubenswrapper[29936]: I1205 13:08:15.274629 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swrwj\" (UniqueName: \"kubernetes.io/projected/04779889-6879-4759-8928-4d0b0ed2eae2-kube-api-access-swrwj\") pod \"04779889-6879-4759-8928-4d0b0ed2eae2\" (UID: \"04779889-6879-4759-8928-4d0b0ed2eae2\") " Dec 05 13:08:15.274823 master-0 kubenswrapper[29936]: I1205 13:08:15.274774 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04779889-6879-4759-8928-4d0b0ed2eae2-catalog-content\") pod \"04779889-6879-4759-8928-4d0b0ed2eae2\" (UID: \"04779889-6879-4759-8928-4d0b0ed2eae2\") " Dec 05 13:08:15.277827 master-0 kubenswrapper[29936]: I1205 13:08:15.277777 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04779889-6879-4759-8928-4d0b0ed2eae2-utilities" (OuterVolumeSpecName: "utilities") pod "04779889-6879-4759-8928-4d0b0ed2eae2" (UID: "04779889-6879-4759-8928-4d0b0ed2eae2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:08:15.280525 master-0 kubenswrapper[29936]: I1205 13:08:15.280498 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04779889-6879-4759-8928-4d0b0ed2eae2-kube-api-access-swrwj" (OuterVolumeSpecName: "kube-api-access-swrwj") pod "04779889-6879-4759-8928-4d0b0ed2eae2" (UID: "04779889-6879-4759-8928-4d0b0ed2eae2"). InnerVolumeSpecName "kube-api-access-swrwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:08:15.283549 master-0 kubenswrapper[29936]: I1205 13:08:15.283525 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Dec 05 13:08:15.378406 master-0 kubenswrapper[29936]: I1205 13:08:15.378330 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04779889-6879-4759-8928-4d0b0ed2eae2-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 13:08:15.378406 master-0 kubenswrapper[29936]: I1205 13:08:15.378402 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swrwj\" (UniqueName: \"kubernetes.io/projected/04779889-6879-4759-8928-4d0b0ed2eae2-kube-api-access-swrwj\") on node \"master-0\" DevicePath \"\"" Dec 05 13:08:15.460355 master-0 kubenswrapper[29936]: I1205 13:08:15.460276 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04779889-6879-4759-8928-4d0b0ed2eae2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04779889-6879-4759-8928-4d0b0ed2eae2" (UID: "04779889-6879-4759-8928-4d0b0ed2eae2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:08:15.481277 master-0 kubenswrapper[29936]: I1205 13:08:15.481193 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04779889-6879-4759-8928-4d0b0ed2eae2-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 13:08:15.677621 master-0 kubenswrapper[29936]: I1205 13:08:15.677546 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qjcct" event={"ID":"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c","Type":"ContainerStarted","Data":"76b7953df4c4e4e5ab492f23b5832f9e96dfc878f6c084b9ba459cac0ec279f3"} Dec 05 13:08:15.683851 master-0 kubenswrapper[29936]: I1205 13:08:15.683783 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gbdmx" event={"ID":"04779889-6879-4759-8928-4d0b0ed2eae2","Type":"ContainerDied","Data":"5f9136aadda3bfbe68d096a3d99b2dc3af0af20d0e830746d9e01ce84421b0b7"} Dec 05 13:08:15.683957 master-0 kubenswrapper[29936]: I1205 13:08:15.683865 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gbdmx" Dec 05 13:08:15.684091 master-0 kubenswrapper[29936]: I1205 13:08:15.683880 29936 scope.go:117] "RemoveContainer" containerID="7cb6964247abf6862572dbe8c94b91a535418fa5d4c3ca1017aafe535e03176f" Dec 05 13:08:15.692246 master-0 kubenswrapper[29936]: I1205 13:08:15.692132 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vt8w4" event={"ID":"df7431e9-8625-453c-82d8-af5e79106c65","Type":"ContainerDied","Data":"7e186d7cf14d57ec154600822a40e6ee8d79f5d0ce721be347ca5d5344b6c241"} Dec 05 13:08:15.692419 master-0 kubenswrapper[29936]: I1205 13:08:15.692212 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vt8w4" Dec 05 13:08:15.702482 master-0 kubenswrapper[29936]: I1205 13:08:15.702198 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qjcct" podStartSLOduration=2.806964082 podStartE2EDuration="18.702152584s" podCreationTimestamp="2025-12-05 13:07:57 +0000 UTC" firstStartedPulling="2025-12-05 13:07:59.410568434 +0000 UTC m=+1076.542648115" lastFinishedPulling="2025-12-05 13:08:15.305756936 +0000 UTC m=+1092.437836617" observedRunningTime="2025-12-05 13:08:15.697755158 +0000 UTC m=+1092.829834839" watchObservedRunningTime="2025-12-05 13:08:15.702152584 +0000 UTC m=+1092.834232265" Dec 05 13:08:15.771376 master-0 kubenswrapper[29936]: I1205 13:08:15.771302 29936 scope.go:117] "RemoveContainer" containerID="f827acfbf32c26bca61492c0f9bf9a4be131282cd1ba603aff56047a3be9bb4f" Dec 05 13:08:15.796110 master-0 kubenswrapper[29936]: I1205 13:08:15.796010 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vt8w4"] Dec 05 13:08:15.819619 master-0 kubenswrapper[29936]: I1205 13:08:15.819488 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vt8w4"] Dec 05 13:08:15.830249 master-0 kubenswrapper[29936]: I1205 13:08:15.830203 29936 scope.go:117] "RemoveContainer" containerID="e0b3151d38a97d8df9eb9d93526631002933582acba759042db83ae719d71acb" Dec 05 13:08:15.836404 master-0 kubenswrapper[29936]: I1205 13:08:15.836036 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gbdmx"] Dec 05 13:08:15.853622 master-0 kubenswrapper[29936]: I1205 13:08:15.852549 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gbdmx"] Dec 05 13:08:15.894285 master-0 kubenswrapper[29936]: I1205 13:08:15.894234 29936 scope.go:117] "RemoveContainer" containerID="fbfd9884d3057456a6e65f13c28bdcbca9243bbfccaeeb7d0a09ca3928cfb03c" Dec 05 13:08:15.973090 master-0 kubenswrapper[29936]: I1205 13:08:15.973049 29936 scope.go:117] "RemoveContainer" containerID="3e174302d928304084899d6919b27c77b1c1e2c681d314e39c76d2834a936eb0" Dec 05 13:08:16.090781 master-0 kubenswrapper[29936]: I1205 13:08:16.090722 29936 scope.go:117] "RemoveContainer" containerID="c52eed55412468e9aa9d8e1b60e274a8fd502a24537c226d30a722785012443e" Dec 05 13:08:16.721595 master-0 kubenswrapper[29936]: I1205 13:08:16.721492 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"cc2221d6-014e-4bd4-962b-24512ebf84e8","Type":"ContainerStarted","Data":"86971eb55cc065fffdc859914c0e627f6872d1fbeee9c64f5dd72cbea2af43e6"} Dec 05 13:08:16.726071 master-0 kubenswrapper[29936]: I1205 13:08:16.726004 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-internal-api-0" event={"ID":"d5fa0c51-8a6d-4d82-b035-3612f0acf729","Type":"ContainerStarted","Data":"a30dad78a97cfa9fb6b1e954cae6c64d03f77acbe39675d6068ad91951f554d7"} Dec 05 13:08:16.726330 master-0 kubenswrapper[29936]: I1205 13:08:16.726093 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-internal-api-0" event={"ID":"d5fa0c51-8a6d-4d82-b035-3612f0acf729","Type":"ContainerStarted","Data":"4457388c68a54787405a2b89bb5ac7dcde3e0d2c4868cc60eaa6e62f81412b6e"} Dec 05 13:08:16.729757 master-0 kubenswrapper[29936]: I1205 13:08:16.729626 29936 generic.go:334] "Generic (PLEG): container 
finished" podID="af1fc46b-7167-4446-9587-7ef591b4e661" containerID="7f7e91eff420b0e8ccc45a3f4b880805e49a60ce4272ba2073c32c54e3183dec" exitCode=0 Dec 05 13:08:16.729757 master-0 kubenswrapper[29936]: I1205 13:08:16.729689 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"af1fc46b-7167-4446-9587-7ef591b4e661","Type":"ContainerDied","Data":"7f7e91eff420b0e8ccc45a3f4b880805e49a60ce4272ba2073c32c54e3183dec"} Dec 05 13:08:16.742108 master-0 kubenswrapper[29936]: I1205 13:08:16.742030 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b46d8-default-external-api-0" event={"ID":"7b2eb1a0-8caa-4c08-8008-05620fe8f5fa","Type":"ContainerStarted","Data":"57b689dfe9d7f52b222a5eacc118504a76575737b8b963989fbf2d5d51980646"} Dec 05 13:08:16.822035 master-0 kubenswrapper[29936]: I1205 13:08:16.821869 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b46d8-default-internal-api-0" podStartSLOduration=18.821835891 podStartE2EDuration="18.821835891s" podCreationTimestamp="2025-12-05 13:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:08:16.793034446 +0000 UTC m=+1093.925114137" watchObservedRunningTime="2025-12-05 13:08:16.821835891 +0000 UTC m=+1093.953915572" Dec 05 13:08:16.897405 master-0 kubenswrapper[29936]: I1205 13:08:16.897259 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-b46d8-default-external-api-0" podStartSLOduration=18.897231581 podStartE2EDuration="18.897231581s" podCreationTimestamp="2025-12-05 13:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:08:16.830425209 +0000 UTC m=+1093.962504890" watchObservedRunningTime="2025-12-05 13:08:16.897231581 +0000 UTC m=+1094.029311262" Dec 05 13:08:17.209709 master-0 kubenswrapper[29936]: I1205 13:08:17.209555 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04779889-6879-4759-8928-4d0b0ed2eae2" path="/var/lib/kubelet/pods/04779889-6879-4759-8928-4d0b0ed2eae2/volumes" Dec 05 13:08:17.211040 master-0 kubenswrapper[29936]: I1205 13:08:17.210996 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df7431e9-8625-453c-82d8-af5e79106c65" path="/var/lib/kubelet/pods/df7431e9-8625-453c-82d8-af5e79106c65/volumes" Dec 05 13:08:17.760565 master-0 kubenswrapper[29936]: I1205 13:08:17.760472 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"af1fc46b-7167-4446-9587-7ef591b4e661","Type":"ContainerStarted","Data":"35ca70cd714de271fb9b404cfac8f5c16edee0ce8b9b0f369e46671c6fd5049c"} Dec 05 13:08:18.785205 master-0 kubenswrapper[29936]: I1205 13:08:18.785078 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"af1fc46b-7167-4446-9587-7ef591b4e661","Type":"ContainerStarted","Data":"17eb0678ae8b8dd5cdbadc08311a98bb353918f009164a5e3516e9a237f474e9"} Dec 05 13:08:18.785205 master-0 kubenswrapper[29936]: I1205 13:08:18.785157 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"af1fc46b-7167-4446-9587-7ef591b4e661","Type":"ContainerStarted","Data":"a00807fa86a2462d69a28afd32a2773d17486bc8e5266c136f7d828e8555bcc1"} Dec 05 13:08:19.692229 master-0 kubenswrapper[29936]: I1205 13:08:19.691055 29936 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:08:19.692330 master-0 kubenswrapper[29936]: I1205 13:08:19.692239 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:08:19.729863 master-0 kubenswrapper[29936]: I1205 13:08:19.729281 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:08:19.746716 master-0 kubenswrapper[29936]: I1205 13:08:19.746624 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:08:19.820665 master-0 kubenswrapper[29936]: I1205 13:08:19.820516 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"af1fc46b-7167-4446-9587-7ef591b4e661","Type":"ContainerStarted","Data":"303c505c018cc6b73714a99eb9bbb3618572c7e53389a3c2b77d1c02b958fd5e"} Dec 05 13:08:19.821342 master-0 kubenswrapper[29936]: I1205 13:08:19.820713 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"af1fc46b-7167-4446-9587-7ef591b4e661","Type":"ContainerStarted","Data":"2a9e6c94f04c793e452eb3991931f88b988a9c3cd96e2dcfd1876248e4a5e319"} Dec 05 13:08:19.821342 master-0 kubenswrapper[29936]: I1205 13:08:19.820852 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:08:19.821342 master-0 kubenswrapper[29936]: I1205 13:08:19.820876 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:08:19.821342 master-0 kubenswrapper[29936]: I1205 13:08:19.820888 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Dec 05 13:08:19.956203 master-0 kubenswrapper[29936]: I1205 13:08:19.954479 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-0" podStartSLOduration=7.680518697 podStartE2EDuration="27.954458888s" podCreationTimestamp="2025-12-05 13:07:52 +0000 UTC" firstStartedPulling="2025-12-05 13:07:55.006512284 +0000 UTC m=+1072.138591965" lastFinishedPulling="2025-12-05 13:08:15.280452475 +0000 UTC m=+1092.412532156" observedRunningTime="2025-12-05 13:08:19.953563686 +0000 UTC m=+1097.085643377" watchObservedRunningTime="2025-12-05 13:08:19.954458888 +0000 UTC m=+1097.086538569" Dec 05 13:08:20.682189 master-0 kubenswrapper[29936]: I1205 13:08:20.682087 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:08:20.682495 master-0 kubenswrapper[29936]: I1205 13:08:20.682217 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:08:20.721360 master-0 kubenswrapper[29936]: I1205 13:08:20.721242 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:08:20.741764 master-0 kubenswrapper[29936]: I1205 13:08:20.741691 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:08:20.870140 master-0 kubenswrapper[29936]: I1205 13:08:20.868122 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:08:20.870140 master-0 kubenswrapper[29936]: I1205 13:08:20.868208 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:08:20.870140 master-0 kubenswrapper[29936]: I1205 13:08:20.869342 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Dec 05 13:08:22.422344 master-0 kubenswrapper[29936]: I1205 13:08:22.422262 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:08:22.427346 master-0 kubenswrapper[29936]: I1205 13:08:22.427284 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b46d8-default-external-api-0" Dec 05 13:08:22.703698 master-0 kubenswrapper[29936]: I1205 13:08:22.703647 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0" Dec 05 13:08:22.704255 master-0 kubenswrapper[29936]: I1205 13:08:22.704240 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Dec 05 13:08:22.704425 master-0 kubenswrapper[29936]: I1205 13:08:22.704355 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0" Dec 05 13:08:22.704529 master-0 kubenswrapper[29936]: I1205 13:08:22.704514 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Dec 05 13:08:22.744007 master-0 kubenswrapper[29936]: I1205 13:08:22.743928 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0" Dec 05 13:08:22.747194 master-0 kubenswrapper[29936]: I1205 13:08:22.747094 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0" Dec 05 13:08:22.757627 master-0 kubenswrapper[29936]: I1205 13:08:22.757566 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Dec 05 13:08:22.904453 master-0 kubenswrapper[29936]: I1205 13:08:22.904361 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Dec 05 13:08:22.906389 master-0 kubenswrapper[29936]: I1205 13:08:22.906351 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Dec 05 13:08:23.245468 master-0 kubenswrapper[29936]: I1205 13:08:23.245385 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:08:23.247791 master-0 kubenswrapper[29936]: I1205 13:08:23.247719 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-b46d8-default-internal-api-0" Dec 05 13:08:23.919573 master-0 kubenswrapper[29936]: I1205 13:08:23.919435 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Dec 05 13:08:34.101675 master-0 kubenswrapper[29936]: I1205 13:08:34.101499 29936 generic.go:334] "Generic (PLEG): container finished" podID="6e234efb-4fe5-4c70-a992-4bfb95d2fc4c" containerID="76b7953df4c4e4e5ab492f23b5832f9e96dfc878f6c084b9ba459cac0ec279f3" exitCode=0 Dec 05 13:08:34.101675 master-0 kubenswrapper[29936]: I1205 13:08:34.101575 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qjcct" 
event={"ID":"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c","Type":"ContainerDied","Data":"76b7953df4c4e4e5ab492f23b5832f9e96dfc878f6c084b9ba459cac0ec279f3"} Dec 05 13:08:35.655505 master-0 kubenswrapper[29936]: I1205 13:08:35.655435 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qjcct" Dec 05 13:08:35.724143 master-0 kubenswrapper[29936]: I1205 13:08:35.724037 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-config-data\") pod \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\" (UID: \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\") " Dec 05 13:08:35.724530 master-0 kubenswrapper[29936]: I1205 13:08:35.724193 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-combined-ca-bundle\") pod \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\" (UID: \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\") " Dec 05 13:08:35.724530 master-0 kubenswrapper[29936]: I1205 13:08:35.724262 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8lfp\" (UniqueName: \"kubernetes.io/projected/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-kube-api-access-p8lfp\") pod \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\" (UID: \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\") " Dec 05 13:08:35.724530 master-0 kubenswrapper[29936]: I1205 13:08:35.724316 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-scripts\") pod \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\" (UID: \"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c\") " Dec 05 13:08:35.729166 master-0 kubenswrapper[29936]: I1205 13:08:35.728695 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-kube-api-access-p8lfp" (OuterVolumeSpecName: "kube-api-access-p8lfp") pod "6e234efb-4fe5-4c70-a992-4bfb95d2fc4c" (UID: "6e234efb-4fe5-4c70-a992-4bfb95d2fc4c"). InnerVolumeSpecName "kube-api-access-p8lfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:08:35.729664 master-0 kubenswrapper[29936]: I1205 13:08:35.729614 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-scripts" (OuterVolumeSpecName: "scripts") pod "6e234efb-4fe5-4c70-a992-4bfb95d2fc4c" (UID: "6e234efb-4fe5-4c70-a992-4bfb95d2fc4c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:08:35.759916 master-0 kubenswrapper[29936]: I1205 13:08:35.759839 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e234efb-4fe5-4c70-a992-4bfb95d2fc4c" (UID: "6e234efb-4fe5-4c70-a992-4bfb95d2fc4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:08:35.760435 master-0 kubenswrapper[29936]: I1205 13:08:35.760372 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-config-data" (OuterVolumeSpecName: "config-data") pod "6e234efb-4fe5-4c70-a992-4bfb95d2fc4c" (UID: "6e234efb-4fe5-4c70-a992-4bfb95d2fc4c"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:08:35.827819 master-0 kubenswrapper[29936]: I1205 13:08:35.827634 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:08:35.827819 master-0 kubenswrapper[29936]: I1205 13:08:35.827704 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:08:35.827819 master-0 kubenswrapper[29936]: I1205 13:08:35.827719 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8lfp\" (UniqueName: \"kubernetes.io/projected/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-kube-api-access-p8lfp\") on node \"master-0\" DevicePath \"\"" Dec 05 13:08:35.827819 master-0 kubenswrapper[29936]: I1205 13:08:35.827733 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:08:36.140364 master-0 kubenswrapper[29936]: I1205 13:08:36.140124 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qjcct" event={"ID":"6e234efb-4fe5-4c70-a992-4bfb95d2fc4c","Type":"ContainerDied","Data":"c5c3c9d99b9cb382b7cfc3707f42652bb49007f2ba63ec5ebb467382d83956b0"} Dec 05 13:08:36.140364 master-0 kubenswrapper[29936]: I1205 13:08:36.140223 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5c3c9d99b9cb382b7cfc3707f42652bb49007f2ba63ec5ebb467382d83956b0" Dec 05 13:08:36.140364 master-0 kubenswrapper[29936]: I1205 13:08:36.140166 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qjcct" Dec 05 13:08:36.355307 master-0 kubenswrapper[29936]: I1205 13:08:36.354830 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 13:08:36.355909 master-0 kubenswrapper[29936]: E1205 13:08:36.355818 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e234efb-4fe5-4c70-a992-4bfb95d2fc4c" containerName="nova-cell0-conductor-db-sync" Dec 05 13:08:36.355909 master-0 kubenswrapper[29936]: I1205 13:08:36.355861 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e234efb-4fe5-4c70-a992-4bfb95d2fc4c" containerName="nova-cell0-conductor-db-sync" Dec 05 13:08:36.356036 master-0 kubenswrapper[29936]: E1205 13:08:36.355916 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04779889-6879-4759-8928-4d0b0ed2eae2" containerName="extract-utilities" Dec 05 13:08:36.356036 master-0 kubenswrapper[29936]: I1205 13:08:36.355931 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="04779889-6879-4759-8928-4d0b0ed2eae2" containerName="extract-utilities" Dec 05 13:08:36.356036 master-0 kubenswrapper[29936]: E1205 13:08:36.355979 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04779889-6879-4759-8928-4d0b0ed2eae2" containerName="extract-content" Dec 05 13:08:36.356036 master-0 kubenswrapper[29936]: I1205 13:08:36.355993 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="04779889-6879-4759-8928-4d0b0ed2eae2" containerName="extract-content" Dec 05 13:08:36.356036 master-0 kubenswrapper[29936]: E1205 13:08:36.356020 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d702847e-681f-49cd-8d28-029cae3b4bf5" containerName="init" Dec 05 13:08:36.356036 master-0 kubenswrapper[29936]: I1205 13:08:36.356032 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="d702847e-681f-49cd-8d28-029cae3b4bf5" containerName="init" Dec 05 13:08:36.356306 master-0 kubenswrapper[29936]: E1205 13:08:36.356090 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7431e9-8625-453c-82d8-af5e79106c65" containerName="extract-content" Dec 05 13:08:36.356306 master-0 kubenswrapper[29936]: I1205 13:08:36.356103 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7431e9-8625-453c-82d8-af5e79106c65" containerName="extract-content" Dec 05 13:08:36.356306 master-0 kubenswrapper[29936]: E1205 13:08:36.356126 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7431e9-8625-453c-82d8-af5e79106c65" containerName="registry-server" Dec 05 13:08:36.356306 master-0 kubenswrapper[29936]: I1205 13:08:36.356137 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7431e9-8625-453c-82d8-af5e79106c65" containerName="registry-server" Dec 05 13:08:36.356306 master-0 kubenswrapper[29936]: E1205 13:08:36.356158 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d702847e-681f-49cd-8d28-029cae3b4bf5" containerName="dnsmasq-dns" Dec 05 13:08:36.356306 master-0 kubenswrapper[29936]: I1205 13:08:36.356169 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="d702847e-681f-49cd-8d28-029cae3b4bf5" containerName="dnsmasq-dns" Dec 05 13:08:36.356306 master-0 kubenswrapper[29936]: E1205 13:08:36.356216 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7431e9-8625-453c-82d8-af5e79106c65" containerName="extract-utilities" Dec 05 13:08:36.356306 master-0 kubenswrapper[29936]: I1205 13:08:36.356230 29936 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="df7431e9-8625-453c-82d8-af5e79106c65" containerName="extract-utilities" Dec 05 13:08:36.356306 master-0 kubenswrapper[29936]: E1205 13:08:36.356263 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04779889-6879-4759-8928-4d0b0ed2eae2" containerName="registry-server" Dec 05 13:08:36.356306 master-0 kubenswrapper[29936]: I1205 13:08:36.356275 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="04779889-6879-4759-8928-4d0b0ed2eae2" containerName="registry-server" Dec 05 13:08:36.356723 master-0 kubenswrapper[29936]: I1205 13:08:36.356676 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="04779889-6879-4759-8928-4d0b0ed2eae2" containerName="registry-server" Dec 05 13:08:36.356805 master-0 kubenswrapper[29936]: I1205 13:08:36.356772 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="df7431e9-8625-453c-82d8-af5e79106c65" containerName="registry-server" Dec 05 13:08:36.356882 master-0 kubenswrapper[29936]: I1205 13:08:36.356814 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="d702847e-681f-49cd-8d28-029cae3b4bf5" containerName="dnsmasq-dns" Dec 05 13:08:36.356882 master-0 kubenswrapper[29936]: I1205 13:08:36.356838 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e234efb-4fe5-4c70-a992-4bfb95d2fc4c" containerName="nova-cell0-conductor-db-sync" Dec 05 13:08:36.358900 master-0 kubenswrapper[29936]: I1205 13:08:36.358347 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 13:08:36.361367 master-0 kubenswrapper[29936]: I1205 13:08:36.360918 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 13:08:36.412029 master-0 kubenswrapper[29936]: I1205 13:08:36.411965 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 13:08:36.459552 master-0 kubenswrapper[29936]: I1205 13:08:36.458990 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1db1bc08-dd34-4b8e-9e69-bf0ea19a4072-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1db1bc08-dd34-4b8e-9e69-bf0ea19a4072\") " pod="openstack/nova-cell0-conductor-0" Dec 05 13:08:36.460000 master-0 kubenswrapper[29936]: I1205 13:08:36.459722 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db1bc08-dd34-4b8e-9e69-bf0ea19a4072-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1db1bc08-dd34-4b8e-9e69-bf0ea19a4072\") " pod="openstack/nova-cell0-conductor-0" Dec 05 13:08:36.460000 master-0 kubenswrapper[29936]: I1205 13:08:36.459967 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9bn5\" (UniqueName: \"kubernetes.io/projected/1db1bc08-dd34-4b8e-9e69-bf0ea19a4072-kube-api-access-w9bn5\") pod \"nova-cell0-conductor-0\" (UID: \"1db1bc08-dd34-4b8e-9e69-bf0ea19a4072\") " pod="openstack/nova-cell0-conductor-0" Dec 05 13:08:36.562524 master-0 kubenswrapper[29936]: I1205 13:08:36.562420 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db1bc08-dd34-4b8e-9e69-bf0ea19a4072-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1db1bc08-dd34-4b8e-9e69-bf0ea19a4072\") " 
pod="openstack/nova-cell0-conductor-0" Dec 05 13:08:36.562884 master-0 kubenswrapper[29936]: I1205 13:08:36.562621 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9bn5\" (UniqueName: \"kubernetes.io/projected/1db1bc08-dd34-4b8e-9e69-bf0ea19a4072-kube-api-access-w9bn5\") pod \"nova-cell0-conductor-0\" (UID: \"1db1bc08-dd34-4b8e-9e69-bf0ea19a4072\") " pod="openstack/nova-cell0-conductor-0" Dec 05 13:08:36.562884 master-0 kubenswrapper[29936]: I1205 13:08:36.562655 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1db1bc08-dd34-4b8e-9e69-bf0ea19a4072-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1db1bc08-dd34-4b8e-9e69-bf0ea19a4072\") " pod="openstack/nova-cell0-conductor-0" Dec 05 13:08:36.569010 master-0 kubenswrapper[29936]: I1205 13:08:36.568962 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1db1bc08-dd34-4b8e-9e69-bf0ea19a4072-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1db1bc08-dd34-4b8e-9e69-bf0ea19a4072\") " pod="openstack/nova-cell0-conductor-0" Dec 05 13:08:36.569615 master-0 kubenswrapper[29936]: I1205 13:08:36.569590 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1db1bc08-dd34-4b8e-9e69-bf0ea19a4072-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1db1bc08-dd34-4b8e-9e69-bf0ea19a4072\") " pod="openstack/nova-cell0-conductor-0" Dec 05 13:08:36.581738 master-0 kubenswrapper[29936]: I1205 13:08:36.581661 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9bn5\" (UniqueName: \"kubernetes.io/projected/1db1bc08-dd34-4b8e-9e69-bf0ea19a4072-kube-api-access-w9bn5\") pod \"nova-cell0-conductor-0\" (UID: \"1db1bc08-dd34-4b8e-9e69-bf0ea19a4072\") " pod="openstack/nova-cell0-conductor-0" Dec 05 13:08:36.727736 master-0 kubenswrapper[29936]: I1205 13:08:36.727635 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 05 13:08:37.259066 master-0 kubenswrapper[29936]: I1205 13:08:37.258986 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 05 13:08:37.260148 master-0 kubenswrapper[29936]: W1205 13:08:37.260078 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1db1bc08_dd34_4b8e_9e69_bf0ea19a4072.slice/crio-4f7300fe562e32e426eedf4dea79a1fd109e1c75d341d6877e54e7542fb74b3e WatchSource:0}: Error finding container 4f7300fe562e32e426eedf4dea79a1fd109e1c75d341d6877e54e7542fb74b3e: Status 404 returned error can't find the container with id 4f7300fe562e32e426eedf4dea79a1fd109e1c75d341d6877e54e7542fb74b3e Dec 05 13:08:38.186687 master-0 kubenswrapper[29936]: I1205 13:08:38.186601 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1db1bc08-dd34-4b8e-9e69-bf0ea19a4072","Type":"ContainerStarted","Data":"b57f1249aa1cb5f60664d05b152af1b7784333d1d9aa7dbf9898a20a9f2bdddb"} Dec 05 13:08:38.186687 master-0 kubenswrapper[29936]: I1205 13:08:38.186679 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1db1bc08-dd34-4b8e-9e69-bf0ea19a4072","Type":"ContainerStarted","Data":"4f7300fe562e32e426eedf4dea79a1fd109e1c75d341d6877e54e7542fb74b3e"} Dec 05 13:08:38.187590 master-0 kubenswrapper[29936]: I1205 13:08:38.186748 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 05 13:08:38.218913 master-0 kubenswrapper[29936]: I1205 13:08:38.218800 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.218768071 podStartE2EDuration="2.218768071s" podCreationTimestamp="2025-12-05 13:08:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:08:38.206365173 +0000 UTC m=+1115.338444864" watchObservedRunningTime="2025-12-05 13:08:38.218768071 +0000 UTC m=+1115.350847752" Dec 05 13:08:46.763963 master-0 kubenswrapper[29936]: I1205 13:08:46.763876 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 05 13:08:47.328429 master-0 kubenswrapper[29936]: I1205 13:08:47.328336 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-bmgf2"] Dec 05 13:08:47.331977 master-0 kubenswrapper[29936]: I1205 13:08:47.331900 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bmgf2" Dec 05 13:08:47.335874 master-0 kubenswrapper[29936]: I1205 13:08:47.335820 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 05 13:08:47.336198 master-0 kubenswrapper[29936]: I1205 13:08:47.336150 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 05 13:08:47.343254 master-0 kubenswrapper[29936]: I1205 13:08:47.343155 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bmgf2"] Dec 05 13:08:47.448207 master-0 kubenswrapper[29936]: I1205 13:08:47.441616 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgg9b\" (UniqueName: \"kubernetes.io/projected/3ebb85b0-cfc1-4d5b-af96-01798e97b809-kube-api-access-pgg9b\") pod \"nova-cell0-cell-mapping-bmgf2\" (UID: \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\") " pod="openstack/nova-cell0-cell-mapping-bmgf2" Dec 05 13:08:47.448207 master-0 kubenswrapper[29936]: I1205 13:08:47.441796 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bmgf2\" (UID: \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\") " pod="openstack/nova-cell0-cell-mapping-bmgf2" Dec 05 13:08:47.448207 master-0 kubenswrapper[29936]: I1205 13:08:47.441923 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-scripts\") pod \"nova-cell0-cell-mapping-bmgf2\" (UID: \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\") " pod="openstack/nova-cell0-cell-mapping-bmgf2" Dec 05 13:08:47.448207 master-0 kubenswrapper[29936]: I1205 13:08:47.442027 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-config-data\") pod \"nova-cell0-cell-mapping-bmgf2\" (UID: \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\") " pod="openstack/nova-cell0-cell-mapping-bmgf2" Dec 05 13:08:47.506383 master-0 kubenswrapper[29936]: I1205 13:08:47.503760 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Dec 05 13:08:47.508622 master-0 kubenswrapper[29936]: I1205 13:08:47.508574 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 05 13:08:47.524206 master-0 kubenswrapper[29936]: I1205 13:08:47.519849 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Dec 05 13:08:47.533271 master-0 kubenswrapper[29936]: I1205 13:08:47.532885 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-ironic-compute-config-data" Dec 05 13:08:47.548133 master-0 kubenswrapper[29936]: I1205 13:08:47.547692 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bmgf2\" (UID: \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\") " pod="openstack/nova-cell0-cell-mapping-bmgf2" Dec 05 13:08:47.548133 master-0 kubenswrapper[29936]: I1205 13:08:47.547915 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-scripts\") pod \"nova-cell0-cell-mapping-bmgf2\" (UID: \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\") " pod="openstack/nova-cell0-cell-mapping-bmgf2" Dec 05 13:08:47.548133 master-0 kubenswrapper[29936]: I1205 13:08:47.548085 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-config-data\") pod \"nova-cell0-cell-mapping-bmgf2\" (UID: \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\") " pod="openstack/nova-cell0-cell-mapping-bmgf2" Dec 05 13:08:47.548466 master-0 kubenswrapper[29936]: I1205 13:08:47.548204 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgg9b\" (UniqueName: \"kubernetes.io/projected/3ebb85b0-cfc1-4d5b-af96-01798e97b809-kube-api-access-pgg9b\") pod \"nova-cell0-cell-mapping-bmgf2\" (UID: \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\") " pod="openstack/nova-cell0-cell-mapping-bmgf2" Dec 05 13:08:47.576208 master-0 kubenswrapper[29936]: I1205 13:08:47.565300 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-config-data\") pod \"nova-cell0-cell-mapping-bmgf2\" (UID: \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\") " pod="openstack/nova-cell0-cell-mapping-bmgf2" Dec 05 13:08:47.576208 master-0 kubenswrapper[29936]: I1205 13:08:47.565636 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-scripts\") pod \"nova-cell0-cell-mapping-bmgf2\" (UID: \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\") " pod="openstack/nova-cell0-cell-mapping-bmgf2" Dec 05 13:08:47.576208 master-0 kubenswrapper[29936]: I1205 13:08:47.565878 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bmgf2\" (UID: \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\") " pod="openstack/nova-cell0-cell-mapping-bmgf2" Dec 05 13:08:47.598285 master-0 kubenswrapper[29936]: I1205 13:08:47.595857 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgg9b\" (UniqueName: \"kubernetes.io/projected/3ebb85b0-cfc1-4d5b-af96-01798e97b809-kube-api-access-pgg9b\") pod 
\"nova-cell0-cell-mapping-bmgf2\" (UID: \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\") " pod="openstack/nova-cell0-cell-mapping-bmgf2" Dec 05 13:08:47.632205 master-0 kubenswrapper[29936]: I1205 13:08:47.629870 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 13:08:47.675342 master-0 kubenswrapper[29936]: I1205 13:08:47.648822 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 13:08:47.675342 master-0 kubenswrapper[29936]: I1205 13:08:47.650309 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 05 13:08:47.675342 master-0 kubenswrapper[29936]: I1205 13:08:47.650418 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 05 13:08:47.675342 master-0 kubenswrapper[29936]: I1205 13:08:47.650493 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqljr\" (UniqueName: \"kubernetes.io/projected/ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877-kube-api-access-pqljr\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 05 13:08:47.675342 master-0 kubenswrapper[29936]: I1205 13:08:47.650742 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 13:08:47.684205 master-0 kubenswrapper[29936]: I1205 13:08:47.676150 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bmgf2" Dec 05 13:08:47.684205 master-0 kubenswrapper[29936]: I1205 13:08:47.678136 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 13:08:47.699218 master-0 kubenswrapper[29936]: I1205 13:08:47.696271 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:08:47.704405 master-0 kubenswrapper[29936]: I1205 13:08:47.700991 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 13:08:47.756751 master-0 kubenswrapper[29936]: I1205 13:08:47.740340 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 13:08:47.756751 master-0 kubenswrapper[29936]: I1205 13:08:47.740858 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 13:08:47.767204 master-0 kubenswrapper[29936]: I1205 13:08:47.757967 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85pnm\" (UniqueName: \"kubernetes.io/projected/7b1d92e5-7252-4099-9a92-d6566b00de62-kube-api-access-85pnm\") pod \"nova-api-0\" (UID: \"7b1d92e5-7252-4099-9a92-d6566b00de62\") " pod="openstack/nova-api-0" Dec 05 13:08:47.767204 master-0 kubenswrapper[29936]: I1205 13:08:47.758050 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0c6457-0252-4483-b3ee-8b530b0214e7-config-data\") pod \"nova-scheduler-0\" (UID: \"4c0c6457-0252-4483-b3ee-8b530b0214e7\") " pod="openstack/nova-scheduler-0" Dec 05 13:08:47.767204 master-0 kubenswrapper[29936]: I1205 13:08:47.758097 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0c6457-0252-4483-b3ee-8b530b0214e7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4c0c6457-0252-4483-b3ee-8b530b0214e7\") " pod="openstack/nova-scheduler-0" Dec 05 13:08:47.767204 master-0 kubenswrapper[29936]: I1205 13:08:47.758145 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b1d92e5-7252-4099-9a92-d6566b00de62-logs\") pod \"nova-api-0\" (UID: \"7b1d92e5-7252-4099-9a92-d6566b00de62\") " pod="openstack/nova-api-0" Dec 05 13:08:47.782220 master-0 kubenswrapper[29936]: I1205 13:08:47.775125 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 05 13:08:47.782220 master-0 kubenswrapper[29936]: I1205 13:08:47.775415 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b1d92e5-7252-4099-9a92-d6566b00de62-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7b1d92e5-7252-4099-9a92-d6566b00de62\") " pod="openstack/nova-api-0" Dec 05 13:08:47.782220 master-0 kubenswrapper[29936]: I1205 13:08:47.775824 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 05 13:08:47.804616 
master-0 kubenswrapper[29936]: I1205 13:08:47.803496 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b1d92e5-7252-4099-9a92-d6566b00de62-config-data\") pod \"nova-api-0\" (UID: \"7b1d92e5-7252-4099-9a92-d6566b00de62\") " pod="openstack/nova-api-0" Dec 05 13:08:47.804616 master-0 kubenswrapper[29936]: I1205 13:08:47.803705 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqljr\" (UniqueName: \"kubernetes.io/projected/ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877-kube-api-access-pqljr\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 05 13:08:47.804616 master-0 kubenswrapper[29936]: I1205 13:08:47.803817 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6hpt\" (UniqueName: \"kubernetes.io/projected/4c0c6457-0252-4483-b3ee-8b530b0214e7-kube-api-access-s6hpt\") pod \"nova-scheduler-0\" (UID: \"4c0c6457-0252-4483-b3ee-8b530b0214e7\") " pod="openstack/nova-scheduler-0" Dec 05 13:08:47.811492 master-0 kubenswrapper[29936]: I1205 13:08:47.811436 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 05 13:08:47.816318 master-0 kubenswrapper[29936]: I1205 13:08:47.814736 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 05 13:08:47.850614 master-0 kubenswrapper[29936]: I1205 13:08:47.850035 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqljr\" (UniqueName: \"kubernetes.io/projected/ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877-kube-api-access-pqljr\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 05 13:08:47.906697 master-0 kubenswrapper[29936]: I1205 13:08:47.906630 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b1d92e5-7252-4099-9a92-d6566b00de62-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7b1d92e5-7252-4099-9a92-d6566b00de62\") " pod="openstack/nova-api-0" Dec 05 13:08:47.907123 master-0 kubenswrapper[29936]: I1205 13:08:47.907104 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b1d92e5-7252-4099-9a92-d6566b00de62-config-data\") pod \"nova-api-0\" (UID: \"7b1d92e5-7252-4099-9a92-d6566b00de62\") " pod="openstack/nova-api-0" Dec 05 13:08:47.908201 master-0 kubenswrapper[29936]: I1205 13:08:47.907503 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6hpt\" (UniqueName: \"kubernetes.io/projected/4c0c6457-0252-4483-b3ee-8b530b0214e7-kube-api-access-s6hpt\") pod \"nova-scheduler-0\" (UID: \"4c0c6457-0252-4483-b3ee-8b530b0214e7\") " 
pod="openstack/nova-scheduler-0" Dec 05 13:08:47.908201 master-0 kubenswrapper[29936]: I1205 13:08:47.907819 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85pnm\" (UniqueName: \"kubernetes.io/projected/7b1d92e5-7252-4099-9a92-d6566b00de62-kube-api-access-85pnm\") pod \"nova-api-0\" (UID: \"7b1d92e5-7252-4099-9a92-d6566b00de62\") " pod="openstack/nova-api-0" Dec 05 13:08:47.908201 master-0 kubenswrapper[29936]: I1205 13:08:47.907894 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0c6457-0252-4483-b3ee-8b530b0214e7-config-data\") pod \"nova-scheduler-0\" (UID: \"4c0c6457-0252-4483-b3ee-8b530b0214e7\") " pod="openstack/nova-scheduler-0" Dec 05 13:08:47.908201 master-0 kubenswrapper[29936]: I1205 13:08:47.907992 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0c6457-0252-4483-b3ee-8b530b0214e7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4c0c6457-0252-4483-b3ee-8b530b0214e7\") " pod="openstack/nova-scheduler-0" Dec 05 13:08:47.908201 master-0 kubenswrapper[29936]: I1205 13:08:47.908097 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b1d92e5-7252-4099-9a92-d6566b00de62-logs\") pod \"nova-api-0\" (UID: \"7b1d92e5-7252-4099-9a92-d6566b00de62\") " pod="openstack/nova-api-0" Dec 05 13:08:47.908873 master-0 kubenswrapper[29936]: I1205 13:08:47.908845 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b1d92e5-7252-4099-9a92-d6566b00de62-logs\") pod \"nova-api-0\" (UID: \"7b1d92e5-7252-4099-9a92-d6566b00de62\") " pod="openstack/nova-api-0" Dec 05 13:08:47.947278 master-0 kubenswrapper[29936]: I1205 13:08:47.942647 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0c6457-0252-4483-b3ee-8b530b0214e7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4c0c6457-0252-4483-b3ee-8b530b0214e7\") " pod="openstack/nova-scheduler-0" Dec 05 13:08:47.947278 master-0 kubenswrapper[29936]: I1205 13:08:47.944452 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0c6457-0252-4483-b3ee-8b530b0214e7-config-data\") pod \"nova-scheduler-0\" (UID: \"4c0c6457-0252-4483-b3ee-8b530b0214e7\") " pod="openstack/nova-scheduler-0" Dec 05 13:08:47.947278 master-0 kubenswrapper[29936]: I1205 13:08:47.947263 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b1d92e5-7252-4099-9a92-d6566b00de62-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7b1d92e5-7252-4099-9a92-d6566b00de62\") " pod="openstack/nova-api-0" Dec 05 13:08:47.969279 master-0 kubenswrapper[29936]: I1205 13:08:47.968984 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85pnm\" (UniqueName: \"kubernetes.io/projected/7b1d92e5-7252-4099-9a92-d6566b00de62-kube-api-access-85pnm\") pod \"nova-api-0\" (UID: \"7b1d92e5-7252-4099-9a92-d6566b00de62\") " pod="openstack/nova-api-0" Dec 05 13:08:47.969279 master-0 kubenswrapper[29936]: I1205 13:08:47.969072 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7b1d92e5-7252-4099-9a92-d6566b00de62-config-data\") pod \"nova-api-0\" (UID: \"7b1d92e5-7252-4099-9a92-d6566b00de62\") " pod="openstack/nova-api-0" Dec 05 13:08:47.986097 master-0 kubenswrapper[29936]: I1205 13:08:47.986037 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6hpt\" (UniqueName: \"kubernetes.io/projected/4c0c6457-0252-4483-b3ee-8b530b0214e7-kube-api-access-s6hpt\") pod \"nova-scheduler-0\" (UID: \"4c0c6457-0252-4483-b3ee-8b530b0214e7\") " pod="openstack/nova-scheduler-0" Dec 05 13:08:48.039831 master-0 kubenswrapper[29936]: I1205 13:08:48.039755 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 05 13:08:48.084212 master-0 kubenswrapper[29936]: I1205 13:08:48.077904 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 13:08:48.084212 master-0 kubenswrapper[29936]: I1205 13:08:48.081592 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 13:08:48.089634 master-0 kubenswrapper[29936]: I1205 13:08:48.088730 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 13:08:48.174345 master-0 kubenswrapper[29936]: I1205 13:08:48.170727 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 13:08:48.242194 master-0 kubenswrapper[29936]: I1205 13:08:48.239532 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 13:08:48.265764 master-0 kubenswrapper[29936]: I1205 13:08:48.264161 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whvml\" (UniqueName: \"kubernetes.io/projected/5372734c-00e3-40c0-91f9-17a8e5610698-kube-api-access-whvml\") pod \"nova-metadata-0\" (UID: \"5372734c-00e3-40c0-91f9-17a8e5610698\") " pod="openstack/nova-metadata-0" Dec 05 13:08:48.265764 master-0 kubenswrapper[29936]: I1205 13:08:48.264594 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5372734c-00e3-40c0-91f9-17a8e5610698-logs\") pod \"nova-metadata-0\" (UID: \"5372734c-00e3-40c0-91f9-17a8e5610698\") " pod="openstack/nova-metadata-0" Dec 05 13:08:48.265764 master-0 kubenswrapper[29936]: I1205 13:08:48.264644 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5372734c-00e3-40c0-91f9-17a8e5610698-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5372734c-00e3-40c0-91f9-17a8e5610698\") " pod="openstack/nova-metadata-0" Dec 05 13:08:48.265764 master-0 kubenswrapper[29936]: I1205 13:08:48.264742 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5372734c-00e3-40c0-91f9-17a8e5610698-config-data\") pod \"nova-metadata-0\" (UID: \"5372734c-00e3-40c0-91f9-17a8e5610698\") " pod="openstack/nova-metadata-0" Dec 05 13:08:48.279684 master-0 kubenswrapper[29936]: I1205 13:08:48.270619 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 13:08:48.289332 master-0 kubenswrapper[29936]: I1205 13:08:48.289259 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] 
Dec 05 13:08:48.290117 master-0 kubenswrapper[29936]: I1205 13:08:48.289831 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:08:48.297886 master-0 kubenswrapper[29936]: I1205 13:08:48.297826 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 13:08:48.305036 master-0 kubenswrapper[29936]: I1205 13:08:48.304976 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 13:08:48.371619 master-0 kubenswrapper[29936]: I1205 13:08:48.371449 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bfe120-ee71-4c4f-8433-24164a6b82ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"52bfe120-ee71-4c4f-8433-24164a6b82ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:08:48.371994 master-0 kubenswrapper[29936]: I1205 13:08:48.371633 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whvml\" (UniqueName: \"kubernetes.io/projected/5372734c-00e3-40c0-91f9-17a8e5610698-kube-api-access-whvml\") pod \"nova-metadata-0\" (UID: \"5372734c-00e3-40c0-91f9-17a8e5610698\") " pod="openstack/nova-metadata-0" Dec 05 13:08:48.371994 master-0 kubenswrapper[29936]: I1205 13:08:48.371733 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5372734c-00e3-40c0-91f9-17a8e5610698-logs\") pod \"nova-metadata-0\" (UID: \"5372734c-00e3-40c0-91f9-17a8e5610698\") " pod="openstack/nova-metadata-0" Dec 05 13:08:48.371994 master-0 kubenswrapper[29936]: I1205 13:08:48.371774 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5372734c-00e3-40c0-91f9-17a8e5610698-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5372734c-00e3-40c0-91f9-17a8e5610698\") " pod="openstack/nova-metadata-0" Dec 05 13:08:48.371994 master-0 kubenswrapper[29936]: I1205 13:08:48.371846 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf5v5\" (UniqueName: \"kubernetes.io/projected/52bfe120-ee71-4c4f-8433-24164a6b82ac-kube-api-access-gf5v5\") pod \"nova-cell1-novncproxy-0\" (UID: \"52bfe120-ee71-4c4f-8433-24164a6b82ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:08:48.371994 master-0 kubenswrapper[29936]: I1205 13:08:48.371898 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5372734c-00e3-40c0-91f9-17a8e5610698-config-data\") pod \"nova-metadata-0\" (UID: \"5372734c-00e3-40c0-91f9-17a8e5610698\") " pod="openstack/nova-metadata-0" Dec 05 13:08:48.372413 master-0 kubenswrapper[29936]: I1205 13:08:48.372132 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bfe120-ee71-4c4f-8433-24164a6b82ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"52bfe120-ee71-4c4f-8433-24164a6b82ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:08:48.373513 master-0 kubenswrapper[29936]: I1205 13:08:48.373468 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5372734c-00e3-40c0-91f9-17a8e5610698-logs\") pod \"nova-metadata-0\" (UID: \"5372734c-00e3-40c0-91f9-17a8e5610698\") " pod="openstack/nova-metadata-0" Dec 05 13:08:48.377914 master-0 kubenswrapper[29936]: I1205 13:08:48.377828 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5372734c-00e3-40c0-91f9-17a8e5610698-config-data\") pod \"nova-metadata-0\" (UID: \"5372734c-00e3-40c0-91f9-17a8e5610698\") " pod="openstack/nova-metadata-0" Dec 05 13:08:48.400435 master-0 kubenswrapper[29936]: I1205 13:08:48.392462 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5372734c-00e3-40c0-91f9-17a8e5610698-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5372734c-00e3-40c0-91f9-17a8e5610698\") " pod="openstack/nova-metadata-0" Dec 05 13:08:48.401818 master-0 kubenswrapper[29936]: I1205 13:08:48.401270 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whvml\" (UniqueName: \"kubernetes.io/projected/5372734c-00e3-40c0-91f9-17a8e5610698-kube-api-access-whvml\") pod \"nova-metadata-0\" (UID: \"5372734c-00e3-40c0-91f9-17a8e5610698\") " pod="openstack/nova-metadata-0" Dec 05 13:08:48.504798 master-0 kubenswrapper[29936]: I1205 13:08:48.504726 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bfe120-ee71-4c4f-8433-24164a6b82ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"52bfe120-ee71-4c4f-8433-24164a6b82ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:08:48.504959 master-0 kubenswrapper[29936]: I1205 13:08:48.504884 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bfe120-ee71-4c4f-8433-24164a6b82ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"52bfe120-ee71-4c4f-8433-24164a6b82ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:08:48.505169 master-0 kubenswrapper[29936]: I1205 13:08:48.505137 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf5v5\" (UniqueName: \"kubernetes.io/projected/52bfe120-ee71-4c4f-8433-24164a6b82ac-kube-api-access-gf5v5\") pod \"nova-cell1-novncproxy-0\" (UID: \"52bfe120-ee71-4c4f-8433-24164a6b82ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:08:48.506022 master-0 kubenswrapper[29936]: I1205 13:08:48.505982 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6944864c6f-cr675"] Dec 05 13:08:48.510189 master-0 kubenswrapper[29936]: I1205 13:08:48.510108 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bfe120-ee71-4c4f-8433-24164a6b82ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"52bfe120-ee71-4c4f-8433-24164a6b82ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:08:48.513365 master-0 kubenswrapper[29936]: I1205 13:08:48.513304 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bfe120-ee71-4c4f-8433-24164a6b82ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"52bfe120-ee71-4c4f-8433-24164a6b82ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:08:48.514069 master-0 kubenswrapper[29936]: I1205 13:08:48.514028 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.526067 master-0 kubenswrapper[29936]: I1205 13:08:48.525960 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6944864c6f-cr675"] Dec 05 13:08:48.615885 master-0 kubenswrapper[29936]: I1205 13:08:48.610833 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvkph\" (UniqueName: \"kubernetes.io/projected/df45b5b9-1e68-471c-96e9-fa2906275144-kube-api-access-jvkph\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.615885 master-0 kubenswrapper[29936]: I1205 13:08:48.611044 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-dns-swift-storage-0\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.615885 master-0 kubenswrapper[29936]: I1205 13:08:48.611527 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-ovsdbserver-sb\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.615885 master-0 kubenswrapper[29936]: I1205 13:08:48.612143 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-ovsdbserver-nb\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.615885 master-0 kubenswrapper[29936]: I1205 13:08:48.612219 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-config\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.615885 master-0 kubenswrapper[29936]: I1205 13:08:48.612426 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-dns-svc\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.620584 master-0 kubenswrapper[29936]: I1205 13:08:48.620542 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 13:08:48.716488 master-0 kubenswrapper[29936]: I1205 13:08:48.716383 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-dns-swift-storage-0\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.716855 master-0 kubenswrapper[29936]: I1205 13:08:48.716636 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-ovsdbserver-sb\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.717004 master-0 kubenswrapper[29936]: I1205 13:08:48.716919 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-ovsdbserver-nb\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.717072 master-0 kubenswrapper[29936]: I1205 13:08:48.717053 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-config\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.717285 master-0 kubenswrapper[29936]: I1205 13:08:48.717256 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-dns-svc\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.717601 master-0 kubenswrapper[29936]: I1205 13:08:48.717564 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvkph\" (UniqueName: \"kubernetes.io/projected/df45b5b9-1e68-471c-96e9-fa2906275144-kube-api-access-jvkph\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.717913 master-0 kubenswrapper[29936]: I1205 13:08:48.717776 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-dns-swift-storage-0\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.717983 master-0 kubenswrapper[29936]: I1205 13:08:48.717944 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-ovsdbserver-nb\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.718036 master-0 kubenswrapper[29936]: I1205 13:08:48.717975 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.718211 master-0 kubenswrapper[29936]: I1205 13:08:48.718140 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-config\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:48.718589 master-0 kubenswrapper[29936]: I1205 13:08:48.718535 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-dns-svc\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:51.382612 master-0 kubenswrapper[29936]: I1205 13:08:51.382415 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf5v5\" (UniqueName: \"kubernetes.io/projected/52bfe120-ee71-4c4f-8433-24164a6b82ac-kube-api-access-gf5v5\") pod \"nova-cell1-novncproxy-0\" (UID: \"52bfe120-ee71-4c4f-8433-24164a6b82ac\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:08:51.388166 master-0 kubenswrapper[29936]: I1205 13:08:51.388024 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvkph\" (UniqueName: \"kubernetes.io/projected/df45b5b9-1e68-471c-96e9-fa2906275144-kube-api-access-jvkph\") pod \"dnsmasq-dns-6944864c6f-cr675\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:51.434975 master-0 kubenswrapper[29936]: W1205 13:08:51.434856 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef0a5bb3_6b34_4c39_8cd6_5925f8dd3877.slice/crio-955e0c0b3e7ead22a5ffa6f986a5b9a136e3b738611a75a77b0c7b0e4e8432d3 WatchSource:0}: Error finding container 955e0c0b3e7ead22a5ffa6f986a5b9a136e3b738611a75a77b0c7b0e4e8432d3: Status 404 returned error can't find the container with id 955e0c0b3e7ead22a5ffa6f986a5b9a136e3b738611a75a77b0c7b0e4e8432d3 Dec 05 13:08:51.475490 master-0 kubenswrapper[29936]: I1205 13:08:51.475435 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Dec 05 13:08:51.502462 master-0 kubenswrapper[29936]: I1205 13:08:51.502381 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bmgf2"] Dec 05 13:08:51.520570 master-0 kubenswrapper[29936]: I1205 13:08:51.520410 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 13:08:51.540656 master-0 kubenswrapper[29936]: I1205 13:08:51.540507 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:08:51.557562 master-0 kubenswrapper[29936]: I1205 13:08:51.557474 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 13:08:51.654987 master-0 kubenswrapper[29936]: I1205 13:08:51.654898 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:51.659142 master-0 kubenswrapper[29936]: I1205 13:08:51.659056 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:08:52.037960 master-0 kubenswrapper[29936]: I1205 13:08:52.037864 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xjmxk"] Dec 05 13:08:52.041620 master-0 kubenswrapper[29936]: I1205 13:08:52.041556 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xjmxk" Dec 05 13:08:52.058756 master-0 kubenswrapper[29936]: I1205 13:08:52.057938 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 05 13:08:52.059100 master-0 kubenswrapper[29936]: I1205 13:08:52.058912 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 13:08:52.072738 master-0 kubenswrapper[29936]: I1205 13:08:52.072664 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cg54\" (UniqueName: \"kubernetes.io/projected/fbe106b9-254b-49b9-98dd-b1cfd4697210-kube-api-access-5cg54\") pod \"nova-cell1-conductor-db-sync-xjmxk\" (UID: \"fbe106b9-254b-49b9-98dd-b1cfd4697210\") " pod="openstack/nova-cell1-conductor-db-sync-xjmxk" Dec 05 13:08:52.073246 master-0 kubenswrapper[29936]: I1205 13:08:52.073215 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-config-data\") pod \"nova-cell1-conductor-db-sync-xjmxk\" (UID: \"fbe106b9-254b-49b9-98dd-b1cfd4697210\") " pod="openstack/nova-cell1-conductor-db-sync-xjmxk" Dec 05 13:08:52.073638 master-0 kubenswrapper[29936]: I1205 13:08:52.073610 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-scripts\") pod \"nova-cell1-conductor-db-sync-xjmxk\" (UID: \"fbe106b9-254b-49b9-98dd-b1cfd4697210\") " pod="openstack/nova-cell1-conductor-db-sync-xjmxk" Dec 05 13:08:52.074319 master-0 kubenswrapper[29936]: I1205 13:08:52.074291 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xjmxk\" (UID: \"fbe106b9-254b-49b9-98dd-b1cfd4697210\") " pod="openstack/nova-cell1-conductor-db-sync-xjmxk" Dec 05 13:08:52.101862 master-0 kubenswrapper[29936]: I1205 13:08:52.101751 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xjmxk"] Dec 05 13:08:52.177471 master-0 kubenswrapper[29936]: I1205 13:08:52.177201 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xjmxk\" (UID: \"fbe106b9-254b-49b9-98dd-b1cfd4697210\") " pod="openstack/nova-cell1-conductor-db-sync-xjmxk" Dec 05 13:08:52.177825 master-0 kubenswrapper[29936]: I1205 13:08:52.177775 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cg54\" (UniqueName: \"kubernetes.io/projected/fbe106b9-254b-49b9-98dd-b1cfd4697210-kube-api-access-5cg54\") pod \"nova-cell1-conductor-db-sync-xjmxk\" (UID: \"fbe106b9-254b-49b9-98dd-b1cfd4697210\") " 
pod="openstack/nova-cell1-conductor-db-sync-xjmxk" Dec 05 13:08:52.178467 master-0 kubenswrapper[29936]: I1205 13:08:52.178429 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-config-data\") pod \"nova-cell1-conductor-db-sync-xjmxk\" (UID: \"fbe106b9-254b-49b9-98dd-b1cfd4697210\") " pod="openstack/nova-cell1-conductor-db-sync-xjmxk" Dec 05 13:08:52.179503 master-0 kubenswrapper[29936]: I1205 13:08:52.179463 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-scripts\") pod \"nova-cell1-conductor-db-sync-xjmxk\" (UID: \"fbe106b9-254b-49b9-98dd-b1cfd4697210\") " pod="openstack/nova-cell1-conductor-db-sync-xjmxk" Dec 05 13:08:52.184807 master-0 kubenswrapper[29936]: I1205 13:08:52.184323 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xjmxk\" (UID: \"fbe106b9-254b-49b9-98dd-b1cfd4697210\") " pod="openstack/nova-cell1-conductor-db-sync-xjmxk" Dec 05 13:08:52.186109 master-0 kubenswrapper[29936]: I1205 13:08:52.186066 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-scripts\") pod \"nova-cell1-conductor-db-sync-xjmxk\" (UID: \"fbe106b9-254b-49b9-98dd-b1cfd4697210\") " pod="openstack/nova-cell1-conductor-db-sync-xjmxk" Dec 05 13:08:52.189075 master-0 kubenswrapper[29936]: I1205 13:08:52.188675 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-config-data\") pod \"nova-cell1-conductor-db-sync-xjmxk\" (UID: \"fbe106b9-254b-49b9-98dd-b1cfd4697210\") " pod="openstack/nova-cell1-conductor-db-sync-xjmxk" Dec 05 13:08:52.210363 master-0 kubenswrapper[29936]: I1205 13:08:52.209073 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cg54\" (UniqueName: \"kubernetes.io/projected/fbe106b9-254b-49b9-98dd-b1cfd4697210-kube-api-access-5cg54\") pod \"nova-cell1-conductor-db-sync-xjmxk\" (UID: \"fbe106b9-254b-49b9-98dd-b1cfd4697210\") " pod="openstack/nova-cell1-conductor-db-sync-xjmxk" Dec 05 13:08:52.331698 master-0 kubenswrapper[29936]: I1205 13:08:52.331618 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 13:08:52.413420 master-0 kubenswrapper[29936]: I1205 13:08:52.408108 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xjmxk" Dec 05 13:08:52.492520 master-0 kubenswrapper[29936]: I1205 13:08:52.492322 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5372734c-00e3-40c0-91f9-17a8e5610698","Type":"ContainerStarted","Data":"2748cfa93380d82f82b53391fc1b4a4372dfae719c69f058f9f8cc45b1fc949f"} Dec 05 13:08:52.514490 master-0 kubenswrapper[29936]: I1205 13:08:52.514397 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bmgf2" event={"ID":"3ebb85b0-cfc1-4d5b-af96-01798e97b809","Type":"ContainerStarted","Data":"f0c072fdde7ca0593dd0758f937ba470cfca639b521f4b345cda700caba5363f"} Dec 05 13:08:52.514490 master-0 kubenswrapper[29936]: I1205 13:08:52.514477 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bmgf2" event={"ID":"3ebb85b0-cfc1-4d5b-af96-01798e97b809","Type":"ContainerStarted","Data":"b0585a20de316c259fdf9a78f642da9af696457462c82ba030a2d0ec3cd4511c"} Dec 05 13:08:52.529243 master-0 kubenswrapper[29936]: I1205 13:08:52.521168 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"52bfe120-ee71-4c4f-8433-24164a6b82ac","Type":"ContainerStarted","Data":"312a536f951e34a53e9251245d2b9c3e7f0d478a1418aa51af960bf4609660e1"} Dec 05 13:08:52.542072 master-0 kubenswrapper[29936]: I1205 13:08:52.541982 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6944864c6f-cr675"] Dec 05 13:08:52.542870 master-0 kubenswrapper[29936]: I1205 13:08:52.542793 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4c0c6457-0252-4483-b3ee-8b530b0214e7","Type":"ContainerStarted","Data":"486c590b8af509a26704a578d4913af0ebd2a9e0b9f4db8fc96d8df0e5e95974"} Dec 05 13:08:52.546024 master-0 kubenswrapper[29936]: I1205 13:08:52.545959 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7b1d92e5-7252-4099-9a92-d6566b00de62","Type":"ContainerStarted","Data":"98ff58e7b36f59843fdb2cc54682bb5a85d73a828673ae0b3796ec510308566a"} Dec 05 13:08:52.548579 master-0 kubenswrapper[29936]: I1205 13:08:52.548527 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877","Type":"ContainerStarted","Data":"955e0c0b3e7ead22a5ffa6f986a5b9a136e3b738611a75a77b0c7b0e4e8432d3"} Dec 05 13:08:52.567652 master-0 kubenswrapper[29936]: I1205 13:08:52.567530 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-bmgf2" podStartSLOduration=5.567502618 podStartE2EDuration="5.567502618s" podCreationTimestamp="2025-12-05 13:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:08:52.557850746 +0000 UTC m=+1129.689930427" watchObservedRunningTime="2025-12-05 13:08:52.567502618 +0000 UTC m=+1129.699582299" Dec 05 13:08:53.141714 master-0 kubenswrapper[29936]: I1205 13:08:53.141613 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xjmxk"] Dec 05 13:08:53.580101 master-0 kubenswrapper[29936]: I1205 13:08:53.580020 29936 generic.go:334] "Generic (PLEG): container finished" podID="df45b5b9-1e68-471c-96e9-fa2906275144" containerID="4206e9f62b5ab6541d42114fce6a197dc040814569439ea421700d3f527a2b9a" 
exitCode=0 Dec 05 13:08:53.581537 master-0 kubenswrapper[29936]: I1205 13:08:53.580171 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6944864c6f-cr675" event={"ID":"df45b5b9-1e68-471c-96e9-fa2906275144","Type":"ContainerDied","Data":"4206e9f62b5ab6541d42114fce6a197dc040814569439ea421700d3f527a2b9a"} Dec 05 13:08:53.581537 master-0 kubenswrapper[29936]: I1205 13:08:53.580301 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6944864c6f-cr675" event={"ID":"df45b5b9-1e68-471c-96e9-fa2906275144","Type":"ContainerStarted","Data":"271191a48769cb61a9c31e06d907a409076fb0f5b5f664d7eb63c91049cc951e"} Dec 05 13:08:54.598199 master-0 kubenswrapper[29936]: I1205 13:08:54.598114 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xjmxk" event={"ID":"fbe106b9-254b-49b9-98dd-b1cfd4697210","Type":"ContainerStarted","Data":"decb8ef6a7d224154069a5c1982c8b015696578619dcc95dfcaad52785c919f0"} Dec 05 13:08:56.772540 master-0 kubenswrapper[29936]: I1205 13:08:56.772444 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 13:08:56.892233 master-0 kubenswrapper[29936]: I1205 13:08:56.892144 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 13:08:58.660740 master-0 kubenswrapper[29936]: I1205 13:08:58.660537 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5372734c-00e3-40c0-91f9-17a8e5610698","Type":"ContainerStarted","Data":"c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d"} Dec 05 13:08:58.665770 master-0 kubenswrapper[29936]: I1205 13:08:58.665692 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6944864c6f-cr675" event={"ID":"df45b5b9-1e68-471c-96e9-fa2906275144","Type":"ContainerStarted","Data":"2172bd5a894f6f100bfd24ed6f31529fb361c734f767f3fa24c59434d96d049b"} Dec 05 13:08:58.665939 master-0 kubenswrapper[29936]: I1205 13:08:58.665893 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:08:58.671157 master-0 kubenswrapper[29936]: I1205 13:08:58.671112 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"52bfe120-ee71-4c4f-8433-24164a6b82ac","Type":"ContainerStarted","Data":"f08ee6a3887bb302aaf03c84ecde4513dba0d4a7419c6d2cb9fd4985d468c9d6"} Dec 05 13:08:58.676276 master-0 kubenswrapper[29936]: I1205 13:08:58.673404 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="52bfe120-ee71-4c4f-8433-24164a6b82ac" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f08ee6a3887bb302aaf03c84ecde4513dba0d4a7419c6d2cb9fd4985d468c9d6" gracePeriod=30 Dec 05 13:08:58.682018 master-0 kubenswrapper[29936]: I1205 13:08:58.681936 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4c0c6457-0252-4483-b3ee-8b530b0214e7","Type":"ContainerStarted","Data":"785a5abf9b97c3c345ea949b399d529292d39dd516259e925009bf9a72a61aba"} Dec 05 13:08:58.687636 master-0 kubenswrapper[29936]: I1205 13:08:58.687517 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7b1d92e5-7252-4099-9a92-d6566b00de62","Type":"ContainerStarted","Data":"b2bc87d1e838b0a856c301d89855dedbfb67c389ff4c8fcd9bc50fc5c62a2f96"} Dec 05 13:08:58.718243 master-0 
kubenswrapper[29936]: I1205 13:08:58.701130 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xjmxk" event={"ID":"fbe106b9-254b-49b9-98dd-b1cfd4697210","Type":"ContainerStarted","Data":"660f5b16f43c93494f587bab2753f20ba1d3fa89ccf34c2349bf9f0651d2848e"} Dec 05 13:08:58.718243 master-0 kubenswrapper[29936]: I1205 13:08:58.711370 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6944864c6f-cr675" podStartSLOduration=10.711342291 podStartE2EDuration="10.711342291s" podCreationTimestamp="2025-12-05 13:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:08:58.69346068 +0000 UTC m=+1135.825540381" watchObservedRunningTime="2025-12-05 13:08:58.711342291 +0000 UTC m=+1135.843421972" Dec 05 13:08:58.768512 master-0 kubenswrapper[29936]: I1205 13:08:58.768303 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=7.196447949 podStartE2EDuration="11.768258075s" podCreationTimestamp="2025-12-05 13:08:47 +0000 UTC" firstStartedPulling="2025-12-05 13:08:52.345468059 +0000 UTC m=+1129.477547740" lastFinishedPulling="2025-12-05 13:08:56.917278005 +0000 UTC m=+1134.049357866" observedRunningTime="2025-12-05 13:08:58.717932891 +0000 UTC m=+1135.850012572" watchObservedRunningTime="2025-12-05 13:08:58.768258075 +0000 UTC m=+1135.900337746" Dec 05 13:08:58.791529 master-0 kubenswrapper[29936]: I1205 13:08:58.791187 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=6.392419411 podStartE2EDuration="11.791152398s" podCreationTimestamp="2025-12-05 13:08:47 +0000 UTC" firstStartedPulling="2025-12-05 13:08:51.46848621 +0000 UTC m=+1128.600565881" lastFinishedPulling="2025-12-05 13:08:56.867219187 +0000 UTC m=+1133.999298868" observedRunningTime="2025-12-05 13:08:58.750843395 +0000 UTC m=+1135.882923076" watchObservedRunningTime="2025-12-05 13:08:58.791152398 +0000 UTC m=+1135.923232079" Dec 05 13:08:58.800198 master-0 kubenswrapper[29936]: I1205 13:08:58.797379 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-xjmxk" podStartSLOduration=7.797357428 podStartE2EDuration="7.797357428s" podCreationTimestamp="2025-12-05 13:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:08:58.788429933 +0000 UTC m=+1135.920509614" watchObservedRunningTime="2025-12-05 13:08:58.797357428 +0000 UTC m=+1135.929437109" Dec 05 13:09:00.767673 master-0 kubenswrapper[29936]: I1205 13:09:00.766274 29936 generic.go:334] "Generic (PLEG): container finished" podID="3ebb85b0-cfc1-4d5b-af96-01798e97b809" containerID="f0c072fdde7ca0593dd0758f937ba470cfca639b521f4b345cda700caba5363f" exitCode=0 Dec 05 13:09:00.767673 master-0 kubenswrapper[29936]: I1205 13:09:00.766414 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bmgf2" event={"ID":"3ebb85b0-cfc1-4d5b-af96-01798e97b809","Type":"ContainerDied","Data":"f0c072fdde7ca0593dd0758f937ba470cfca639b521f4b345cda700caba5363f"} Dec 05 13:09:01.661561 master-0 kubenswrapper[29936]: I1205 13:09:01.660870 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:03.307461 
master-0 kubenswrapper[29936]: I1205 13:09:03.307369 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 13:09:03.840067 master-0 kubenswrapper[29936]: I1205 13:09:03.839997 29936 generic.go:334] "Generic (PLEG): container finished" podID="cc2221d6-014e-4bd4-962b-24512ebf84e8" containerID="86971eb55cc065fffdc859914c0e627f6872d1fbeee9c64f5dd72cbea2af43e6" exitCode=0 Dec 05 13:09:03.840067 master-0 kubenswrapper[29936]: I1205 13:09:03.840081 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"cc2221d6-014e-4bd4-962b-24512ebf84e8","Type":"ContainerDied","Data":"86971eb55cc065fffdc859914c0e627f6872d1fbeee9c64f5dd72cbea2af43e6"} Dec 05 13:09:04.670664 master-0 kubenswrapper[29936]: I1205 13:09:04.670597 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bmgf2" Dec 05 13:09:04.783059 master-0 kubenswrapper[29936]: I1205 13:09:04.782798 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgg9b\" (UniqueName: \"kubernetes.io/projected/3ebb85b0-cfc1-4d5b-af96-01798e97b809-kube-api-access-pgg9b\") pod \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\" (UID: \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\") " Dec 05 13:09:04.783059 master-0 kubenswrapper[29936]: I1205 13:09:04.782929 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-combined-ca-bundle\") pod \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\" (UID: \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\") " Dec 05 13:09:04.783716 master-0 kubenswrapper[29936]: I1205 13:09:04.783294 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-scripts\") pod \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\" (UID: \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\") " Dec 05 13:09:04.783716 master-0 kubenswrapper[29936]: I1205 13:09:04.783468 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-config-data\") pod \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\" (UID: \"3ebb85b0-cfc1-4d5b-af96-01798e97b809\") " Dec 05 13:09:04.789937 master-0 kubenswrapper[29936]: I1205 13:09:04.789376 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-scripts" (OuterVolumeSpecName: "scripts") pod "3ebb85b0-cfc1-4d5b-af96-01798e97b809" (UID: "3ebb85b0-cfc1-4d5b-af96-01798e97b809"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:04.789937 master-0 kubenswrapper[29936]: I1205 13:09:04.789799 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ebb85b0-cfc1-4d5b-af96-01798e97b809-kube-api-access-pgg9b" (OuterVolumeSpecName: "kube-api-access-pgg9b") pod "3ebb85b0-cfc1-4d5b-af96-01798e97b809" (UID: "3ebb85b0-cfc1-4d5b-af96-01798e97b809"). InnerVolumeSpecName "kube-api-access-pgg9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:09:04.821661 master-0 kubenswrapper[29936]: I1205 13:09:04.821572 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ebb85b0-cfc1-4d5b-af96-01798e97b809" (UID: "3ebb85b0-cfc1-4d5b-af96-01798e97b809"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:04.847595 master-0 kubenswrapper[29936]: I1205 13:09:04.847040 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-config-data" (OuterVolumeSpecName: "config-data") pod "3ebb85b0-cfc1-4d5b-af96-01798e97b809" (UID: "3ebb85b0-cfc1-4d5b-af96-01798e97b809"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:04.856906 master-0 kubenswrapper[29936]: I1205 13:09:04.856803 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bmgf2" event={"ID":"3ebb85b0-cfc1-4d5b-af96-01798e97b809","Type":"ContainerDied","Data":"b0585a20de316c259fdf9a78f642da9af696457462c82ba030a2d0ec3cd4511c"} Dec 05 13:09:04.856906 master-0 kubenswrapper[29936]: I1205 13:09:04.856905 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0585a20de316c259fdf9a78f642da9af696457462c82ba030a2d0ec3cd4511c" Dec 05 13:09:04.857069 master-0 kubenswrapper[29936]: I1205 13:09:04.857010 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bmgf2" Dec 05 13:09:04.887809 master-0 kubenswrapper[29936]: I1205 13:09:04.887669 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgg9b\" (UniqueName: \"kubernetes.io/projected/3ebb85b0-cfc1-4d5b-af96-01798e97b809-kube-api-access-pgg9b\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:04.887809 master-0 kubenswrapper[29936]: I1205 13:09:04.887729 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:04.888291 master-0 kubenswrapper[29936]: I1205 13:09:04.888102 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:04.888291 master-0 kubenswrapper[29936]: I1205 13:09:04.888125 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebb85b0-cfc1-4d5b-af96-01798e97b809-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:05.725042 master-0 kubenswrapper[29936]: I1205 13:09:05.724959 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ftqr6"] Dec 05 13:09:05.728860 master-0 kubenswrapper[29936]: E1205 13:09:05.728792 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ebb85b0-cfc1-4d5b-af96-01798e97b809" containerName="nova-manage" Dec 05 13:09:05.728860 master-0 kubenswrapper[29936]: I1205 13:09:05.728845 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ebb85b0-cfc1-4d5b-af96-01798e97b809" containerName="nova-manage" Dec 05 13:09:05.729541 master-0 kubenswrapper[29936]: I1205 13:09:05.729498 29936 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3ebb85b0-cfc1-4d5b-af96-01798e97b809" containerName="nova-manage" Dec 05 13:09:05.732505 master-0 kubenswrapper[29936]: I1205 13:09:05.732466 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:05.760287 master-0 kubenswrapper[29936]: I1205 13:09:05.760010 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ftqr6"] Dec 05 13:09:05.816251 master-0 kubenswrapper[29936]: I1205 13:09:05.816113 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c28142-8016-496e-a893-4d1d220c0b6a-utilities\") pod \"certified-operators-ftqr6\" (UID: \"69c28142-8016-496e-a893-4d1d220c0b6a\") " pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:05.817955 master-0 kubenswrapper[29936]: I1205 13:09:05.817002 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c28142-8016-496e-a893-4d1d220c0b6a-catalog-content\") pod \"certified-operators-ftqr6\" (UID: \"69c28142-8016-496e-a893-4d1d220c0b6a\") " pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:05.817955 master-0 kubenswrapper[29936]: I1205 13:09:05.817201 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rshkk\" (UniqueName: \"kubernetes.io/projected/69c28142-8016-496e-a893-4d1d220c0b6a-kube-api-access-rshkk\") pod \"certified-operators-ftqr6\" (UID: \"69c28142-8016-496e-a893-4d1d220c0b6a\") " pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:05.880469 master-0 kubenswrapper[29936]: I1205 13:09:05.880368 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"ef0a5bb3-6b34-4c39-8cd6-5925f8dd3877","Type":"ContainerStarted","Data":"348c2892c975ddecf02be9e75b3a927dbfba1b821f422a71332ea3f06c7ac76d"} Dec 05 13:09:05.886975 master-0 kubenswrapper[29936]: I1205 13:09:05.886730 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5372734c-00e3-40c0-91f9-17a8e5610698" containerName="nova-metadata-log" containerID="cri-o://c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d" gracePeriod=30 Dec 05 13:09:05.886975 master-0 kubenswrapper[29936]: I1205 13:09:05.886886 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5372734c-00e3-40c0-91f9-17a8e5610698","Type":"ContainerStarted","Data":"84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765"} Dec 05 13:09:05.887521 master-0 kubenswrapper[29936]: I1205 13:09:05.886976 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5372734c-00e3-40c0-91f9-17a8e5610698" containerName="nova-metadata-metadata" containerID="cri-o://84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765" gracePeriod=30 Dec 05 13:09:05.904628 master-0 kubenswrapper[29936]: I1205 13:09:05.904485 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7b1d92e5-7252-4099-9a92-d6566b00de62","Type":"ContainerStarted","Data":"52b2b5998d7a0b8043ed53461295e95a79b18daae0c225afa6846d94c4b9d6a8"} Dec 05 13:09:05.920820 master-0 
kubenswrapper[29936]: I1205 13:09:05.920744 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c28142-8016-496e-a893-4d1d220c0b6a-utilities\") pod \"certified-operators-ftqr6\" (UID: \"69c28142-8016-496e-a893-4d1d220c0b6a\") " pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:05.921174 master-0 kubenswrapper[29936]: I1205 13:09:05.920969 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c28142-8016-496e-a893-4d1d220c0b6a-catalog-content\") pod \"certified-operators-ftqr6\" (UID: \"69c28142-8016-496e-a893-4d1d220c0b6a\") " pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:05.921174 master-0 kubenswrapper[29936]: I1205 13:09:05.921024 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rshkk\" (UniqueName: \"kubernetes.io/projected/69c28142-8016-496e-a893-4d1d220c0b6a-kube-api-access-rshkk\") pod \"certified-operators-ftqr6\" (UID: \"69c28142-8016-496e-a893-4d1d220c0b6a\") " pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:05.922790 master-0 kubenswrapper[29936]: I1205 13:09:05.922714 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c28142-8016-496e-a893-4d1d220c0b6a-catalog-content\") pod \"certified-operators-ftqr6\" (UID: \"69c28142-8016-496e-a893-4d1d220c0b6a\") " pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:05.922861 master-0 kubenswrapper[29936]: I1205 13:09:05.922791 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c28142-8016-496e-a893-4d1d220c0b6a-utilities\") pod \"certified-operators-ftqr6\" (UID: \"69c28142-8016-496e-a893-4d1d220c0b6a\") " pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:05.927379 master-0 kubenswrapper[29936]: I1205 13:09:05.927330 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"cc2221d6-014e-4bd4-962b-24512ebf84e8","Type":"ContainerStarted","Data":"a0a404fb7343cc23714795be7e4d43e4c15798f97c7f2f68b1d35b9dfb15fd66"} Dec 05 13:09:05.927459 master-0 kubenswrapper[29936]: I1205 13:09:05.927385 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"cc2221d6-014e-4bd4-962b-24512ebf84e8","Type":"ContainerStarted","Data":"3b4228038cba1e92ae60508f1477963aa4a35fe06e566e95c282724de7426f8b"} Dec 05 13:09:06.038134 master-0 kubenswrapper[29936]: I1205 13:09:06.037746 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rshkk\" (UniqueName: \"kubernetes.io/projected/69c28142-8016-496e-a893-4d1d220c0b6a-kube-api-access-rshkk\") pod \"certified-operators-ftqr6\" (UID: \"69c28142-8016-496e-a893-4d1d220c0b6a\") " pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:06.062524 master-0 kubenswrapper[29936]: I1205 13:09:06.062404 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-compute-ironic-compute-0" podStartSLOduration=5.807250826 podStartE2EDuration="19.062375645s" podCreationTimestamp="2025-12-05 13:08:47 +0000 UTC" firstStartedPulling="2025-12-05 13:08:51.449043751 +0000 UTC m=+1128.581123432" lastFinishedPulling="2025-12-05 13:09:04.70416857 +0000 UTC m=+1141.836248251" 
observedRunningTime="2025-12-05 13:09:06.047407283 +0000 UTC m=+1143.179486964" watchObservedRunningTime="2025-12-05 13:09:06.062375645 +0000 UTC m=+1143.194455316" Dec 05 13:09:06.066362 master-0 kubenswrapper[29936]: I1205 13:09:06.062585 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:06.106225 master-0 kubenswrapper[29936]: I1205 13:09:06.105645 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:06.159155 master-0 kubenswrapper[29936]: I1205 13:09:06.158585 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 13:09:06.159155 master-0 kubenswrapper[29936]: I1205 13:09:06.158960 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4c0c6457-0252-4483-b3ee-8b530b0214e7" containerName="nova-scheduler-scheduler" containerID="cri-o://785a5abf9b97c3c345ea949b399d529292d39dd516259e925009bf9a72a61aba" gracePeriod=30 Dec 05 13:09:06.163377 master-0 kubenswrapper[29936]: I1205 13:09:06.162846 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=13.711307807 podStartE2EDuration="19.162812439s" podCreationTimestamp="2025-12-05 13:08:47 +0000 UTC" firstStartedPulling="2025-12-05 13:08:51.467715241 +0000 UTC m=+1128.599794942" lastFinishedPulling="2025-12-05 13:08:56.919219893 +0000 UTC m=+1134.051299574" observedRunningTime="2025-12-05 13:09:06.120853246 +0000 UTC m=+1143.252932927" watchObservedRunningTime="2025-12-05 13:09:06.162812439 +0000 UTC m=+1143.294892130" Dec 05 13:09:06.184227 master-0 kubenswrapper[29936]: I1205 13:09:06.183281 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=13.788449289999999 podStartE2EDuration="19.183248592s" podCreationTimestamp="2025-12-05 13:08:47 +0000 UTC" firstStartedPulling="2025-12-05 13:08:51.473559602 +0000 UTC m=+1128.605639293" lastFinishedPulling="2025-12-05 13:08:56.868358914 +0000 UTC m=+1134.000438595" observedRunningTime="2025-12-05 13:09:06.167164765 +0000 UTC m=+1143.299244456" watchObservedRunningTime="2025-12-05 13:09:06.183248592 +0000 UTC m=+1143.315328283" Dec 05 13:09:06.675256 master-0 kubenswrapper[29936]: I1205 13:09:06.671512 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:09:06.905363 master-0 kubenswrapper[29936]: I1205 13:09:06.905298 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 13:09:06.954375 master-0 kubenswrapper[29936]: I1205 13:09:06.951801 29936 generic.go:334] "Generic (PLEG): container finished" podID="5372734c-00e3-40c0-91f9-17a8e5610698" containerID="84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765" exitCode=0 Dec 05 13:09:06.954375 master-0 kubenswrapper[29936]: I1205 13:09:06.951847 29936 generic.go:334] "Generic (PLEG): container finished" podID="5372734c-00e3-40c0-91f9-17a8e5610698" containerID="c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d" exitCode=143 Dec 05 13:09:06.954375 master-0 kubenswrapper[29936]: I1205 13:09:06.953237 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 13:09:06.954375 master-0 kubenswrapper[29936]: I1205 13:09:06.953442 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5372734c-00e3-40c0-91f9-17a8e5610698","Type":"ContainerDied","Data":"84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765"} Dec 05 13:09:06.954375 master-0 kubenswrapper[29936]: I1205 13:09:06.953475 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5372734c-00e3-40c0-91f9-17a8e5610698","Type":"ContainerDied","Data":"c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d"} Dec 05 13:09:06.954375 master-0 kubenswrapper[29936]: I1205 13:09:06.953486 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5372734c-00e3-40c0-91f9-17a8e5610698","Type":"ContainerDied","Data":"2748cfa93380d82f82b53391fc1b4a4372dfae719c69f058f9f8cc45b1fc949f"} Dec 05 13:09:06.954375 master-0 kubenswrapper[29936]: I1205 13:09:06.953510 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 05 13:09:06.955468 master-0 kubenswrapper[29936]: I1205 13:09:06.955371 29936 scope.go:117] "RemoveContainer" containerID="84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765" Dec 05 13:09:06.987062 master-0 kubenswrapper[29936]: I1205 13:09:06.986864 29936 scope.go:117] "RemoveContainer" containerID="c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d" Dec 05 13:09:07.006576 master-0 kubenswrapper[29936]: I1205 13:09:07.006350 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5372734c-00e3-40c0-91f9-17a8e5610698-logs\") pod \"5372734c-00e3-40c0-91f9-17a8e5610698\" (UID: \"5372734c-00e3-40c0-91f9-17a8e5610698\") " Dec 05 13:09:07.006576 master-0 kubenswrapper[29936]: I1205 13:09:07.006491 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whvml\" (UniqueName: \"kubernetes.io/projected/5372734c-00e3-40c0-91f9-17a8e5610698-kube-api-access-whvml\") pod \"5372734c-00e3-40c0-91f9-17a8e5610698\" (UID: \"5372734c-00e3-40c0-91f9-17a8e5610698\") " Dec 05 13:09:07.006968 master-0 kubenswrapper[29936]: I1205 13:09:07.006766 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5372734c-00e3-40c0-91f9-17a8e5610698-combined-ca-bundle\") pod \"5372734c-00e3-40c0-91f9-17a8e5610698\" (UID: \"5372734c-00e3-40c0-91f9-17a8e5610698\") " Dec 05 13:09:07.010511 master-0 kubenswrapper[29936]: I1205 13:09:07.010483 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5372734c-00e3-40c0-91f9-17a8e5610698-config-data\") pod \"5372734c-00e3-40c0-91f9-17a8e5610698\" (UID: \"5372734c-00e3-40c0-91f9-17a8e5610698\") " Dec 05 13:09:07.012581 master-0 kubenswrapper[29936]: I1205 13:09:07.012304 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5372734c-00e3-40c0-91f9-17a8e5610698-logs" (OuterVolumeSpecName: "logs") pod "5372734c-00e3-40c0-91f9-17a8e5610698" (UID: "5372734c-00e3-40c0-91f9-17a8e5610698"). InnerVolumeSpecName "logs". 
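
The pod_startup_latency_tracker records above report two figures: podStartE2EDuration is the time from podCreationTimestamp to the moment the running pod was observed, and podStartSLOduration additionally subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling). The trailing m=+ values are Go monotonic-clock readings and do not change the arithmetic. The numbers in the nova-cell1-compute-ironic-compute-0 record are consistent with that; the following minimal Go sketch reuses its timestamps as a reader-side check and is not kubelet code.

package main

import (
	"fmt"
	"time"
)

// Layout matching how these records print wall-clock times; the trailing
// "m=+..." monotonic reading is not part of the parsed string.
const layout = "2006-01-02 15:04:05 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values copied from the nova-cell1-compute-ironic-compute-0 record above.
	created := mustParse("2025-12-05 13:08:47 +0000 UTC")
	firstPull := mustParse("2025-12-05 13:08:51.449043751 +0000 UTC")
	lastPull := mustParse("2025-12-05 13:09:04.70416857 +0000 UTC")
	observed := mustParse("2025-12-05 13:09:06.062375645 +0000 UTC") // watchObservedRunningTime

	e2e := observed.Sub(created)         // podStartE2EDuration: creation to observed running
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: E2E minus the image-pull window

	fmt.Println("podStartE2EDuration:", e2e) // 19.062375645s
	fmt.Println("podStartSLOduration:", slo) // 5.807250826s (logged above without the unit)
}
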
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:09:07.012581 master-0 kubenswrapper[29936]: I1205 13:09:07.012530 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5372734c-00e3-40c0-91f9-17a8e5610698-kube-api-access-whvml" (OuterVolumeSpecName: "kube-api-access-whvml") pod "5372734c-00e3-40c0-91f9-17a8e5610698" (UID: "5372734c-00e3-40c0-91f9-17a8e5610698"). InnerVolumeSpecName "kube-api-access-whvml". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:09:07.013809 master-0 kubenswrapper[29936]: I1205 13:09:07.013651 29936 scope.go:117] "RemoveContainer" containerID="84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765" Dec 05 13:09:07.014906 master-0 kubenswrapper[29936]: E1205 13:09:07.014835 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765\": container with ID starting with 84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765 not found: ID does not exist" containerID="84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765" Dec 05 13:09:07.014979 master-0 kubenswrapper[29936]: I1205 13:09:07.014922 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765"} err="failed to get container status \"84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765\": rpc error: code = NotFound desc = could not find container \"84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765\": container with ID starting with 84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765 not found: ID does not exist" Dec 05 13:09:07.014979 master-0 kubenswrapper[29936]: I1205 13:09:07.014969 29936 scope.go:117] "RemoveContainer" containerID="c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d" Dec 05 13:09:07.016484 master-0 kubenswrapper[29936]: E1205 13:09:07.016419 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d\": container with ID starting with c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d not found: ID does not exist" containerID="c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d" Dec 05 13:09:07.018443 master-0 kubenswrapper[29936]: I1205 13:09:07.018357 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d"} err="failed to get container status \"c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d\": rpc error: code = NotFound desc = could not find container \"c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d\": container with ID starting with c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d not found: ID does not exist" Dec 05 13:09:07.018539 master-0 kubenswrapper[29936]: I1205 13:09:07.018446 29936 scope.go:117] "RemoveContainer" containerID="84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765" Dec 05 13:09:07.018884 master-0 kubenswrapper[29936]: I1205 13:09:07.018840 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765"} err="failed to get 
container status \"84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765\": rpc error: code = NotFound desc = could not find container \"84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765\": container with ID starting with 84b932b2cf529268e96ad53c60c90c3902b228e9eff4187206aca0abb4651765 not found: ID does not exist" Dec 05 13:09:07.018884 master-0 kubenswrapper[29936]: I1205 13:09:07.018874 29936 scope.go:117] "RemoveContainer" containerID="c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d" Dec 05 13:09:07.019413 master-0 kubenswrapper[29936]: I1205 13:09:07.019330 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d"} err="failed to get container status \"c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d\": rpc error: code = NotFound desc = could not find container \"c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d\": container with ID starting with c28537a19f1effe6bb9e2481d515979ed22731ec5e129a658872dabc41cc921d not found: ID does not exist" Dec 05 13:09:07.023953 master-0 kubenswrapper[29936]: I1205 13:09:07.023917 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-compute-ironic-compute-0" Dec 05 13:09:07.024129 master-0 kubenswrapper[29936]: I1205 13:09:07.024003 29936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5372734c-00e3-40c0-91f9-17a8e5610698-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:07.024129 master-0 kubenswrapper[29936]: I1205 13:09:07.024053 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whvml\" (UniqueName: \"kubernetes.io/projected/5372734c-00e3-40c0-91f9-17a8e5610698-kube-api-access-whvml\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:07.047637 master-0 kubenswrapper[29936]: I1205 13:09:07.047558 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5372734c-00e3-40c0-91f9-17a8e5610698-config-data" (OuterVolumeSpecName: "config-data") pod "5372734c-00e3-40c0-91f9-17a8e5610698" (UID: "5372734c-00e3-40c0-91f9-17a8e5610698"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:07.052608 master-0 kubenswrapper[29936]: I1205 13:09:07.052553 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5372734c-00e3-40c0-91f9-17a8e5610698-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5372734c-00e3-40c0-91f9-17a8e5610698" (UID: "5372734c-00e3-40c0-91f9-17a8e5610698"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:07.127827 master-0 kubenswrapper[29936]: I1205 13:09:07.127672 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5372734c-00e3-40c0-91f9-17a8e5610698-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:07.127983 master-0 kubenswrapper[29936]: I1205 13:09:07.127828 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5372734c-00e3-40c0-91f9-17a8e5610698-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:07.392542 master-0 kubenswrapper[29936]: I1205 13:09:07.392453 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ftqr6"] Dec 05 13:09:07.502239 master-0 kubenswrapper[29936]: I1205 13:09:07.500486 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75dc9f44fc-hmphp"] Dec 05 13:09:07.502239 master-0 kubenswrapper[29936]: I1205 13:09:07.500996 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" podUID="6046b9e1-6a97-47a9-a88b-772270e4cdaf" containerName="dnsmasq-dns" containerID="cri-o://378b55a682e4564b915ba4bef2cf9a5345a60e9b2084e132467dc39d8de40969" gracePeriod=10 Dec 05 13:09:07.562416 master-0 kubenswrapper[29936]: I1205 13:09:07.560300 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 13:09:07.590322 master-0 kubenswrapper[29936]: I1205 13:09:07.590235 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 13:09:07.614619 master-0 kubenswrapper[29936]: I1205 13:09:07.614548 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 13:09:07.615399 master-0 kubenswrapper[29936]: E1205 13:09:07.615366 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5372734c-00e3-40c0-91f9-17a8e5610698" containerName="nova-metadata-log" Dec 05 13:09:07.615399 master-0 kubenswrapper[29936]: I1205 13:09:07.615396 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5372734c-00e3-40c0-91f9-17a8e5610698" containerName="nova-metadata-log" Dec 05 13:09:07.615490 master-0 kubenswrapper[29936]: E1205 13:09:07.615433 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5372734c-00e3-40c0-91f9-17a8e5610698" containerName="nova-metadata-metadata" Dec 05 13:09:07.615490 master-0 kubenswrapper[29936]: I1205 13:09:07.615442 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5372734c-00e3-40c0-91f9-17a8e5610698" containerName="nova-metadata-metadata" Dec 05 13:09:07.615785 master-0 kubenswrapper[29936]: I1205 13:09:07.615757 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5372734c-00e3-40c0-91f9-17a8e5610698" containerName="nova-metadata-metadata" Dec 05 13:09:07.615833 master-0 kubenswrapper[29936]: I1205 13:09:07.615797 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5372734c-00e3-40c0-91f9-17a8e5610698" containerName="nova-metadata-log" Dec 05 13:09:07.617759 master-0 kubenswrapper[29936]: I1205 13:09:07.617725 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 13:09:07.623676 master-0 kubenswrapper[29936]: I1205 13:09:07.620445 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 13:09:07.631216 master-0 kubenswrapper[29936]: I1205 13:09:07.626733 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 13:09:07.644605 master-0 kubenswrapper[29936]: I1205 13:09:07.644515 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 13:09:07.713367 master-0 kubenswrapper[29936]: I1205 13:09:07.706464 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " pod="openstack/nova-metadata-0" Dec 05 13:09:07.713367 master-0 kubenswrapper[29936]: I1205 13:09:07.706652 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcp7k\" (UniqueName: \"kubernetes.io/projected/884426a8-a4eb-4387-a9ab-546f0844b879-kube-api-access-qcp7k\") pod \"nova-metadata-0\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " pod="openstack/nova-metadata-0" Dec 05 13:09:07.713367 master-0 kubenswrapper[29936]: I1205 13:09:07.706737 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/884426a8-a4eb-4387-a9ab-546f0844b879-logs\") pod \"nova-metadata-0\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " pod="openstack/nova-metadata-0" Dec 05 13:09:07.713367 master-0 kubenswrapper[29936]: I1205 13:09:07.706890 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " pod="openstack/nova-metadata-0" Dec 05 13:09:07.713367 master-0 kubenswrapper[29936]: I1205 13:09:07.707003 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-config-data\") pod \"nova-metadata-0\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " pod="openstack/nova-metadata-0" Dec 05 13:09:07.810092 master-0 kubenswrapper[29936]: I1205 13:09:07.809757 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " pod="openstack/nova-metadata-0" Dec 05 13:09:07.810092 master-0 kubenswrapper[29936]: I1205 13:09:07.809881 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcp7k\" (UniqueName: \"kubernetes.io/projected/884426a8-a4eb-4387-a9ab-546f0844b879-kube-api-access-qcp7k\") pod \"nova-metadata-0\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " pod="openstack/nova-metadata-0" Dec 05 13:09:07.810092 master-0 kubenswrapper[29936]: I1205 13:09:07.809911 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/884426a8-a4eb-4387-a9ab-546f0844b879-logs\") pod \"nova-metadata-0\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " pod="openstack/nova-metadata-0" Dec 05 13:09:07.810092 master-0 kubenswrapper[29936]: I1205 13:09:07.809956 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " pod="openstack/nova-metadata-0" Dec 05 13:09:07.810092 master-0 kubenswrapper[29936]: I1205 13:09:07.810051 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-config-data\") pod \"nova-metadata-0\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " pod="openstack/nova-metadata-0" Dec 05 13:09:07.824907 master-0 kubenswrapper[29936]: I1205 13:09:07.815384 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-config-data\") pod \"nova-metadata-0\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " pod="openstack/nova-metadata-0" Dec 05 13:09:07.824907 master-0 kubenswrapper[29936]: I1205 13:09:07.815718 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/884426a8-a4eb-4387-a9ab-546f0844b879-logs\") pod \"nova-metadata-0\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " pod="openstack/nova-metadata-0" Dec 05 13:09:07.824907 master-0 kubenswrapper[29936]: I1205 13:09:07.817201 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " pod="openstack/nova-metadata-0" Dec 05 13:09:07.824907 master-0 kubenswrapper[29936]: I1205 13:09:07.819935 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " pod="openstack/nova-metadata-0" Dec 05 13:09:07.993611 master-0 kubenswrapper[29936]: I1205 13:09:07.993536 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftqr6" event={"ID":"69c28142-8016-496e-a893-4d1d220c0b6a","Type":"ContainerStarted","Data":"49ea2905a8ab69103d335a06b1c0c861d01623fbba3d90de60f3c2e96c6d885e"} Dec 05 13:09:07.998663 master-0 kubenswrapper[29936]: I1205 13:09:07.998636 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"cc2221d6-014e-4bd4-962b-24512ebf84e8","Type":"ContainerStarted","Data":"5698270b628540068290e7a1494b15db8236ebe2b3087ce588bc38b1da59a15a"} Dec 05 13:09:07.999104 master-0 kubenswrapper[29936]: I1205 13:09:07.999083 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Dec 05 13:09:07.999220 master-0 kubenswrapper[29936]: I1205 13:09:07.999207 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Dec 05 13:09:07.999301 master-0 kubenswrapper[29936]: I1205 13:09:07.999290 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/ironic-conductor-0" Dec 05 13:09:07.999365 master-0 kubenswrapper[29936]: I1205 13:09:07.999288 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcp7k\" (UniqueName: \"kubernetes.io/projected/884426a8-a4eb-4387-a9ab-546f0844b879-kube-api-access-qcp7k\") pod \"nova-metadata-0\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " pod="openstack/nova-metadata-0" Dec 05 13:09:08.004966 master-0 kubenswrapper[29936]: I1205 13:09:08.004885 29936 generic.go:334] "Generic (PLEG): container finished" podID="6046b9e1-6a97-47a9-a88b-772270e4cdaf" containerID="378b55a682e4564b915ba4bef2cf9a5345a60e9b2084e132467dc39d8de40969" exitCode=0 Dec 05 13:09:08.005120 master-0 kubenswrapper[29936]: I1205 13:09:08.004973 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" event={"ID":"6046b9e1-6a97-47a9-a88b-772270e4cdaf","Type":"ContainerDied","Data":"378b55a682e4564b915ba4bef2cf9a5345a60e9b2084e132467dc39d8de40969"} Dec 05 13:09:08.005414 master-0 kubenswrapper[29936]: I1205 13:09:08.005374 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7b1d92e5-7252-4099-9a92-d6566b00de62" containerName="nova-api-log" containerID="cri-o://b2bc87d1e838b0a856c301d89855dedbfb67c389ff4c8fcd9bc50fc5c62a2f96" gracePeriod=30 Dec 05 13:09:08.005587 master-0 kubenswrapper[29936]: I1205 13:09:08.005425 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7b1d92e5-7252-4099-9a92-d6566b00de62" containerName="nova-api-api" containerID="cri-o://52b2b5998d7a0b8043ed53461295e95a79b18daae0c225afa6846d94c4b9d6a8" gracePeriod=30 Dec 05 13:09:08.261996 master-0 kubenswrapper[29936]: I1205 13:09:08.261869 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 13:09:08.300404 master-0 kubenswrapper[29936]: I1205 13:09:08.300312 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" podUID="6046b9e1-6a97-47a9-a88b-772270e4cdaf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.246:5353: connect: connection refused" Dec 05 13:09:09.072476 master-0 kubenswrapper[29936]: I1205 13:09:09.072382 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" event={"ID":"6046b9e1-6a97-47a9-a88b-772270e4cdaf","Type":"ContainerDied","Data":"7a6134451c8a5ae282bf4f802f121960d92f7de6d7f390a2b9dd979fe745fc1a"} Dec 05 13:09:09.072476 master-0 kubenswrapper[29936]: I1205 13:09:09.072434 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a6134451c8a5ae282bf4f802f121960d92f7de6d7f390a2b9dd979fe745fc1a" Dec 05 13:09:09.075458 master-0 kubenswrapper[29936]: I1205 13:09:09.075409 29936 generic.go:334] "Generic (PLEG): container finished" podID="7b1d92e5-7252-4099-9a92-d6566b00de62" containerID="52b2b5998d7a0b8043ed53461295e95a79b18daae0c225afa6846d94c4b9d6a8" exitCode=0 Dec 05 13:09:09.075458 master-0 kubenswrapper[29936]: I1205 13:09:09.075433 29936 generic.go:334] "Generic (PLEG): container finished" podID="7b1d92e5-7252-4099-9a92-d6566b00de62" containerID="b2bc87d1e838b0a856c301d89855dedbfb67c389ff4c8fcd9bc50fc5c62a2f96" exitCode=143 Dec 05 13:09:09.075697 master-0 kubenswrapper[29936]: I1205 13:09:09.075663 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7b1d92e5-7252-4099-9a92-d6566b00de62","Type":"ContainerDied","Data":"52b2b5998d7a0b8043ed53461295e95a79b18daae0c225afa6846d94c4b9d6a8"} Dec 05 13:09:09.075829 master-0 kubenswrapper[29936]: I1205 13:09:09.075795 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7b1d92e5-7252-4099-9a92-d6566b00de62","Type":"ContainerDied","Data":"b2bc87d1e838b0a856c301d89855dedbfb67c389ff4c8fcd9bc50fc5c62a2f96"} Dec 05 13:09:09.105481 master-0 kubenswrapper[29936]: I1205 13:09:09.095180 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftqr6" event={"ID":"69c28142-8016-496e-a893-4d1d220c0b6a","Type":"ContainerStarted","Data":"992acd814e774c11a94b32cefaa33473403429302cc39315bb6dcd259d238835"} Dec 05 13:09:09.188934 master-0 kubenswrapper[29936]: I1205 13:09:09.188795 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:09:09.191245 master-0 kubenswrapper[29936]: W1205 13:09:09.189743 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod884426a8_a4eb_4387_a9ab_546f0844b879.slice/crio-257e24abc28a345e468534fbf1d2e3f6df66ddb767d21ae3840ab5db87e95d8a WatchSource:0}: Error finding container 257e24abc28a345e468534fbf1d2e3f6df66ddb767d21ae3840ab5db87e95d8a: Status 404 returned error can't find the container with id 257e24abc28a345e468534fbf1d2e3f6df66ddb767d21ae3840ab5db87e95d8a Dec 05 13:09:09.227124 master-0 kubenswrapper[29936]: I1205 13:09:09.227018 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5372734c-00e3-40c0-91f9-17a8e5610698" path="/var/lib/kubelet/pods/5372734c-00e3-40c0-91f9-17a8e5610698/volumes" Dec 05 13:09:09.265942 master-0 kubenswrapper[29936]: I1205 13:09:09.261953 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-conductor-0" podStartSLOduration=64.780963044 podStartE2EDuration="2m5.261931328s" podCreationTimestamp="2025-12-05 13:07:04 +0000 UTC" firstStartedPulling="2025-12-05 13:07:14.793910246 +0000 UTC m=+1031.925989927" lastFinishedPulling="2025-12-05 13:08:15.27487853 +0000 UTC m=+1092.406958211" observedRunningTime="2025-12-05 13:09:09.259787735 +0000 UTC m=+1146.391867436" watchObservedRunningTime="2025-12-05 13:09:09.261931328 +0000 UTC m=+1146.394011019" Dec 05 13:09:09.274675 master-0 kubenswrapper[29936]: I1205 13:09:09.274597 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 13:09:09.313938 master-0 kubenswrapper[29936]: I1205 13:09:09.313023 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-992bj\" (UniqueName: \"kubernetes.io/projected/6046b9e1-6a97-47a9-a88b-772270e4cdaf-kube-api-access-992bj\") pod \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " Dec 05 13:09:09.313938 master-0 kubenswrapper[29936]: I1205 13:09:09.313342 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-ovsdbserver-sb\") pod \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " Dec 05 13:09:09.313938 master-0 kubenswrapper[29936]: I1205 13:09:09.313458 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-config\") pod \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " Dec 05 13:09:09.313938 master-0 kubenswrapper[29936]: I1205 13:09:09.313527 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-dns-svc\") pod \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " Dec 05 13:09:09.313938 master-0 kubenswrapper[29936]: I1205 13:09:09.313620 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-dns-swift-storage-0\") pod \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " Dec 05 13:09:09.313938 master-0 
kubenswrapper[29936]: I1205 13:09:09.313773 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-ovsdbserver-nb\") pod \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\" (UID: \"6046b9e1-6a97-47a9-a88b-772270e4cdaf\") " Dec 05 13:09:09.339384 master-0 kubenswrapper[29936]: I1205 13:09:09.336016 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6046b9e1-6a97-47a9-a88b-772270e4cdaf-kube-api-access-992bj" (OuterVolumeSpecName: "kube-api-access-992bj") pod "6046b9e1-6a97-47a9-a88b-772270e4cdaf" (UID: "6046b9e1-6a97-47a9-a88b-772270e4cdaf"). InnerVolumeSpecName "kube-api-access-992bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:09:09.418815 master-0 kubenswrapper[29936]: I1205 13:09:09.418733 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-992bj\" (UniqueName: \"kubernetes.io/projected/6046b9e1-6a97-47a9-a88b-772270e4cdaf-kube-api-access-992bj\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:09.516142 master-0 kubenswrapper[29936]: I1205 13:09:09.515921 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-config" (OuterVolumeSpecName: "config") pod "6046b9e1-6a97-47a9-a88b-772270e4cdaf" (UID: "6046b9e1-6a97-47a9-a88b-772270e4cdaf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:09:09.520339 master-0 kubenswrapper[29936]: I1205 13:09:09.518941 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6046b9e1-6a97-47a9-a88b-772270e4cdaf" (UID: "6046b9e1-6a97-47a9-a88b-772270e4cdaf"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:09:09.524573 master-0 kubenswrapper[29936]: I1205 13:09:09.521687 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:09.524573 master-0 kubenswrapper[29936]: I1205 13:09:09.521716 29936 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:09.527743 master-0 kubenswrapper[29936]: I1205 13:09:09.527704 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6046b9e1-6a97-47a9-a88b-772270e4cdaf" (UID: "6046b9e1-6a97-47a9-a88b-772270e4cdaf"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:09:09.626895 master-0 kubenswrapper[29936]: I1205 13:09:09.626104 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:09.627714 master-0 kubenswrapper[29936]: I1205 13:09:09.627634 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6046b9e1-6a97-47a9-a88b-772270e4cdaf" (UID: "6046b9e1-6a97-47a9-a88b-772270e4cdaf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:09:09.629169 master-0 kubenswrapper[29936]: I1205 13:09:09.629139 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6046b9e1-6a97-47a9-a88b-772270e4cdaf" (UID: "6046b9e1-6a97-47a9-a88b-772270e4cdaf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:09:09.738710 master-0 kubenswrapper[29936]: I1205 13:09:09.731699 29936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:09.738710 master-0 kubenswrapper[29936]: I1205 13:09:09.731765 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6046b9e1-6a97-47a9-a88b-772270e4cdaf-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:09.871263 master-0 kubenswrapper[29936]: I1205 13:09:09.871066 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 13:09:09.936721 master-0 kubenswrapper[29936]: I1205 13:09:09.936623 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b1d92e5-7252-4099-9a92-d6566b00de62-combined-ca-bundle\") pod \"7b1d92e5-7252-4099-9a92-d6566b00de62\" (UID: \"7b1d92e5-7252-4099-9a92-d6566b00de62\") " Dec 05 13:09:09.937106 master-0 kubenswrapper[29936]: I1205 13:09:09.937064 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85pnm\" (UniqueName: \"kubernetes.io/projected/7b1d92e5-7252-4099-9a92-d6566b00de62-kube-api-access-85pnm\") pod \"7b1d92e5-7252-4099-9a92-d6566b00de62\" (UID: \"7b1d92e5-7252-4099-9a92-d6566b00de62\") " Dec 05 13:09:09.937163 master-0 kubenswrapper[29936]: I1205 13:09:09.937128 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b1d92e5-7252-4099-9a92-d6566b00de62-logs\") pod \"7b1d92e5-7252-4099-9a92-d6566b00de62\" (UID: \"7b1d92e5-7252-4099-9a92-d6566b00de62\") " Dec 05 13:09:09.937567 master-0 kubenswrapper[29936]: I1205 13:09:09.937500 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b1d92e5-7252-4099-9a92-d6566b00de62-config-data\") pod \"7b1d92e5-7252-4099-9a92-d6566b00de62\" (UID: \"7b1d92e5-7252-4099-9a92-d6566b00de62\") " Dec 05 13:09:09.937816 master-0 kubenswrapper[29936]: I1205 13:09:09.937767 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b1d92e5-7252-4099-9a92-d6566b00de62-logs" (OuterVolumeSpecName: "logs") pod "7b1d92e5-7252-4099-9a92-d6566b00de62" (UID: "7b1d92e5-7252-4099-9a92-d6566b00de62"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:09:09.939215 master-0 kubenswrapper[29936]: I1205 13:09:09.939157 29936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b1d92e5-7252-4099-9a92-d6566b00de62-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:09.941326 master-0 kubenswrapper[29936]: I1205 13:09:09.941252 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b1d92e5-7252-4099-9a92-d6566b00de62-kube-api-access-85pnm" (OuterVolumeSpecName: "kube-api-access-85pnm") pod "7b1d92e5-7252-4099-9a92-d6566b00de62" (UID: "7b1d92e5-7252-4099-9a92-d6566b00de62"). InnerVolumeSpecName "kube-api-access-85pnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:09:09.970195 master-0 kubenswrapper[29936]: I1205 13:09:09.970092 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b1d92e5-7252-4099-9a92-d6566b00de62-config-data" (OuterVolumeSpecName: "config-data") pod "7b1d92e5-7252-4099-9a92-d6566b00de62" (UID: "7b1d92e5-7252-4099-9a92-d6566b00de62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:09.972077 master-0 kubenswrapper[29936]: I1205 13:09:09.971987 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b1d92e5-7252-4099-9a92-d6566b00de62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b1d92e5-7252-4099-9a92-d6566b00de62" (UID: "7b1d92e5-7252-4099-9a92-d6566b00de62"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:10.042289 master-0 kubenswrapper[29936]: I1205 13:09:10.042059 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b1d92e5-7252-4099-9a92-d6566b00de62-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:10.042289 master-0 kubenswrapper[29936]: I1205 13:09:10.042115 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b1d92e5-7252-4099-9a92-d6566b00de62-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:10.042289 master-0 kubenswrapper[29936]: I1205 13:09:10.042133 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85pnm\" (UniqueName: \"kubernetes.io/projected/7b1d92e5-7252-4099-9a92-d6566b00de62-kube-api-access-85pnm\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:10.109831 master-0 kubenswrapper[29936]: I1205 13:09:10.109705 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"884426a8-a4eb-4387-a9ab-546f0844b879","Type":"ContainerStarted","Data":"257e24abc28a345e468534fbf1d2e3f6df66ddb767d21ae3840ab5db87e95d8a"} Dec 05 13:09:10.113416 master-0 kubenswrapper[29936]: I1205 13:09:10.113336 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7b1d92e5-7252-4099-9a92-d6566b00de62","Type":"ContainerDied","Data":"98ff58e7b36f59843fdb2cc54682bb5a85d73a828673ae0b3796ec510308566a"} Dec 05 13:09:10.113557 master-0 kubenswrapper[29936]: I1205 13:09:10.113425 29936 scope.go:117] "RemoveContainer" containerID="52b2b5998d7a0b8043ed53461295e95a79b18daae0c225afa6846d94c4b9d6a8" Dec 05 13:09:10.113557 master-0 kubenswrapper[29936]: I1205 13:09:10.113519 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 13:09:10.117864 master-0 kubenswrapper[29936]: I1205 13:09:10.117790 29936 generic.go:334] "Generic (PLEG): container finished" podID="69c28142-8016-496e-a893-4d1d220c0b6a" containerID="992acd814e774c11a94b32cefaa33473403429302cc39315bb6dcd259d238835" exitCode=0 Dec 05 13:09:10.118012 master-0 kubenswrapper[29936]: I1205 13:09:10.117912 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftqr6" event={"ID":"69c28142-8016-496e-a893-4d1d220c0b6a","Type":"ContainerDied","Data":"992acd814e774c11a94b32cefaa33473403429302cc39315bb6dcd259d238835"} Dec 05 13:09:10.118097 master-0 kubenswrapper[29936]: I1205 13:09:10.117952 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75dc9f44fc-hmphp" Dec 05 13:09:10.150608 master-0 kubenswrapper[29936]: I1205 13:09:10.150517 29936 scope.go:117] "RemoveContainer" containerID="b2bc87d1e838b0a856c301d89855dedbfb67c389ff4c8fcd9bc50fc5c62a2f96" Dec 05 13:09:10.779574 master-0 kubenswrapper[29936]: I1205 13:09:10.779465 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75dc9f44fc-hmphp"] Dec 05 13:09:10.795543 master-0 kubenswrapper[29936]: I1205 13:09:10.795458 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75dc9f44fc-hmphp"] Dec 05 13:09:10.819102 master-0 kubenswrapper[29936]: I1205 13:09:10.812353 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:10.828265 master-0 kubenswrapper[29936]: I1205 13:09:10.826692 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:10.876214 master-0 kubenswrapper[29936]: I1205 13:09:10.874677 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:10.876214 master-0 kubenswrapper[29936]: E1205 13:09:10.875808 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6046b9e1-6a97-47a9-a88b-772270e4cdaf" containerName="init" Dec 05 13:09:10.876214 master-0 kubenswrapper[29936]: I1205 13:09:10.875836 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="6046b9e1-6a97-47a9-a88b-772270e4cdaf" containerName="init" Dec 05 13:09:10.876214 master-0 kubenswrapper[29936]: E1205 13:09:10.875862 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b1d92e5-7252-4099-9a92-d6566b00de62" containerName="nova-api-api" Dec 05 13:09:10.876214 master-0 kubenswrapper[29936]: I1205 13:09:10.875873 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b1d92e5-7252-4099-9a92-d6566b00de62" containerName="nova-api-api" Dec 05 13:09:10.876214 master-0 kubenswrapper[29936]: E1205 13:09:10.875902 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b1d92e5-7252-4099-9a92-d6566b00de62" containerName="nova-api-log" Dec 05 13:09:10.876214 master-0 kubenswrapper[29936]: I1205 13:09:10.875912 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b1d92e5-7252-4099-9a92-d6566b00de62" containerName="nova-api-log" Dec 05 13:09:10.876214 master-0 kubenswrapper[29936]: E1205 13:09:10.875950 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6046b9e1-6a97-47a9-a88b-772270e4cdaf" containerName="dnsmasq-dns" Dec 05 13:09:10.876214 master-0 kubenswrapper[29936]: I1205 13:09:10.875961 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="6046b9e1-6a97-47a9-a88b-772270e4cdaf" containerName="dnsmasq-dns" Dec 05 13:09:10.876807 master-0 kubenswrapper[29936]: I1205 13:09:10.876406 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b1d92e5-7252-4099-9a92-d6566b00de62" containerName="nova-api-log" Dec 05 13:09:10.876807 master-0 kubenswrapper[29936]: I1205 13:09:10.876474 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b1d92e5-7252-4099-9a92-d6566b00de62" containerName="nova-api-api" Dec 05 13:09:10.876807 master-0 kubenswrapper[29936]: I1205 13:09:10.876509 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="6046b9e1-6a97-47a9-a88b-772270e4cdaf" containerName="dnsmasq-dns" Dec 05 13:09:10.879816 master-0 kubenswrapper[29936]: I1205 13:09:10.878654 29936 util.go:30] "No sandbox for pod can be found. 
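
The DELETE -> REMOVE -> ADD sequences above show nova-metadata-0 and nova-api-0 (and, further down, nova-scheduler-0) being recreated under the same names with new UIDs, while cpu_manager and memory_manager drop the per-UID state of the old pods ("RemoveStaleState"). A hedged client-go sketch that watches one of those names and prints the UID of each incarnation; the kubeconfig path is illustrative.

package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // illustrative path
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// Watch the single pod name; on recreation the ADDED event carries a new UID.
	w, err := cs.CoreV1().Pods("openstack").Watch(context.TODO(), metav1.ListOptions{
		FieldSelector: "metadata.name=nova-api-0",
	})
	if err != nil {
		panic(err)
	}
	defer w.Stop()

	for ev := range w.ResultChan() {
		pod, ok := ev.Object.(*corev1.Pod)
		if !ok {
			continue
		}
		fmt.Printf("%s %s uid=%s\n", ev.Type, pod.Name, pod.UID)
	}
}
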
Need to start a new one" pod="openstack/nova-api-0" Dec 05 13:09:10.887125 master-0 kubenswrapper[29936]: I1205 13:09:10.886856 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 13:09:10.897974 master-0 kubenswrapper[29936]: I1205 13:09:10.897873 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:10.969556 master-0 kubenswrapper[29936]: I1205 13:09:10.969482 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b060638c-8b2a-4dbb-95e5-e1d8839e9783-config-data\") pod \"nova-api-0\" (UID: \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\") " pod="openstack/nova-api-0" Dec 05 13:09:10.969889 master-0 kubenswrapper[29936]: I1205 13:09:10.969682 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b060638c-8b2a-4dbb-95e5-e1d8839e9783-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\") " pod="openstack/nova-api-0" Dec 05 13:09:10.970391 master-0 kubenswrapper[29936]: I1205 13:09:10.970341 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vmft\" (UniqueName: \"kubernetes.io/projected/b060638c-8b2a-4dbb-95e5-e1d8839e9783-kube-api-access-6vmft\") pod \"nova-api-0\" (UID: \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\") " pod="openstack/nova-api-0" Dec 05 13:09:10.970695 master-0 kubenswrapper[29936]: I1205 13:09:10.970651 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b060638c-8b2a-4dbb-95e5-e1d8839e9783-logs\") pod \"nova-api-0\" (UID: \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\") " pod="openstack/nova-api-0" Dec 05 13:09:11.073638 master-0 kubenswrapper[29936]: I1205 13:09:11.073532 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b060638c-8b2a-4dbb-95e5-e1d8839e9783-logs\") pod \"nova-api-0\" (UID: \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\") " pod="openstack/nova-api-0" Dec 05 13:09:11.074056 master-0 kubenswrapper[29936]: I1205 13:09:11.073717 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b060638c-8b2a-4dbb-95e5-e1d8839e9783-config-data\") pod \"nova-api-0\" (UID: \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\") " pod="openstack/nova-api-0" Dec 05 13:09:11.074056 master-0 kubenswrapper[29936]: I1205 13:09:11.073760 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b060638c-8b2a-4dbb-95e5-e1d8839e9783-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\") " pod="openstack/nova-api-0" Dec 05 13:09:11.074056 master-0 kubenswrapper[29936]: I1205 13:09:11.073972 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vmft\" (UniqueName: \"kubernetes.io/projected/b060638c-8b2a-4dbb-95e5-e1d8839e9783-kube-api-access-6vmft\") pod \"nova-api-0\" (UID: \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\") " pod="openstack/nova-api-0" Dec 05 13:09:11.074503 master-0 kubenswrapper[29936]: I1205 13:09:11.074411 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b060638c-8b2a-4dbb-95e5-e1d8839e9783-logs\") pod \"nova-api-0\" (UID: \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\") " pod="openstack/nova-api-0" Dec 05 13:09:11.079943 master-0 kubenswrapper[29936]: I1205 13:09:11.079791 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b060638c-8b2a-4dbb-95e5-e1d8839e9783-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\") " pod="openstack/nova-api-0" Dec 05 13:09:11.082424 master-0 kubenswrapper[29936]: I1205 13:09:11.082358 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b060638c-8b2a-4dbb-95e5-e1d8839e9783-config-data\") pod \"nova-api-0\" (UID: \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\") " pod="openstack/nova-api-0" Dec 05 13:09:11.132484 master-0 kubenswrapper[29936]: I1205 13:09:11.096528 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vmft\" (UniqueName: \"kubernetes.io/projected/b060638c-8b2a-4dbb-95e5-e1d8839e9783-kube-api-access-6vmft\") pod \"nova-api-0\" (UID: \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\") " pod="openstack/nova-api-0" Dec 05 13:09:11.165656 master-0 kubenswrapper[29936]: I1205 13:09:11.165326 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"884426a8-a4eb-4387-a9ab-546f0844b879","Type":"ContainerStarted","Data":"611789a47c38f1fdc0977a80a51e6e86b0b6450cebc4e4bb4f961783beec151a"} Dec 05 13:09:11.165656 master-0 kubenswrapper[29936]: I1205 13:09:11.165477 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"884426a8-a4eb-4387-a9ab-546f0844b879","Type":"ContainerStarted","Data":"edaaf002e5a85715d16c07bb939e87620e427ec80567ef22989739166fd43448"} Dec 05 13:09:11.192899 master-0 kubenswrapper[29936]: I1205 13:09:11.192821 29936 generic.go:334] "Generic (PLEG): container finished" podID="4c0c6457-0252-4483-b3ee-8b530b0214e7" containerID="785a5abf9b97c3c345ea949b399d529292d39dd516259e925009bf9a72a61aba" exitCode=0 Dec 05 13:09:11.234364 master-0 kubenswrapper[29936]: I1205 13:09:11.234265 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6046b9e1-6a97-47a9-a88b-772270e4cdaf" path="/var/lib/kubelet/pods/6046b9e1-6a97-47a9-a88b-772270e4cdaf/volumes" Dec 05 13:09:11.235218 master-0 kubenswrapper[29936]: I1205 13:09:11.235160 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b1d92e5-7252-4099-9a92-d6566b00de62" path="/var/lib/kubelet/pods/7b1d92e5-7252-4099-9a92-d6566b00de62/volumes" Dec 05 13:09:11.243352 master-0 kubenswrapper[29936]: I1205 13:09:11.243266 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4c0c6457-0252-4483-b3ee-8b530b0214e7","Type":"ContainerDied","Data":"785a5abf9b97c3c345ea949b399d529292d39dd516259e925009bf9a72a61aba"} Dec 05 13:09:11.244499 master-0 kubenswrapper[29936]: I1205 13:09:11.243371 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftqr6" event={"ID":"69c28142-8016-496e-a893-4d1d220c0b6a","Type":"ContainerStarted","Data":"affddd2860d80e0ce7b18928534da37a1673240efc200f32164d40161fc30343"} Dec 05 13:09:11.267250 master-0 kubenswrapper[29936]: I1205 13:09:11.267133 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 13:09:11.289839 master-0 kubenswrapper[29936]: I1205 13:09:11.289733 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.289707845 podStartE2EDuration="4.289707845s" podCreationTimestamp="2025-12-05 13:09:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:09:11.223841405 +0000 UTC m=+1148.355921086" watchObservedRunningTime="2025-12-05 13:09:11.289707845 +0000 UTC m=+1148.421787526" Dec 05 13:09:11.502137 master-0 kubenswrapper[29936]: I1205 13:09:11.501874 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 13:09:11.607989 master-0 kubenswrapper[29936]: I1205 13:09:11.605209 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0c6457-0252-4483-b3ee-8b530b0214e7-config-data\") pod \"4c0c6457-0252-4483-b3ee-8b530b0214e7\" (UID: \"4c0c6457-0252-4483-b3ee-8b530b0214e7\") " Dec 05 13:09:11.607989 master-0 kubenswrapper[29936]: I1205 13:09:11.606155 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0c6457-0252-4483-b3ee-8b530b0214e7-combined-ca-bundle\") pod \"4c0c6457-0252-4483-b3ee-8b530b0214e7\" (UID: \"4c0c6457-0252-4483-b3ee-8b530b0214e7\") " Dec 05 13:09:11.607989 master-0 kubenswrapper[29936]: I1205 13:09:11.606300 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6hpt\" (UniqueName: \"kubernetes.io/projected/4c0c6457-0252-4483-b3ee-8b530b0214e7-kube-api-access-s6hpt\") pod \"4c0c6457-0252-4483-b3ee-8b530b0214e7\" (UID: \"4c0c6457-0252-4483-b3ee-8b530b0214e7\") " Dec 05 13:09:11.625923 master-0 kubenswrapper[29936]: I1205 13:09:11.625809 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c0c6457-0252-4483-b3ee-8b530b0214e7-kube-api-access-s6hpt" (OuterVolumeSpecName: "kube-api-access-s6hpt") pod "4c0c6457-0252-4483-b3ee-8b530b0214e7" (UID: "4c0c6457-0252-4483-b3ee-8b530b0214e7"). InnerVolumeSpecName "kube-api-access-s6hpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:09:11.651757 master-0 kubenswrapper[29936]: I1205 13:09:11.651505 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c0c6457-0252-4483-b3ee-8b530b0214e7-config-data" (OuterVolumeSpecName: "config-data") pod "4c0c6457-0252-4483-b3ee-8b530b0214e7" (UID: "4c0c6457-0252-4483-b3ee-8b530b0214e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:11.674527 master-0 kubenswrapper[29936]: I1205 13:09:11.674417 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c0c6457-0252-4483-b3ee-8b530b0214e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c0c6457-0252-4483-b3ee-8b530b0214e7" (UID: "4c0c6457-0252-4483-b3ee-8b530b0214e7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:11.753675 master-0 kubenswrapper[29936]: I1205 13:09:11.753567 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c0c6457-0252-4483-b3ee-8b530b0214e7-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:11.753675 master-0 kubenswrapper[29936]: I1205 13:09:11.753658 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6hpt\" (UniqueName: \"kubernetes.io/projected/4c0c6457-0252-4483-b3ee-8b530b0214e7-kube-api-access-s6hpt\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:11.753675 master-0 kubenswrapper[29936]: I1205 13:09:11.753676 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c0c6457-0252-4483-b3ee-8b530b0214e7-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:11.808131 master-0 kubenswrapper[29936]: W1205 13:09:11.808042 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb060638c_8b2a_4dbb_95e5_e1d8839e9783.slice/crio-a6e938d5757db536032878c2d7e7c9dd41b8220058a971082de1afc80cccdaca WatchSource:0}: Error finding container a6e938d5757db536032878c2d7e7c9dd41b8220058a971082de1afc80cccdaca: Status 404 returned error can't find the container with id a6e938d5757db536032878c2d7e7c9dd41b8220058a971082de1afc80cccdaca Dec 05 13:09:11.823985 master-0 kubenswrapper[29936]: I1205 13:09:11.823905 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:12.268473 master-0 kubenswrapper[29936]: I1205 13:09:12.268372 29936 generic.go:334] "Generic (PLEG): container finished" podID="69c28142-8016-496e-a893-4d1d220c0b6a" containerID="affddd2860d80e0ce7b18928534da37a1673240efc200f32164d40161fc30343" exitCode=0 Dec 05 13:09:12.269326 master-0 kubenswrapper[29936]: I1205 13:09:12.268479 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftqr6" event={"ID":"69c28142-8016-496e-a893-4d1d220c0b6a","Type":"ContainerDied","Data":"affddd2860d80e0ce7b18928534da37a1673240efc200f32164d40161fc30343"} Dec 05 13:09:12.279855 master-0 kubenswrapper[29936]: I1205 13:09:12.279745 29936 generic.go:334] "Generic (PLEG): container finished" podID="fbe106b9-254b-49b9-98dd-b1cfd4697210" containerID="660f5b16f43c93494f587bab2753f20ba1d3fa89ccf34c2349bf9f0651d2848e" exitCode=0 Dec 05 13:09:12.280065 master-0 kubenswrapper[29936]: I1205 13:09:12.279952 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xjmxk" event={"ID":"fbe106b9-254b-49b9-98dd-b1cfd4697210","Type":"ContainerDied","Data":"660f5b16f43c93494f587bab2753f20ba1d3fa89ccf34c2349bf9f0651d2848e"} Dec 05 13:09:12.284636 master-0 kubenswrapper[29936]: I1205 13:09:12.284450 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b060638c-8b2a-4dbb-95e5-e1d8839e9783","Type":"ContainerStarted","Data":"231c4c15fcf1b39253b2377a3bab92515e90ba7432e95e42cc28f9a61406631b"} Dec 05 13:09:12.285370 master-0 kubenswrapper[29936]: I1205 13:09:12.284510 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b060638c-8b2a-4dbb-95e5-e1d8839e9783","Type":"ContainerStarted","Data":"a6e938d5757db536032878c2d7e7c9dd41b8220058a971082de1afc80cccdaca"} Dec 05 13:09:12.288693 master-0 kubenswrapper[29936]: I1205 13:09:12.288648 29936 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 13:09:12.288693 master-0 kubenswrapper[29936]: I1205 13:09:12.288649 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4c0c6457-0252-4483-b3ee-8b530b0214e7","Type":"ContainerDied","Data":"486c590b8af509a26704a578d4913af0ebd2a9e0b9f4db8fc96d8df0e5e95974"} Dec 05 13:09:12.288845 master-0 kubenswrapper[29936]: I1205 13:09:12.288764 29936 scope.go:117] "RemoveContainer" containerID="785a5abf9b97c3c345ea949b399d529292d39dd516259e925009bf9a72a61aba" Dec 05 13:09:12.495615 master-0 kubenswrapper[29936]: I1205 13:09:12.495387 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 13:09:12.512923 master-0 kubenswrapper[29936]: I1205 13:09:12.512824 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 13:09:12.544242 master-0 kubenswrapper[29936]: I1205 13:09:12.544123 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 13:09:12.545026 master-0 kubenswrapper[29936]: E1205 13:09:12.544986 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0c6457-0252-4483-b3ee-8b530b0214e7" containerName="nova-scheduler-scheduler" Dec 05 13:09:12.545026 master-0 kubenswrapper[29936]: I1205 13:09:12.545018 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0c6457-0252-4483-b3ee-8b530b0214e7" containerName="nova-scheduler-scheduler" Dec 05 13:09:12.545463 master-0 kubenswrapper[29936]: I1205 13:09:12.545427 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0c6457-0252-4483-b3ee-8b530b0214e7" containerName="nova-scheduler-scheduler" Dec 05 13:09:12.546615 master-0 kubenswrapper[29936]: I1205 13:09:12.546580 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 13:09:12.551086 master-0 kubenswrapper[29936]: I1205 13:09:12.551021 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 13:09:12.577664 master-0 kubenswrapper[29936]: I1205 13:09:12.559538 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 13:09:12.686345 master-0 kubenswrapper[29936]: I1205 13:09:12.685941 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-config-data\") pod \"nova-scheduler-0\" (UID: \"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:12.688051 master-0 kubenswrapper[29936]: I1205 13:09:12.687965 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnvqb\" (UniqueName: \"kubernetes.io/projected/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-kube-api-access-gnvqb\") pod \"nova-scheduler-0\" (UID: \"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:12.688728 master-0 kubenswrapper[29936]: I1205 13:09:12.688633 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:12.791433 master-0 kubenswrapper[29936]: I1205 13:09:12.791347 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnvqb\" (UniqueName: \"kubernetes.io/projected/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-kube-api-access-gnvqb\") pod \"nova-scheduler-0\" (UID: \"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:12.791711 master-0 kubenswrapper[29936]: I1205 13:09:12.791556 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:12.791711 master-0 kubenswrapper[29936]: I1205 13:09:12.791670 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-config-data\") pod \"nova-scheduler-0\" (UID: \"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:12.798443 master-0 kubenswrapper[29936]: I1205 13:09:12.798358 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:12.798551 master-0 kubenswrapper[29936]: I1205 13:09:12.798514 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-config-data\") pod \"nova-scheduler-0\" (UID: \"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:12.823523 master-0 kubenswrapper[29936]: 
I1205 13:09:12.823337 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnvqb\" (UniqueName: \"kubernetes.io/projected/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-kube-api-access-gnvqb\") pod \"nova-scheduler-0\" (UID: \"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:12.872520 master-0 kubenswrapper[29936]: I1205 13:09:12.872428 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 13:09:13.211362 master-0 kubenswrapper[29936]: I1205 13:09:13.210066 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c0c6457-0252-4483-b3ee-8b530b0214e7" path="/var/lib/kubelet/pods/4c0c6457-0252-4483-b3ee-8b530b0214e7/volumes" Dec 05 13:09:13.263579 master-0 kubenswrapper[29936]: I1205 13:09:13.263337 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 13:09:13.263579 master-0 kubenswrapper[29936]: I1205 13:09:13.263450 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 13:09:13.331543 master-0 kubenswrapper[29936]: I1205 13:09:13.331339 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftqr6" event={"ID":"69c28142-8016-496e-a893-4d1d220c0b6a","Type":"ContainerStarted","Data":"706c6bee285eb2edfeb618a0fa0b88c87ade61151a54a24676e4e41f7c7e3ca7"} Dec 05 13:09:13.335840 master-0 kubenswrapper[29936]: I1205 13:09:13.335752 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b060638c-8b2a-4dbb-95e5-e1d8839e9783","Type":"ContainerStarted","Data":"1dad7eae3542ce79913bd6aea7afda5d404262a9aaa677d3ac98686c03740104"} Dec 05 13:09:13.372610 master-0 kubenswrapper[29936]: I1205 13:09:13.372540 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 13:09:13.374112 master-0 kubenswrapper[29936]: I1205 13:09:13.373979 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ftqr6" podStartSLOduration=5.8340854669999995 podStartE2EDuration="8.373951386s" podCreationTimestamp="2025-12-05 13:09:05 +0000 UTC" firstStartedPulling="2025-12-05 13:09:10.150718792 +0000 UTC m=+1147.282798473" lastFinishedPulling="2025-12-05 13:09:12.690584711 +0000 UTC m=+1149.822664392" observedRunningTime="2025-12-05 13:09:13.359368844 +0000 UTC m=+1150.491448545" watchObservedRunningTime="2025-12-05 13:09:13.373951386 +0000 UTC m=+1150.506031067" Dec 05 13:09:13.413401 master-0 kubenswrapper[29936]: I1205 13:09:13.413283 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.413238054 podStartE2EDuration="3.413238054s" podCreationTimestamp="2025-12-05 13:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:09:13.395436174 +0000 UTC m=+1150.527515855" watchObservedRunningTime="2025-12-05 13:09:13.413238054 +0000 UTC m=+1150.545317755" Dec 05 13:09:13.861312 master-0 kubenswrapper[29936]: I1205 13:09:13.861090 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xjmxk" Dec 05 13:09:13.977570 master-0 kubenswrapper[29936]: I1205 13:09:13.977462 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-combined-ca-bundle\") pod \"fbe106b9-254b-49b9-98dd-b1cfd4697210\" (UID: \"fbe106b9-254b-49b9-98dd-b1cfd4697210\") " Dec 05 13:09:13.977890 master-0 kubenswrapper[29936]: I1205 13:09:13.977699 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cg54\" (UniqueName: \"kubernetes.io/projected/fbe106b9-254b-49b9-98dd-b1cfd4697210-kube-api-access-5cg54\") pod \"fbe106b9-254b-49b9-98dd-b1cfd4697210\" (UID: \"fbe106b9-254b-49b9-98dd-b1cfd4697210\") " Dec 05 13:09:13.978233 master-0 kubenswrapper[29936]: I1205 13:09:13.978124 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-scripts\") pod \"fbe106b9-254b-49b9-98dd-b1cfd4697210\" (UID: \"fbe106b9-254b-49b9-98dd-b1cfd4697210\") " Dec 05 13:09:13.978560 master-0 kubenswrapper[29936]: I1205 13:09:13.978511 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-config-data\") pod \"fbe106b9-254b-49b9-98dd-b1cfd4697210\" (UID: \"fbe106b9-254b-49b9-98dd-b1cfd4697210\") " Dec 05 13:09:13.981654 master-0 kubenswrapper[29936]: I1205 13:09:13.981578 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe106b9-254b-49b9-98dd-b1cfd4697210-kube-api-access-5cg54" (OuterVolumeSpecName: "kube-api-access-5cg54") pod "fbe106b9-254b-49b9-98dd-b1cfd4697210" (UID: "fbe106b9-254b-49b9-98dd-b1cfd4697210"). InnerVolumeSpecName "kube-api-access-5cg54". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:09:13.982798 master-0 kubenswrapper[29936]: I1205 13:09:13.982738 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-scripts" (OuterVolumeSpecName: "scripts") pod "fbe106b9-254b-49b9-98dd-b1cfd4697210" (UID: "fbe106b9-254b-49b9-98dd-b1cfd4697210"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:14.022303 master-0 kubenswrapper[29936]: I1205 13:09:14.016542 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbe106b9-254b-49b9-98dd-b1cfd4697210" (UID: "fbe106b9-254b-49b9-98dd-b1cfd4697210"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:14.022303 master-0 kubenswrapper[29936]: I1205 13:09:14.018347 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-config-data" (OuterVolumeSpecName: "config-data") pod "fbe106b9-254b-49b9-98dd-b1cfd4697210" (UID: "fbe106b9-254b-49b9-98dd-b1cfd4697210"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:14.082342 master-0 kubenswrapper[29936]: I1205 13:09:14.082242 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:14.082342 master-0 kubenswrapper[29936]: I1205 13:09:14.082316 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:14.082342 master-0 kubenswrapper[29936]: I1205 13:09:14.082330 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbe106b9-254b-49b9-98dd-b1cfd4697210-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:14.082342 master-0 kubenswrapper[29936]: I1205 13:09:14.082344 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cg54\" (UniqueName: \"kubernetes.io/projected/fbe106b9-254b-49b9-98dd-b1cfd4697210-kube-api-access-5cg54\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:14.368345 master-0 kubenswrapper[29936]: I1205 13:09:14.367944 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027","Type":"ContainerStarted","Data":"ac0e76af22a910b46e3085e921399cae1f3ec5a05aff26ee7c7784cc7f63dfef"} Dec 05 13:09:14.368345 master-0 kubenswrapper[29936]: I1205 13:09:14.368014 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027","Type":"ContainerStarted","Data":"36730ae6e3bd6603dd1d5ca11ddb1514b6969fb888012de2f3826bd1c7395759"} Dec 05 13:09:14.375058 master-0 kubenswrapper[29936]: I1205 13:09:14.374988 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xjmxk" Dec 05 13:09:14.375217 master-0 kubenswrapper[29936]: I1205 13:09:14.375156 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xjmxk" event={"ID":"fbe106b9-254b-49b9-98dd-b1cfd4697210","Type":"ContainerDied","Data":"decb8ef6a7d224154069a5c1982c8b015696578619dcc95dfcaad52785c919f0"} Dec 05 13:09:14.375297 master-0 kubenswrapper[29936]: I1205 13:09:14.375220 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="decb8ef6a7d224154069a5c1982c8b015696578619dcc95dfcaad52785c919f0" Dec 05 13:09:14.412349 master-0 kubenswrapper[29936]: I1205 13:09:14.410227 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.410178219 podStartE2EDuration="2.410178219s" podCreationTimestamp="2025-12-05 13:09:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:09:14.402268618 +0000 UTC m=+1151.534348299" watchObservedRunningTime="2025-12-05 13:09:14.410178219 +0000 UTC m=+1151.542257900" Dec 05 13:09:14.617416 master-0 kubenswrapper[29936]: I1205 13:09:14.617301 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 13:09:14.618520 master-0 kubenswrapper[29936]: E1205 13:09:14.618365 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe106b9-254b-49b9-98dd-b1cfd4697210" containerName="nova-cell1-conductor-db-sync" Dec 05 13:09:14.618520 master-0 kubenswrapper[29936]: I1205 13:09:14.618396 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe106b9-254b-49b9-98dd-b1cfd4697210" containerName="nova-cell1-conductor-db-sync" Dec 05 13:09:14.619274 master-0 kubenswrapper[29936]: I1205 13:09:14.618691 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe106b9-254b-49b9-98dd-b1cfd4697210" containerName="nova-cell1-conductor-db-sync" Dec 05 13:09:14.620955 master-0 kubenswrapper[29936]: I1205 13:09:14.619904 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 13:09:14.623421 master-0 kubenswrapper[29936]: I1205 13:09:14.623358 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 13:09:14.629760 master-0 kubenswrapper[29936]: I1205 13:09:14.629658 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 13:09:14.708561 master-0 kubenswrapper[29936]: I1205 13:09:14.708472 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a8043e-bc23-495a-afaf-ba2218631cad-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"08a8043e-bc23-495a-afaf-ba2218631cad\") " pod="openstack/nova-cell1-conductor-0" Dec 05 13:09:14.708842 master-0 kubenswrapper[29936]: I1205 13:09:14.708691 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a8043e-bc23-495a-afaf-ba2218631cad-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"08a8043e-bc23-495a-afaf-ba2218631cad\") " pod="openstack/nova-cell1-conductor-0" Dec 05 13:09:14.708936 master-0 kubenswrapper[29936]: I1205 13:09:14.708836 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kxkz\" (UniqueName: \"kubernetes.io/projected/08a8043e-bc23-495a-afaf-ba2218631cad-kube-api-access-7kxkz\") pod \"nova-cell1-conductor-0\" (UID: \"08a8043e-bc23-495a-afaf-ba2218631cad\") " pod="openstack/nova-cell1-conductor-0" Dec 05 13:09:14.811827 master-0 kubenswrapper[29936]: I1205 13:09:14.811724 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a8043e-bc23-495a-afaf-ba2218631cad-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"08a8043e-bc23-495a-afaf-ba2218631cad\") " pod="openstack/nova-cell1-conductor-0" Dec 05 13:09:14.812238 master-0 kubenswrapper[29936]: I1205 13:09:14.811893 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a8043e-bc23-495a-afaf-ba2218631cad-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"08a8043e-bc23-495a-afaf-ba2218631cad\") " pod="openstack/nova-cell1-conductor-0" Dec 05 13:09:14.812301 master-0 kubenswrapper[29936]: I1205 13:09:14.812218 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kxkz\" (UniqueName: \"kubernetes.io/projected/08a8043e-bc23-495a-afaf-ba2218631cad-kube-api-access-7kxkz\") pod \"nova-cell1-conductor-0\" (UID: \"08a8043e-bc23-495a-afaf-ba2218631cad\") " pod="openstack/nova-cell1-conductor-0" Dec 05 13:09:14.818113 master-0 kubenswrapper[29936]: I1205 13:09:14.818047 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a8043e-bc23-495a-afaf-ba2218631cad-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"08a8043e-bc23-495a-afaf-ba2218631cad\") " pod="openstack/nova-cell1-conductor-0" Dec 05 13:09:14.831020 master-0 kubenswrapper[29936]: I1205 13:09:14.830960 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kxkz\" (UniqueName: \"kubernetes.io/projected/08a8043e-bc23-495a-afaf-ba2218631cad-kube-api-access-7kxkz\") pod \"nova-cell1-conductor-0\" (UID: 
\"08a8043e-bc23-495a-afaf-ba2218631cad\") " pod="openstack/nova-cell1-conductor-0" Dec 05 13:09:14.832347 master-0 kubenswrapper[29936]: I1205 13:09:14.832290 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a8043e-bc23-495a-afaf-ba2218631cad-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"08a8043e-bc23-495a-afaf-ba2218631cad\") " pod="openstack/nova-cell1-conductor-0" Dec 05 13:09:14.981229 master-0 kubenswrapper[29936]: I1205 13:09:14.981141 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 13:09:15.486223 master-0 kubenswrapper[29936]: I1205 13:09:15.471354 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 13:09:15.486223 master-0 kubenswrapper[29936]: W1205 13:09:15.482886 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08a8043e_bc23_495a_afaf_ba2218631cad.slice/crio-c8697aea85a3a54d9ef7eb1960f16d6b162fc52afa60925da1b260913d7bedca WatchSource:0}: Error finding container c8697aea85a3a54d9ef7eb1960f16d6b162fc52afa60925da1b260913d7bedca: Status 404 returned error can't find the container with id c8697aea85a3a54d9ef7eb1960f16d6b162fc52afa60925da1b260913d7bedca Dec 05 13:09:16.063430 master-0 kubenswrapper[29936]: I1205 13:09:16.063317 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:16.063430 master-0 kubenswrapper[29936]: I1205 13:09:16.063403 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:16.121570 master-0 kubenswrapper[29936]: I1205 13:09:16.121489 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:16.409770 master-0 kubenswrapper[29936]: I1205 13:09:16.409610 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"08a8043e-bc23-495a-afaf-ba2218631cad","Type":"ContainerStarted","Data":"2b8705f3710c6f0cb80945c91c0adce6bfc1e28b48e02660b852a38e2ae2f2f0"} Dec 05 13:09:16.409770 master-0 kubenswrapper[29936]: I1205 13:09:16.409697 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"08a8043e-bc23-495a-afaf-ba2218631cad","Type":"ContainerStarted","Data":"c8697aea85a3a54d9ef7eb1960f16d6b162fc52afa60925da1b260913d7bedca"} Dec 05 13:09:16.896738 master-0 kubenswrapper[29936]: I1205 13:09:16.896598 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.8965700869999997 podStartE2EDuration="2.896570087s" podCreationTimestamp="2025-12-05 13:09:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:09:16.892814666 +0000 UTC m=+1154.024894357" watchObservedRunningTime="2025-12-05 13:09:16.896570087 +0000 UTC m=+1154.028649768" Dec 05 13:09:17.423744 master-0 kubenswrapper[29936]: I1205 13:09:17.423654 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 05 13:09:17.874280 master-0 kubenswrapper[29936]: I1205 13:09:17.874169 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-scheduler-0" Dec 05 13:09:18.263108 master-0 kubenswrapper[29936]: I1205 13:09:18.263019 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 13:09:18.263108 master-0 kubenswrapper[29936]: I1205 13:09:18.263124 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 13:09:19.276663 master-0 kubenswrapper[29936]: I1205 13:09:19.276561 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="884426a8-a4eb-4387-a9ab-546f0844b879" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.15:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 13:09:19.277455 master-0 kubenswrapper[29936]: I1205 13:09:19.276560 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="884426a8-a4eb-4387-a9ab-546f0844b879" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.15:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 13:09:21.268810 master-0 kubenswrapper[29936]: I1205 13:09:21.268700 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 13:09:21.268810 master-0 kubenswrapper[29936]: I1205 13:09:21.268808 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 13:09:22.352241 master-0 kubenswrapper[29936]: I1205 13:09:22.351538 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b060638c-8b2a-4dbb-95e5-e1d8839e9783" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.16:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 13:09:22.352241 master-0 kubenswrapper[29936]: I1205 13:09:22.351540 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b060638c-8b2a-4dbb-95e5-e1d8839e9783" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.16:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 13:09:22.873879 master-0 kubenswrapper[29936]: I1205 13:09:22.873775 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 13:09:22.915426 master-0 kubenswrapper[29936]: I1205 13:09:22.914361 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 13:09:23.595485 master-0 kubenswrapper[29936]: I1205 13:09:23.595383 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 13:09:25.018685 master-0 kubenswrapper[29936]: I1205 13:09:25.018580 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 05 13:09:26.151006 master-0 kubenswrapper[29936]: I1205 13:09:26.150916 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:26.707205 master-0 kubenswrapper[29936]: I1205 13:09:26.707065 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sf8xl"] Dec 05 13:09:26.710775 master-0 kubenswrapper[29936]: I1205 13:09:26.710712 29936 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:26.723323 master-0 kubenswrapper[29936]: I1205 13:09:26.722763 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sf8xl"] Dec 05 13:09:26.901438 master-0 kubenswrapper[29936]: I1205 13:09:26.901345 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5dcf\" (UniqueName: \"kubernetes.io/projected/758b7407-88d4-48ea-a598-2c48af96a89e-kube-api-access-c5dcf\") pod \"community-operators-sf8xl\" (UID: \"758b7407-88d4-48ea-a598-2c48af96a89e\") " pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:26.901892 master-0 kubenswrapper[29936]: I1205 13:09:26.901603 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/758b7407-88d4-48ea-a598-2c48af96a89e-utilities\") pod \"community-operators-sf8xl\" (UID: \"758b7407-88d4-48ea-a598-2c48af96a89e\") " pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:26.902412 master-0 kubenswrapper[29936]: I1205 13:09:26.902380 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/758b7407-88d4-48ea-a598-2c48af96a89e-catalog-content\") pod \"community-operators-sf8xl\" (UID: \"758b7407-88d4-48ea-a598-2c48af96a89e\") " pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:27.005318 master-0 kubenswrapper[29936]: I1205 13:09:27.005080 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/758b7407-88d4-48ea-a598-2c48af96a89e-catalog-content\") pod \"community-operators-sf8xl\" (UID: \"758b7407-88d4-48ea-a598-2c48af96a89e\") " pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:27.005318 master-0 kubenswrapper[29936]: I1205 13:09:27.005262 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5dcf\" (UniqueName: \"kubernetes.io/projected/758b7407-88d4-48ea-a598-2c48af96a89e-kube-api-access-c5dcf\") pod \"community-operators-sf8xl\" (UID: \"758b7407-88d4-48ea-a598-2c48af96a89e\") " pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:27.005689 master-0 kubenswrapper[29936]: I1205 13:09:27.005322 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/758b7407-88d4-48ea-a598-2c48af96a89e-utilities\") pod \"community-operators-sf8xl\" (UID: \"758b7407-88d4-48ea-a598-2c48af96a89e\") " pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:27.005873 master-0 kubenswrapper[29936]: I1205 13:09:27.005831 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/758b7407-88d4-48ea-a598-2c48af96a89e-catalog-content\") pod \"community-operators-sf8xl\" (UID: \"758b7407-88d4-48ea-a598-2c48af96a89e\") " pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:27.006009 master-0 kubenswrapper[29936]: I1205 13:09:27.005942 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/758b7407-88d4-48ea-a598-2c48af96a89e-utilities\") pod \"community-operators-sf8xl\" (UID: \"758b7407-88d4-48ea-a598-2c48af96a89e\") " 
pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:27.028596 master-0 kubenswrapper[29936]: I1205 13:09:27.028527 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5dcf\" (UniqueName: \"kubernetes.io/projected/758b7407-88d4-48ea-a598-2c48af96a89e-kube-api-access-c5dcf\") pod \"community-operators-sf8xl\" (UID: \"758b7407-88d4-48ea-a598-2c48af96a89e\") " pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:27.069383 master-0 kubenswrapper[29936]: I1205 13:09:27.069301 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:28.215392 master-0 kubenswrapper[29936]: I1205 13:09:28.215310 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ftqr6"] Dec 05 13:09:28.216227 master-0 kubenswrapper[29936]: I1205 13:09:28.215612 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ftqr6" podUID="69c28142-8016-496e-a893-4d1d220c0b6a" containerName="registry-server" containerID="cri-o://706c6bee285eb2edfeb618a0fa0b88c87ade61151a54a24676e4e41f7c7e3ca7" gracePeriod=2 Dec 05 13:09:28.263707 master-0 kubenswrapper[29936]: I1205 13:09:28.259630 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sf8xl"] Dec 05 13:09:28.274074 master-0 kubenswrapper[29936]: I1205 13:09:28.273435 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 13:09:28.278448 master-0 kubenswrapper[29936]: I1205 13:09:28.278376 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 13:09:28.293360 master-0 kubenswrapper[29936]: I1205 13:09:28.293299 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 13:09:28.661292 master-0 kubenswrapper[29936]: I1205 13:09:28.661153 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf8xl" event={"ID":"758b7407-88d4-48ea-a598-2c48af96a89e","Type":"ContainerStarted","Data":"7567432e09f313bfb4f3d2eaed0e5e5714cbabb52722b1fe09f947814ea47e72"} Dec 05 13:09:28.665349 master-0 kubenswrapper[29936]: I1205 13:09:28.665283 29936 generic.go:334] "Generic (PLEG): container finished" podID="69c28142-8016-496e-a893-4d1d220c0b6a" containerID="706c6bee285eb2edfeb618a0fa0b88c87ade61151a54a24676e4e41f7c7e3ca7" exitCode=0 Dec 05 13:09:28.665473 master-0 kubenswrapper[29936]: I1205 13:09:28.665377 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftqr6" event={"ID":"69c28142-8016-496e-a893-4d1d220c0b6a","Type":"ContainerDied","Data":"706c6bee285eb2edfeb618a0fa0b88c87ade61151a54a24676e4e41f7c7e3ca7"} Dec 05 13:09:28.681279 master-0 kubenswrapper[29936]: I1205 13:09:28.677874 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 13:09:28.961630 master-0 kubenswrapper[29936]: I1205 13:09:28.961565 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:29.032289 master-0 kubenswrapper[29936]: I1205 13:09:29.031739 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rshkk\" (UniqueName: \"kubernetes.io/projected/69c28142-8016-496e-a893-4d1d220c0b6a-kube-api-access-rshkk\") pod \"69c28142-8016-496e-a893-4d1d220c0b6a\" (UID: \"69c28142-8016-496e-a893-4d1d220c0b6a\") " Dec 05 13:09:29.032289 master-0 kubenswrapper[29936]: I1205 13:09:29.031979 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c28142-8016-496e-a893-4d1d220c0b6a-utilities\") pod \"69c28142-8016-496e-a893-4d1d220c0b6a\" (UID: \"69c28142-8016-496e-a893-4d1d220c0b6a\") " Dec 05 13:09:29.032289 master-0 kubenswrapper[29936]: I1205 13:09:29.032044 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c28142-8016-496e-a893-4d1d220c0b6a-catalog-content\") pod \"69c28142-8016-496e-a893-4d1d220c0b6a\" (UID: \"69c28142-8016-496e-a893-4d1d220c0b6a\") " Dec 05 13:09:29.036262 master-0 kubenswrapper[29936]: I1205 13:09:29.033038 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69c28142-8016-496e-a893-4d1d220c0b6a-utilities" (OuterVolumeSpecName: "utilities") pod "69c28142-8016-496e-a893-4d1d220c0b6a" (UID: "69c28142-8016-496e-a893-4d1d220c0b6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:09:29.065209 master-0 kubenswrapper[29936]: I1205 13:09:29.064223 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c28142-8016-496e-a893-4d1d220c0b6a-kube-api-access-rshkk" (OuterVolumeSpecName: "kube-api-access-rshkk") pod "69c28142-8016-496e-a893-4d1d220c0b6a" (UID: "69c28142-8016-496e-a893-4d1d220c0b6a"). InnerVolumeSpecName "kube-api-access-rshkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:09:29.109304 master-0 kubenswrapper[29936]: I1205 13:09:29.108558 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69c28142-8016-496e-a893-4d1d220c0b6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69c28142-8016-496e-a893-4d1d220c0b6a" (UID: "69c28142-8016-496e-a893-4d1d220c0b6a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:09:29.135330 master-0 kubenswrapper[29936]: I1205 13:09:29.135231 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c28142-8016-496e-a893-4d1d220c0b6a-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:29.135330 master-0 kubenswrapper[29936]: I1205 13:09:29.135306 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c28142-8016-496e-a893-4d1d220c0b6a-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:29.135330 master-0 kubenswrapper[29936]: I1205 13:09:29.135321 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rshkk\" (UniqueName: \"kubernetes.io/projected/69c28142-8016-496e-a893-4d1d220c0b6a-kube-api-access-rshkk\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:29.204347 master-0 kubenswrapper[29936]: I1205 13:09:29.204109 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:29.342307 master-0 kubenswrapper[29936]: I1205 13:09:29.340357 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bfe120-ee71-4c4f-8433-24164a6b82ac-combined-ca-bundle\") pod \"52bfe120-ee71-4c4f-8433-24164a6b82ac\" (UID: \"52bfe120-ee71-4c4f-8433-24164a6b82ac\") " Dec 05 13:09:29.342307 master-0 kubenswrapper[29936]: I1205 13:09:29.340452 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bfe120-ee71-4c4f-8433-24164a6b82ac-config-data\") pod \"52bfe120-ee71-4c4f-8433-24164a6b82ac\" (UID: \"52bfe120-ee71-4c4f-8433-24164a6b82ac\") " Dec 05 13:09:29.342307 master-0 kubenswrapper[29936]: I1205 13:09:29.340541 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf5v5\" (UniqueName: \"kubernetes.io/projected/52bfe120-ee71-4c4f-8433-24164a6b82ac-kube-api-access-gf5v5\") pod \"52bfe120-ee71-4c4f-8433-24164a6b82ac\" (UID: \"52bfe120-ee71-4c4f-8433-24164a6b82ac\") " Dec 05 13:09:29.346977 master-0 kubenswrapper[29936]: I1205 13:09:29.346903 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52bfe120-ee71-4c4f-8433-24164a6b82ac-kube-api-access-gf5v5" (OuterVolumeSpecName: "kube-api-access-gf5v5") pod "52bfe120-ee71-4c4f-8433-24164a6b82ac" (UID: "52bfe120-ee71-4c4f-8433-24164a6b82ac"). InnerVolumeSpecName "kube-api-access-gf5v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:09:29.374125 master-0 kubenswrapper[29936]: I1205 13:09:29.374043 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52bfe120-ee71-4c4f-8433-24164a6b82ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52bfe120-ee71-4c4f-8433-24164a6b82ac" (UID: "52bfe120-ee71-4c4f-8433-24164a6b82ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:29.379237 master-0 kubenswrapper[29936]: I1205 13:09:29.379113 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52bfe120-ee71-4c4f-8433-24164a6b82ac-config-data" (OuterVolumeSpecName: "config-data") pod "52bfe120-ee71-4c4f-8433-24164a6b82ac" (UID: "52bfe120-ee71-4c4f-8433-24164a6b82ac"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:29.446650 master-0 kubenswrapper[29936]: I1205 13:09:29.446439 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf5v5\" (UniqueName: \"kubernetes.io/projected/52bfe120-ee71-4c4f-8433-24164a6b82ac-kube-api-access-gf5v5\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:29.447054 master-0 kubenswrapper[29936]: I1205 13:09:29.447040 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bfe120-ee71-4c4f-8433-24164a6b82ac-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:29.447150 master-0 kubenswrapper[29936]: I1205 13:09:29.447135 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bfe120-ee71-4c4f-8433-24164a6b82ac-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:29.683541 master-0 kubenswrapper[29936]: I1205 13:09:29.683346 29936 generic.go:334] "Generic (PLEG): container finished" podID="52bfe120-ee71-4c4f-8433-24164a6b82ac" containerID="f08ee6a3887bb302aaf03c84ecde4513dba0d4a7419c6d2cb9fd4985d468c9d6" exitCode=137 Dec 05 13:09:29.683541 master-0 kubenswrapper[29936]: I1205 13:09:29.683447 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"52bfe120-ee71-4c4f-8433-24164a6b82ac","Type":"ContainerDied","Data":"f08ee6a3887bb302aaf03c84ecde4513dba0d4a7419c6d2cb9fd4985d468c9d6"} Dec 05 13:09:29.683955 master-0 kubenswrapper[29936]: I1205 13:09:29.683546 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"52bfe120-ee71-4c4f-8433-24164a6b82ac","Type":"ContainerDied","Data":"312a536f951e34a53e9251245d2b9c3e7f0d478a1418aa51af960bf4609660e1"} Dec 05 13:09:29.683955 master-0 kubenswrapper[29936]: I1205 13:09:29.683583 29936 scope.go:117] "RemoveContainer" containerID="f08ee6a3887bb302aaf03c84ecde4513dba0d4a7419c6d2cb9fd4985d468c9d6" Dec 05 13:09:29.684264 master-0 kubenswrapper[29936]: I1205 13:09:29.684155 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:29.695069 master-0 kubenswrapper[29936]: I1205 13:09:29.694926 29936 generic.go:334] "Generic (PLEG): container finished" podID="758b7407-88d4-48ea-a598-2c48af96a89e" containerID="de62f0f8af8279416ed5e15bbd26cfd84acec59d68491a5116b55acda2163134" exitCode=0 Dec 05 13:09:29.695069 master-0 kubenswrapper[29936]: I1205 13:09:29.694996 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf8xl" event={"ID":"758b7407-88d4-48ea-a598-2c48af96a89e","Type":"ContainerDied","Data":"de62f0f8af8279416ed5e15bbd26cfd84acec59d68491a5116b55acda2163134"} Dec 05 13:09:29.702257 master-0 kubenswrapper[29936]: I1205 13:09:29.701333 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ftqr6" event={"ID":"69c28142-8016-496e-a893-4d1d220c0b6a","Type":"ContainerDied","Data":"49ea2905a8ab69103d335a06b1c0c861d01623fbba3d90de60f3c2e96c6d885e"} Dec 05 13:09:29.702257 master-0 kubenswrapper[29936]: I1205 13:09:29.701458 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ftqr6" Dec 05 13:09:29.732214 master-0 kubenswrapper[29936]: I1205 13:09:29.732133 29936 scope.go:117] "RemoveContainer" containerID="f08ee6a3887bb302aaf03c84ecde4513dba0d4a7419c6d2cb9fd4985d468c9d6" Dec 05 13:09:29.733267 master-0 kubenswrapper[29936]: E1205 13:09:29.733220 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f08ee6a3887bb302aaf03c84ecde4513dba0d4a7419c6d2cb9fd4985d468c9d6\": container with ID starting with f08ee6a3887bb302aaf03c84ecde4513dba0d4a7419c6d2cb9fd4985d468c9d6 not found: ID does not exist" containerID="f08ee6a3887bb302aaf03c84ecde4513dba0d4a7419c6d2cb9fd4985d468c9d6" Dec 05 13:09:29.733350 master-0 kubenswrapper[29936]: I1205 13:09:29.733274 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f08ee6a3887bb302aaf03c84ecde4513dba0d4a7419c6d2cb9fd4985d468c9d6"} err="failed to get container status \"f08ee6a3887bb302aaf03c84ecde4513dba0d4a7419c6d2cb9fd4985d468c9d6\": rpc error: code = NotFound desc = could not find container \"f08ee6a3887bb302aaf03c84ecde4513dba0d4a7419c6d2cb9fd4985d468c9d6\": container with ID starting with f08ee6a3887bb302aaf03c84ecde4513dba0d4a7419c6d2cb9fd4985d468c9d6 not found: ID does not exist" Dec 05 13:09:29.733350 master-0 kubenswrapper[29936]: I1205 13:09:29.733300 29936 scope.go:117] "RemoveContainer" containerID="706c6bee285eb2edfeb618a0fa0b88c87ade61151a54a24676e4e41f7c7e3ca7" Dec 05 13:09:29.817221 master-0 kubenswrapper[29936]: I1205 13:09:29.817036 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ftqr6"] Dec 05 13:09:29.834698 master-0 kubenswrapper[29936]: I1205 13:09:29.834514 29936 scope.go:117] "RemoveContainer" containerID="affddd2860d80e0ce7b18928534da37a1673240efc200f32164d40161fc30343" Dec 05 13:09:29.839289 master-0 kubenswrapper[29936]: I1205 13:09:29.837188 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ftqr6"] Dec 05 13:09:29.872625 master-0 kubenswrapper[29936]: I1205 13:09:29.868360 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 13:09:29.897210 master-0 kubenswrapper[29936]: I1205 13:09:29.896447 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 13:09:29.897210 master-0 kubenswrapper[29936]: I1205 13:09:29.896544 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 13:09:29.899316 master-0 kubenswrapper[29936]: E1205 13:09:29.897268 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52bfe120-ee71-4c4f-8433-24164a6b82ac" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 13:09:29.899316 master-0 kubenswrapper[29936]: I1205 13:09:29.897287 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="52bfe120-ee71-4c4f-8433-24164a6b82ac" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 13:09:29.899316 master-0 kubenswrapper[29936]: E1205 13:09:29.897310 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c28142-8016-496e-a893-4d1d220c0b6a" containerName="extract-utilities" Dec 05 13:09:29.899316 master-0 kubenswrapper[29936]: I1205 13:09:29.897317 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c28142-8016-496e-a893-4d1d220c0b6a" containerName="extract-utilities" Dec 05 13:09:29.899316 master-0 
kubenswrapper[29936]: E1205 13:09:29.897343 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c28142-8016-496e-a893-4d1d220c0b6a" containerName="extract-content" Dec 05 13:09:29.899316 master-0 kubenswrapper[29936]: I1205 13:09:29.897350 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c28142-8016-496e-a893-4d1d220c0b6a" containerName="extract-content" Dec 05 13:09:29.899316 master-0 kubenswrapper[29936]: E1205 13:09:29.897386 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c28142-8016-496e-a893-4d1d220c0b6a" containerName="registry-server" Dec 05 13:09:29.899316 master-0 kubenswrapper[29936]: I1205 13:09:29.897393 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c28142-8016-496e-a893-4d1d220c0b6a" containerName="registry-server" Dec 05 13:09:29.899316 master-0 kubenswrapper[29936]: I1205 13:09:29.898171 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c28142-8016-496e-a893-4d1d220c0b6a" containerName="registry-server" Dec 05 13:09:29.899316 master-0 kubenswrapper[29936]: I1205 13:09:29.898292 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="52bfe120-ee71-4c4f-8433-24164a6b82ac" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 13:09:29.899934 master-0 kubenswrapper[29936]: I1205 13:09:29.899335 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:29.903552 master-0 kubenswrapper[29936]: I1205 13:09:29.903471 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 13:09:29.903789 master-0 kubenswrapper[29936]: I1205 13:09:29.903759 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 05 13:09:29.903941 master-0 kubenswrapper[29936]: I1205 13:09:29.903918 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 05 13:09:29.912903 master-0 kubenswrapper[29936]: I1205 13:09:29.912830 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 13:09:29.931138 master-0 kubenswrapper[29936]: I1205 13:09:29.931084 29936 scope.go:117] "RemoveContainer" containerID="992acd814e774c11a94b32cefaa33473403429302cc39315bb6dcd259d238835" Dec 05 13:09:29.983601 master-0 kubenswrapper[29936]: I1205 13:09:29.982313 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpj9r\" (UniqueName: \"kubernetes.io/projected/c8d4e497-e2fb-4cbc-9a23-b701379be37c-kube-api-access-fpj9r\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8d4e497-e2fb-4cbc-9a23-b701379be37c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:29.983601 master-0 kubenswrapper[29936]: I1205 13:09:29.982720 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d4e497-e2fb-4cbc-9a23-b701379be37c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8d4e497-e2fb-4cbc-9a23-b701379be37c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:29.983601 master-0 kubenswrapper[29936]: I1205 13:09:29.983035 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8d4e497-e2fb-4cbc-9a23-b701379be37c-nova-novncproxy-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"c8d4e497-e2fb-4cbc-9a23-b701379be37c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:29.983601 master-0 kubenswrapper[29936]: I1205 13:09:29.983118 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d4e497-e2fb-4cbc-9a23-b701379be37c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8d4e497-e2fb-4cbc-9a23-b701379be37c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:29.983601 master-0 kubenswrapper[29936]: I1205 13:09:29.983277 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8d4e497-e2fb-4cbc-9a23-b701379be37c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8d4e497-e2fb-4cbc-9a23-b701379be37c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:30.086731 master-0 kubenswrapper[29936]: I1205 13:09:30.086535 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d4e497-e2fb-4cbc-9a23-b701379be37c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8d4e497-e2fb-4cbc-9a23-b701379be37c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:30.086970 master-0 kubenswrapper[29936]: I1205 13:09:30.086758 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8d4e497-e2fb-4cbc-9a23-b701379be37c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8d4e497-e2fb-4cbc-9a23-b701379be37c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:30.086970 master-0 kubenswrapper[29936]: I1205 13:09:30.086794 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d4e497-e2fb-4cbc-9a23-b701379be37c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8d4e497-e2fb-4cbc-9a23-b701379be37c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:30.086970 master-0 kubenswrapper[29936]: I1205 13:09:30.086961 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8d4e497-e2fb-4cbc-9a23-b701379be37c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8d4e497-e2fb-4cbc-9a23-b701379be37c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:30.087285 master-0 kubenswrapper[29936]: I1205 13:09:30.087223 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpj9r\" (UniqueName: \"kubernetes.io/projected/c8d4e497-e2fb-4cbc-9a23-b701379be37c-kube-api-access-fpj9r\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8d4e497-e2fb-4cbc-9a23-b701379be37c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:30.091889 master-0 kubenswrapper[29936]: I1205 13:09:30.091835 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d4e497-e2fb-4cbc-9a23-b701379be37c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8d4e497-e2fb-4cbc-9a23-b701379be37c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:30.092048 master-0 kubenswrapper[29936]: I1205 13:09:30.091947 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c8d4e497-e2fb-4cbc-9a23-b701379be37c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8d4e497-e2fb-4cbc-9a23-b701379be37c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:30.092779 master-0 kubenswrapper[29936]: I1205 13:09:30.092745 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8d4e497-e2fb-4cbc-9a23-b701379be37c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8d4e497-e2fb-4cbc-9a23-b701379be37c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:30.093498 master-0 kubenswrapper[29936]: I1205 13:09:30.093455 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d4e497-e2fb-4cbc-9a23-b701379be37c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8d4e497-e2fb-4cbc-9a23-b701379be37c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:30.105273 master-0 kubenswrapper[29936]: I1205 13:09:30.105231 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpj9r\" (UniqueName: \"kubernetes.io/projected/c8d4e497-e2fb-4cbc-9a23-b701379be37c-kube-api-access-fpj9r\") pod \"nova-cell1-novncproxy-0\" (UID: \"c8d4e497-e2fb-4cbc-9a23-b701379be37c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:30.251655 master-0 kubenswrapper[29936]: I1205 13:09:30.251483 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:30.722162 master-0 kubenswrapper[29936]: I1205 13:09:30.722093 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf8xl" event={"ID":"758b7407-88d4-48ea-a598-2c48af96a89e","Type":"ContainerStarted","Data":"986af6a43925a7f894bd062157e7c05b59921e441cd3d03fa9658c6bf92c9d50"} Dec 05 13:09:30.809565 master-0 kubenswrapper[29936]: I1205 13:09:30.809482 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 13:09:31.204443 master-0 kubenswrapper[29936]: I1205 13:09:31.204340 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52bfe120-ee71-4c4f-8433-24164a6b82ac" path="/var/lib/kubelet/pods/52bfe120-ee71-4c4f-8433-24164a6b82ac/volumes" Dec 05 13:09:31.205717 master-0 kubenswrapper[29936]: I1205 13:09:31.205671 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69c28142-8016-496e-a893-4d1d220c0b6a" path="/var/lib/kubelet/pods/69c28142-8016-496e-a893-4d1d220c0b6a/volumes" Dec 05 13:09:31.272375 master-0 kubenswrapper[29936]: I1205 13:09:31.272299 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 13:09:31.272907 master-0 kubenswrapper[29936]: I1205 13:09:31.272853 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 13:09:31.276574 master-0 kubenswrapper[29936]: I1205 13:09:31.276515 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 13:09:31.277346 master-0 kubenswrapper[29936]: I1205 13:09:31.277292 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 13:09:31.753000 master-0 kubenswrapper[29936]: I1205 13:09:31.752927 29936 generic.go:334] "Generic (PLEG): container finished" podID="758b7407-88d4-48ea-a598-2c48af96a89e" 
containerID="986af6a43925a7f894bd062157e7c05b59921e441cd3d03fa9658c6bf92c9d50" exitCode=0 Dec 05 13:09:31.753686 master-0 kubenswrapper[29936]: I1205 13:09:31.753056 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf8xl" event={"ID":"758b7407-88d4-48ea-a598-2c48af96a89e","Type":"ContainerDied","Data":"986af6a43925a7f894bd062157e7c05b59921e441cd3d03fa9658c6bf92c9d50"} Dec 05 13:09:31.759300 master-0 kubenswrapper[29936]: I1205 13:09:31.759151 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c8d4e497-e2fb-4cbc-9a23-b701379be37c","Type":"ContainerStarted","Data":"a8610969ee5c18ba6a4912e8021d8a3901d4eec5b748b68421b62d6f60b035e4"} Dec 05 13:09:31.759408 master-0 kubenswrapper[29936]: I1205 13:09:31.759307 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"c8d4e497-e2fb-4cbc-9a23-b701379be37c","Type":"ContainerStarted","Data":"61ae21b23365ed894cfb14295a4486fb77ee74b8b7e1d4ec679322464e1aa2ed"} Dec 05 13:09:31.759524 master-0 kubenswrapper[29936]: I1205 13:09:31.759501 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 13:09:31.768368 master-0 kubenswrapper[29936]: I1205 13:09:31.768162 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 13:09:31.845286 master-0 kubenswrapper[29936]: I1205 13:09:31.845098 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.845073362 podStartE2EDuration="2.845073362s" podCreationTimestamp="2025-12-05 13:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:09:31.834465046 +0000 UTC m=+1168.966544727" watchObservedRunningTime="2025-12-05 13:09:31.845073362 +0000 UTC m=+1168.977153043" Dec 05 13:09:32.120083 master-0 kubenswrapper[29936]: I1205 13:09:32.116868 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dbb98f75c-dzrtw"] Dec 05 13:09:32.121359 master-0 kubenswrapper[29936]: I1205 13:09:32.120987 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.183404 master-0 kubenswrapper[29936]: I1205 13:09:32.183247 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb98f75c-dzrtw"] Dec 05 13:09:32.268557 master-0 kubenswrapper[29936]: I1205 13:09:32.267285 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c4b9484-f3cf-4c0e-920d-8b688607db6e-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.268557 master-0 kubenswrapper[29936]: I1205 13:09:32.267484 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c4b9484-f3cf-4c0e-920d-8b688607db6e-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.268557 master-0 kubenswrapper[29936]: I1205 13:09:32.267508 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qgbz\" (UniqueName: \"kubernetes.io/projected/4c4b9484-f3cf-4c0e-920d-8b688607db6e-kube-api-access-6qgbz\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.268557 master-0 kubenswrapper[29936]: I1205 13:09:32.267546 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c4b9484-f3cf-4c0e-920d-8b688607db6e-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.268557 master-0 kubenswrapper[29936]: I1205 13:09:32.267582 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4b9484-f3cf-4c0e-920d-8b688607db6e-config\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.268557 master-0 kubenswrapper[29936]: I1205 13:09:32.267729 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c4b9484-f3cf-4c0e-920d-8b688607db6e-dns-svc\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.370656 master-0 kubenswrapper[29936]: I1205 13:09:32.370466 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c4b9484-f3cf-4c0e-920d-8b688607db6e-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.370656 master-0 kubenswrapper[29936]: I1205 13:09:32.370541 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qgbz\" (UniqueName: \"kubernetes.io/projected/4c4b9484-f3cf-4c0e-920d-8b688607db6e-kube-api-access-6qgbz\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " 
pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.370981 master-0 kubenswrapper[29936]: I1205 13:09:32.370771 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c4b9484-f3cf-4c0e-920d-8b688607db6e-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.370981 master-0 kubenswrapper[29936]: I1205 13:09:32.370949 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4b9484-f3cf-4c0e-920d-8b688607db6e-config\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.372249 master-0 kubenswrapper[29936]: I1205 13:09:32.371482 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c4b9484-f3cf-4c0e-920d-8b688607db6e-dns-svc\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.372249 master-0 kubenswrapper[29936]: I1205 13:09:32.371552 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c4b9484-f3cf-4c0e-920d-8b688607db6e-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.372249 master-0 kubenswrapper[29936]: I1205 13:09:32.371593 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c4b9484-f3cf-4c0e-920d-8b688607db6e-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.372249 master-0 kubenswrapper[29936]: I1205 13:09:32.371710 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c4b9484-f3cf-4c0e-920d-8b688607db6e-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.372249 master-0 kubenswrapper[29936]: I1205 13:09:32.372193 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c4b9484-f3cf-4c0e-920d-8b688607db6e-config\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.372503 master-0 kubenswrapper[29936]: I1205 13:09:32.372406 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c4b9484-f3cf-4c0e-920d-8b688607db6e-dns-svc\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.372784 master-0 kubenswrapper[29936]: I1205 13:09:32.372743 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4c4b9484-f3cf-4c0e-920d-8b688607db6e-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " 
pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.390633 master-0 kubenswrapper[29936]: I1205 13:09:32.390591 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qgbz\" (UniqueName: \"kubernetes.io/projected/4c4b9484-f3cf-4c0e-920d-8b688607db6e-kube-api-access-6qgbz\") pod \"dnsmasq-dns-dbb98f75c-dzrtw\" (UID: \"4c4b9484-f3cf-4c0e-920d-8b688607db6e\") " pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.472709 master-0 kubenswrapper[29936]: I1205 13:09:32.471955 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:32.780312 master-0 kubenswrapper[29936]: I1205 13:09:32.780133 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf8xl" event={"ID":"758b7407-88d4-48ea-a598-2c48af96a89e","Type":"ContainerStarted","Data":"7c993e2b11e2578635a54dc86c0f6eb36dcf0ef59335368fb12c8d9fec562ae5"} Dec 05 13:09:32.816392 master-0 kubenswrapper[29936]: I1205 13:09:32.816188 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sf8xl" podStartSLOduration=4.291748577 podStartE2EDuration="6.816133002s" podCreationTimestamp="2025-12-05 13:09:26 +0000 UTC" firstStartedPulling="2025-12-05 13:09:29.697747988 +0000 UTC m=+1166.829827689" lastFinishedPulling="2025-12-05 13:09:32.222132433 +0000 UTC m=+1169.354212114" observedRunningTime="2025-12-05 13:09:32.810912416 +0000 UTC m=+1169.942992107" watchObservedRunningTime="2025-12-05 13:09:32.816133002 +0000 UTC m=+1169.948212683" Dec 05 13:09:33.071943 master-0 kubenswrapper[29936]: I1205 13:09:33.071846 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb98f75c-dzrtw"] Dec 05 13:09:33.798966 master-0 kubenswrapper[29936]: I1205 13:09:33.798869 29936 generic.go:334] "Generic (PLEG): container finished" podID="4c4b9484-f3cf-4c0e-920d-8b688607db6e" containerID="59371d761fe3b8c5874b921a576308c6f68a5f668d8236783c90a8d234a2dc67" exitCode=0 Dec 05 13:09:33.799735 master-0 kubenswrapper[29936]: I1205 13:09:33.799441 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" event={"ID":"4c4b9484-f3cf-4c0e-920d-8b688607db6e","Type":"ContainerDied","Data":"59371d761fe3b8c5874b921a576308c6f68a5f668d8236783c90a8d234a2dc67"} Dec 05 13:09:33.799735 master-0 kubenswrapper[29936]: I1205 13:09:33.799549 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" event={"ID":"4c4b9484-f3cf-4c0e-920d-8b688607db6e","Type":"ContainerStarted","Data":"2a7969626962cbb667b9bfcf15335056e6d15fa012d13e52926200654bb6ad52"} Dec 05 13:09:34.816341 master-0 kubenswrapper[29936]: I1205 13:09:34.816261 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" event={"ID":"4c4b9484-f3cf-4c0e-920d-8b688607db6e","Type":"ContainerStarted","Data":"36ddbbcc8c69a451efcffda9df12c2ebf42a7b1f16b8e17284dcb4c1ad446292"} Dec 05 13:09:34.817101 master-0 kubenswrapper[29936]: I1205 13:09:34.816463 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:34.850335 master-0 kubenswrapper[29936]: I1205 13:09:34.850205 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" podStartSLOduration=3.85016138 podStartE2EDuration="3.85016138s" podCreationTimestamp="2025-12-05 13:09:31 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:09:34.840305312 +0000 UTC m=+1171.972385053" watchObservedRunningTime="2025-12-05 13:09:34.85016138 +0000 UTC m=+1171.982241061" Dec 05 13:09:35.040244 master-0 kubenswrapper[29936]: I1205 13:09:35.040129 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:35.040749 master-0 kubenswrapper[29936]: I1205 13:09:35.040680 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b060638c-8b2a-4dbb-95e5-e1d8839e9783" containerName="nova-api-log" containerID="cri-o://231c4c15fcf1b39253b2377a3bab92515e90ba7432e95e42cc28f9a61406631b" gracePeriod=30 Dec 05 13:09:35.040951 master-0 kubenswrapper[29936]: I1205 13:09:35.040894 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b060638c-8b2a-4dbb-95e5-e1d8839e9783" containerName="nova-api-api" containerID="cri-o://1dad7eae3542ce79913bd6aea7afda5d404262a9aaa677d3ac98686c03740104" gracePeriod=30 Dec 05 13:09:35.252513 master-0 kubenswrapper[29936]: I1205 13:09:35.252431 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:35.834007 master-0 kubenswrapper[29936]: I1205 13:09:35.833919 29936 generic.go:334] "Generic (PLEG): container finished" podID="b060638c-8b2a-4dbb-95e5-e1d8839e9783" containerID="231c4c15fcf1b39253b2377a3bab92515e90ba7432e95e42cc28f9a61406631b" exitCode=143 Dec 05 13:09:35.836308 master-0 kubenswrapper[29936]: I1205 13:09:35.835972 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b060638c-8b2a-4dbb-95e5-e1d8839e9783","Type":"ContainerDied","Data":"231c4c15fcf1b39253b2377a3bab92515e90ba7432e95e42cc28f9a61406631b"} Dec 05 13:09:37.069759 master-0 kubenswrapper[29936]: I1205 13:09:37.069655 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:37.069759 master-0 kubenswrapper[29936]: I1205 13:09:37.069759 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:37.125481 master-0 kubenswrapper[29936]: I1205 13:09:37.125409 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:37.921644 master-0 kubenswrapper[29936]: I1205 13:09:37.921546 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:38.277023 master-0 kubenswrapper[29936]: I1205 13:09:38.276933 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sf8xl"] Dec 05 13:09:38.894660 master-0 kubenswrapper[29936]: I1205 13:09:38.894574 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b060638c-8b2a-4dbb-95e5-e1d8839e9783","Type":"ContainerDied","Data":"1dad7eae3542ce79913bd6aea7afda5d404262a9aaa677d3ac98686c03740104"} Dec 05 13:09:38.895561 master-0 kubenswrapper[29936]: I1205 13:09:38.894527 29936 generic.go:334] "Generic (PLEG): container finished" podID="b060638c-8b2a-4dbb-95e5-e1d8839e9783" containerID="1dad7eae3542ce79913bd6aea7afda5d404262a9aaa677d3ac98686c03740104" exitCode=0 Dec 05 13:09:38.895561 master-0 kubenswrapper[29936]: 
I1205 13:09:38.894716 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 13:09:38.895561 master-0 kubenswrapper[29936]: I1205 13:09:38.895496 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b060638c-8b2a-4dbb-95e5-e1d8839e9783","Type":"ContainerDied","Data":"a6e938d5757db536032878c2d7e7c9dd41b8220058a971082de1afc80cccdaca"} Dec 05 13:09:38.895719 master-0 kubenswrapper[29936]: I1205 13:09:38.895606 29936 scope.go:117] "RemoveContainer" containerID="1dad7eae3542ce79913bd6aea7afda5d404262a9aaa677d3ac98686c03740104" Dec 05 13:09:38.933354 master-0 kubenswrapper[29936]: I1205 13:09:38.933310 29936 scope.go:117] "RemoveContainer" containerID="231c4c15fcf1b39253b2377a3bab92515e90ba7432e95e42cc28f9a61406631b" Dec 05 13:09:39.010371 master-0 kubenswrapper[29936]: I1205 13:09:39.010165 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b060638c-8b2a-4dbb-95e5-e1d8839e9783-config-data\") pod \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\" (UID: \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\") " Dec 05 13:09:39.010371 master-0 kubenswrapper[29936]: I1205 13:09:39.010287 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vmft\" (UniqueName: \"kubernetes.io/projected/b060638c-8b2a-4dbb-95e5-e1d8839e9783-kube-api-access-6vmft\") pod \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\" (UID: \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\") " Dec 05 13:09:39.021082 master-0 kubenswrapper[29936]: I1205 13:09:39.012767 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b060638c-8b2a-4dbb-95e5-e1d8839e9783-combined-ca-bundle\") pod \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\" (UID: \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\") " Dec 05 13:09:39.021082 master-0 kubenswrapper[29936]: I1205 13:09:39.012896 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b060638c-8b2a-4dbb-95e5-e1d8839e9783-logs\") pod \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\" (UID: \"b060638c-8b2a-4dbb-95e5-e1d8839e9783\") " Dec 05 13:09:39.021082 master-0 kubenswrapper[29936]: I1205 13:09:39.014793 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b060638c-8b2a-4dbb-95e5-e1d8839e9783-logs" (OuterVolumeSpecName: "logs") pod "b060638c-8b2a-4dbb-95e5-e1d8839e9783" (UID: "b060638c-8b2a-4dbb-95e5-e1d8839e9783"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:09:39.021082 master-0 kubenswrapper[29936]: I1205 13:09:39.015320 29936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b060638c-8b2a-4dbb-95e5-e1d8839e9783-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:39.021082 master-0 kubenswrapper[29936]: I1205 13:09:39.019822 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b060638c-8b2a-4dbb-95e5-e1d8839e9783-kube-api-access-6vmft" (OuterVolumeSpecName: "kube-api-access-6vmft") pod "b060638c-8b2a-4dbb-95e5-e1d8839e9783" (UID: "b060638c-8b2a-4dbb-95e5-e1d8839e9783"). InnerVolumeSpecName "kube-api-access-6vmft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:09:39.071575 master-0 kubenswrapper[29936]: I1205 13:09:39.071509 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Dec 05 13:09:39.089848 master-0 kubenswrapper[29936]: I1205 13:09:39.088013 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b060638c-8b2a-4dbb-95e5-e1d8839e9783-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b060638c-8b2a-4dbb-95e5-e1d8839e9783" (UID: "b060638c-8b2a-4dbb-95e5-e1d8839e9783"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:39.091840 master-0 kubenswrapper[29936]: I1205 13:09:39.090846 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b060638c-8b2a-4dbb-95e5-e1d8839e9783-config-data" (OuterVolumeSpecName: "config-data") pod "b060638c-8b2a-4dbb-95e5-e1d8839e9783" (UID: "b060638c-8b2a-4dbb-95e5-e1d8839e9783"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:39.118319 master-0 kubenswrapper[29936]: I1205 13:09:39.118247 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b060638c-8b2a-4dbb-95e5-e1d8839e9783-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:39.118319 master-0 kubenswrapper[29936]: I1205 13:09:39.118301 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vmft\" (UniqueName: \"kubernetes.io/projected/b060638c-8b2a-4dbb-95e5-e1d8839e9783-kube-api-access-6vmft\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:39.118319 master-0 kubenswrapper[29936]: I1205 13:09:39.118315 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b060638c-8b2a-4dbb-95e5-e1d8839e9783-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:39.136602 master-0 kubenswrapper[29936]: I1205 13:09:39.133652 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Dec 05 13:09:39.222119 master-0 kubenswrapper[29936]: I1205 13:09:39.222022 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Dec 05 13:09:39.910466 master-0 kubenswrapper[29936]: I1205 13:09:39.910359 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 13:09:39.911242 master-0 kubenswrapper[29936]: I1205 13:09:39.910827 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sf8xl" podUID="758b7407-88d4-48ea-a598-2c48af96a89e" containerName="registry-server" containerID="cri-o://7c993e2b11e2578635a54dc86c0f6eb36dcf0ef59335368fb12c8d9fec562ae5" gracePeriod=2 Dec 05 13:09:39.950384 master-0 kubenswrapper[29936]: I1205 13:09:39.950278 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:39.965909 master-0 kubenswrapper[29936]: I1205 13:09:39.965830 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:39.996452 master-0 kubenswrapper[29936]: I1205 13:09:39.994266 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:39.996452 master-0 kubenswrapper[29936]: E1205 13:09:39.994905 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b060638c-8b2a-4dbb-95e5-e1d8839e9783" containerName="nova-api-log" Dec 05 13:09:39.996452 master-0 kubenswrapper[29936]: I1205 13:09:39.994921 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b060638c-8b2a-4dbb-95e5-e1d8839e9783" containerName="nova-api-log" Dec 05 13:09:39.996452 master-0 kubenswrapper[29936]: E1205 13:09:39.994975 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b060638c-8b2a-4dbb-95e5-e1d8839e9783" containerName="nova-api-api" Dec 05 13:09:39.996452 master-0 kubenswrapper[29936]: I1205 13:09:39.994982 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="b060638c-8b2a-4dbb-95e5-e1d8839e9783" containerName="nova-api-api" Dec 05 13:09:39.996452 master-0 kubenswrapper[29936]: I1205 13:09:39.995587 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b060638c-8b2a-4dbb-95e5-e1d8839e9783" containerName="nova-api-log" Dec 05 13:09:39.996452 master-0 kubenswrapper[29936]: I1205 13:09:39.995633 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="b060638c-8b2a-4dbb-95e5-e1d8839e9783" containerName="nova-api-api" Dec 05 13:09:39.997712 master-0 kubenswrapper[29936]: I1205 13:09:39.997669 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 13:09:40.008692 master-0 kubenswrapper[29936]: I1205 13:09:40.008615 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 13:09:40.013043 master-0 kubenswrapper[29936]: I1205 13:09:40.013004 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 13:09:40.013401 master-0 kubenswrapper[29936]: I1205 13:09:40.013165 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 13:09:40.015082 master-0 kubenswrapper[29936]: I1205 13:09:40.014021 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:40.149908 master-0 kubenswrapper[29936]: I1205 13:09:40.149839 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxh2g\" (UniqueName: \"kubernetes.io/projected/a4cb64aa-e5c4-45ee-a360-160256df7967-kube-api-access-gxh2g\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.150065 master-0 kubenswrapper[29936]: I1205 13:09:40.149928 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-config-data\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.151207 master-0 kubenswrapper[29936]: I1205 13:09:40.150341 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-public-tls-certs\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.151285 master-0 kubenswrapper[29936]: I1205 13:09:40.151226 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4cb64aa-e5c4-45ee-a360-160256df7967-logs\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.171306 master-0 kubenswrapper[29936]: I1205 13:09:40.151478 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.180198 master-0 kubenswrapper[29936]: I1205 13:09:40.179310 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.257928 master-0 kubenswrapper[29936]: I1205 13:09:40.257817 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:40.280203 master-0 kubenswrapper[29936]: I1205 13:09:40.279898 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:40.282192 master-0 kubenswrapper[29936]: I1205 13:09:40.281878 29936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gxh2g\" (UniqueName: \"kubernetes.io/projected/a4cb64aa-e5c4-45ee-a360-160256df7967-kube-api-access-gxh2g\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.282192 master-0 kubenswrapper[29936]: I1205 13:09:40.281930 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-config-data\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.282192 master-0 kubenswrapper[29936]: I1205 13:09:40.282013 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-public-tls-certs\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.282192 master-0 kubenswrapper[29936]: I1205 13:09:40.282145 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4cb64aa-e5c4-45ee-a360-160256df7967-logs\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.282436 master-0 kubenswrapper[29936]: I1205 13:09:40.282234 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.282436 master-0 kubenswrapper[29936]: I1205 13:09:40.282352 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.295117 master-0 kubenswrapper[29936]: I1205 13:09:40.293544 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-config-data\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.307735 master-0 kubenswrapper[29936]: I1205 13:09:40.307609 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4cb64aa-e5c4-45ee-a360-160256df7967-logs\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.312646 master-0 kubenswrapper[29936]: I1205 13:09:40.312576 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-public-tls-certs\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.315346 master-0 kubenswrapper[29936]: I1205 13:09:40.314812 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxh2g\" (UniqueName: \"kubernetes.io/projected/a4cb64aa-e5c4-45ee-a360-160256df7967-kube-api-access-gxh2g\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.319558 master-0 
kubenswrapper[29936]: I1205 13:09:40.319494 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.322363 master-0 kubenswrapper[29936]: I1205 13:09:40.322264 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " pod="openstack/nova-api-0" Dec 05 13:09:40.441609 master-0 kubenswrapper[29936]: I1205 13:09:40.441511 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 13:09:40.558855 master-0 kubenswrapper[29936]: I1205 13:09:40.557306 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:40.695729 master-0 kubenswrapper[29936]: I1205 13:09:40.695509 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5dcf\" (UniqueName: \"kubernetes.io/projected/758b7407-88d4-48ea-a598-2c48af96a89e-kube-api-access-c5dcf\") pod \"758b7407-88d4-48ea-a598-2c48af96a89e\" (UID: \"758b7407-88d4-48ea-a598-2c48af96a89e\") " Dec 05 13:09:40.696091 master-0 kubenswrapper[29936]: I1205 13:09:40.696039 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/758b7407-88d4-48ea-a598-2c48af96a89e-catalog-content\") pod \"758b7407-88d4-48ea-a598-2c48af96a89e\" (UID: \"758b7407-88d4-48ea-a598-2c48af96a89e\") " Dec 05 13:09:40.696193 master-0 kubenswrapper[29936]: I1205 13:09:40.696138 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/758b7407-88d4-48ea-a598-2c48af96a89e-utilities\") pod \"758b7407-88d4-48ea-a598-2c48af96a89e\" (UID: \"758b7407-88d4-48ea-a598-2c48af96a89e\") " Dec 05 13:09:40.697716 master-0 kubenswrapper[29936]: I1205 13:09:40.697651 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/758b7407-88d4-48ea-a598-2c48af96a89e-utilities" (OuterVolumeSpecName: "utilities") pod "758b7407-88d4-48ea-a598-2c48af96a89e" (UID: "758b7407-88d4-48ea-a598-2c48af96a89e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:09:40.698703 master-0 kubenswrapper[29936]: I1205 13:09:40.698666 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/758b7407-88d4-48ea-a598-2c48af96a89e-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:40.698919 master-0 kubenswrapper[29936]: I1205 13:09:40.698890 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/758b7407-88d4-48ea-a598-2c48af96a89e-kube-api-access-c5dcf" (OuterVolumeSpecName: "kube-api-access-c5dcf") pod "758b7407-88d4-48ea-a598-2c48af96a89e" (UID: "758b7407-88d4-48ea-a598-2c48af96a89e"). InnerVolumeSpecName "kube-api-access-c5dcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:09:40.762279 master-0 kubenswrapper[29936]: I1205 13:09:40.762190 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/758b7407-88d4-48ea-a598-2c48af96a89e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "758b7407-88d4-48ea-a598-2c48af96a89e" (UID: "758b7407-88d4-48ea-a598-2c48af96a89e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:09:40.807431 master-0 kubenswrapper[29936]: I1205 13:09:40.804407 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/758b7407-88d4-48ea-a598-2c48af96a89e-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:40.807431 master-0 kubenswrapper[29936]: I1205 13:09:40.804477 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5dcf\" (UniqueName: \"kubernetes.io/projected/758b7407-88d4-48ea-a598-2c48af96a89e-kube-api-access-c5dcf\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:40.931380 master-0 kubenswrapper[29936]: I1205 13:09:40.931305 29936 generic.go:334] "Generic (PLEG): container finished" podID="758b7407-88d4-48ea-a598-2c48af96a89e" containerID="7c993e2b11e2578635a54dc86c0f6eb36dcf0ef59335368fb12c8d9fec562ae5" exitCode=0 Dec 05 13:09:40.932074 master-0 kubenswrapper[29936]: I1205 13:09:40.931680 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf8xl" event={"ID":"758b7407-88d4-48ea-a598-2c48af96a89e","Type":"ContainerDied","Data":"7c993e2b11e2578635a54dc86c0f6eb36dcf0ef59335368fb12c8d9fec562ae5"} Dec 05 13:09:40.932074 master-0 kubenswrapper[29936]: I1205 13:09:40.931745 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sf8xl" event={"ID":"758b7407-88d4-48ea-a598-2c48af96a89e","Type":"ContainerDied","Data":"7567432e09f313bfb4f3d2eaed0e5e5714cbabb52722b1fe09f947814ea47e72"} Dec 05 13:09:40.932074 master-0 kubenswrapper[29936]: I1205 13:09:40.931770 29936 scope.go:117] "RemoveContainer" containerID="7c993e2b11e2578635a54dc86c0f6eb36dcf0ef59335368fb12c8d9fec562ae5" Dec 05 13:09:40.932074 master-0 kubenswrapper[29936]: I1205 13:09:40.931869 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sf8xl" Dec 05 13:09:40.957588 master-0 kubenswrapper[29936]: I1205 13:09:40.957530 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 05 13:09:40.967156 master-0 kubenswrapper[29936]: I1205 13:09:40.967096 29936 scope.go:117] "RemoveContainer" containerID="986af6a43925a7f894bd062157e7c05b59921e441cd3d03fa9658c6bf92c9d50" Dec 05 13:09:41.054004 master-0 kubenswrapper[29936]: I1205 13:09:41.052927 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:41.061284 master-0 kubenswrapper[29936]: I1205 13:09:41.059418 29936 scope.go:117] "RemoveContainer" containerID="de62f0f8af8279416ed5e15bbd26cfd84acec59d68491a5116b55acda2163134" Dec 05 13:09:41.086083 master-0 kubenswrapper[29936]: I1205 13:09:41.085849 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sf8xl"] Dec 05 13:09:41.100933 master-0 kubenswrapper[29936]: I1205 13:09:41.100832 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sf8xl"] Dec 05 13:09:41.112299 master-0 kubenswrapper[29936]: W1205 13:09:41.112229 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4cb64aa_e5c4_45ee_a360_160256df7967.slice/crio-7ddb4b4c6d674ab0a49694c134ea1aeaf9c75f626cd23192507fa05d317833c9 WatchSource:0}: Error finding container 7ddb4b4c6d674ab0a49694c134ea1aeaf9c75f626cd23192507fa05d317833c9: Status 404 returned error can't find the container with id 7ddb4b4c6d674ab0a49694c134ea1aeaf9c75f626cd23192507fa05d317833c9 Dec 05 13:09:41.146555 master-0 kubenswrapper[29936]: I1205 13:09:41.139718 29936 scope.go:117] "RemoveContainer" containerID="7c993e2b11e2578635a54dc86c0f6eb36dcf0ef59335368fb12c8d9fec562ae5" Dec 05 13:09:41.146555 master-0 kubenswrapper[29936]: E1205 13:09:41.140532 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c993e2b11e2578635a54dc86c0f6eb36dcf0ef59335368fb12c8d9fec562ae5\": container with ID starting with 7c993e2b11e2578635a54dc86c0f6eb36dcf0ef59335368fb12c8d9fec562ae5 not found: ID does not exist" containerID="7c993e2b11e2578635a54dc86c0f6eb36dcf0ef59335368fb12c8d9fec562ae5" Dec 05 13:09:41.146555 master-0 kubenswrapper[29936]: I1205 13:09:41.140625 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c993e2b11e2578635a54dc86c0f6eb36dcf0ef59335368fb12c8d9fec562ae5"} err="failed to get container status \"7c993e2b11e2578635a54dc86c0f6eb36dcf0ef59335368fb12c8d9fec562ae5\": rpc error: code = NotFound desc = could not find container \"7c993e2b11e2578635a54dc86c0f6eb36dcf0ef59335368fb12c8d9fec562ae5\": container with ID starting with 7c993e2b11e2578635a54dc86c0f6eb36dcf0ef59335368fb12c8d9fec562ae5 not found: ID does not exist" Dec 05 13:09:41.146555 master-0 kubenswrapper[29936]: I1205 13:09:41.140675 29936 scope.go:117] "RemoveContainer" containerID="986af6a43925a7f894bd062157e7c05b59921e441cd3d03fa9658c6bf92c9d50" Dec 05 13:09:41.146555 master-0 kubenswrapper[29936]: E1205 13:09:41.141062 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"986af6a43925a7f894bd062157e7c05b59921e441cd3d03fa9658c6bf92c9d50\": container with ID starting with 
986af6a43925a7f894bd062157e7c05b59921e441cd3d03fa9658c6bf92c9d50 not found: ID does not exist" containerID="986af6a43925a7f894bd062157e7c05b59921e441cd3d03fa9658c6bf92c9d50" Dec 05 13:09:41.146555 master-0 kubenswrapper[29936]: I1205 13:09:41.141124 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"986af6a43925a7f894bd062157e7c05b59921e441cd3d03fa9658c6bf92c9d50"} err="failed to get container status \"986af6a43925a7f894bd062157e7c05b59921e441cd3d03fa9658c6bf92c9d50\": rpc error: code = NotFound desc = could not find container \"986af6a43925a7f894bd062157e7c05b59921e441cd3d03fa9658c6bf92c9d50\": container with ID starting with 986af6a43925a7f894bd062157e7c05b59921e441cd3d03fa9658c6bf92c9d50 not found: ID does not exist" Dec 05 13:09:41.146555 master-0 kubenswrapper[29936]: I1205 13:09:41.141143 29936 scope.go:117] "RemoveContainer" containerID="de62f0f8af8279416ed5e15bbd26cfd84acec59d68491a5116b55acda2163134" Dec 05 13:09:41.146555 master-0 kubenswrapper[29936]: E1205 13:09:41.141552 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de62f0f8af8279416ed5e15bbd26cfd84acec59d68491a5116b55acda2163134\": container with ID starting with de62f0f8af8279416ed5e15bbd26cfd84acec59d68491a5116b55acda2163134 not found: ID does not exist" containerID="de62f0f8af8279416ed5e15bbd26cfd84acec59d68491a5116b55acda2163134" Dec 05 13:09:41.146555 master-0 kubenswrapper[29936]: I1205 13:09:41.141589 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de62f0f8af8279416ed5e15bbd26cfd84acec59d68491a5116b55acda2163134"} err="failed to get container status \"de62f0f8af8279416ed5e15bbd26cfd84acec59d68491a5116b55acda2163134\": rpc error: code = NotFound desc = could not find container \"de62f0f8af8279416ed5e15bbd26cfd84acec59d68491a5116b55acda2163134\": container with ID starting with de62f0f8af8279416ed5e15bbd26cfd84acec59d68491a5116b55acda2163134 not found: ID does not exist" Dec 05 13:09:41.213012 master-0 kubenswrapper[29936]: I1205 13:09:41.212815 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="758b7407-88d4-48ea-a598-2c48af96a89e" path="/var/lib/kubelet/pods/758b7407-88d4-48ea-a598-2c48af96a89e/volumes" Dec 05 13:09:41.214397 master-0 kubenswrapper[29936]: I1205 13:09:41.214288 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b060638c-8b2a-4dbb-95e5-e1d8839e9783" path="/var/lib/kubelet/pods/b060638c-8b2a-4dbb-95e5-e1d8839e9783/volumes" Dec 05 13:09:41.288079 master-0 kubenswrapper[29936]: I1205 13:09:41.287963 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-dhdsz"] Dec 05 13:09:41.288841 master-0 kubenswrapper[29936]: E1205 13:09:41.288802 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="758b7407-88d4-48ea-a598-2c48af96a89e" containerName="extract-utilities" Dec 05 13:09:41.288841 master-0 kubenswrapper[29936]: I1205 13:09:41.288829 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="758b7407-88d4-48ea-a598-2c48af96a89e" containerName="extract-utilities" Dec 05 13:09:41.288966 master-0 kubenswrapper[29936]: E1205 13:09:41.288887 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="758b7407-88d4-48ea-a598-2c48af96a89e" containerName="registry-server" Dec 05 13:09:41.288966 master-0 kubenswrapper[29936]: I1205 13:09:41.288896 29936 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="758b7407-88d4-48ea-a598-2c48af96a89e" containerName="registry-server" Dec 05 13:09:41.288966 master-0 kubenswrapper[29936]: E1205 13:09:41.288923 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="758b7407-88d4-48ea-a598-2c48af96a89e" containerName="extract-content" Dec 05 13:09:41.288966 master-0 kubenswrapper[29936]: I1205 13:09:41.288931 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="758b7407-88d4-48ea-a598-2c48af96a89e" containerName="extract-content" Dec 05 13:09:41.289235 master-0 kubenswrapper[29936]: I1205 13:09:41.289213 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="758b7407-88d4-48ea-a598-2c48af96a89e" containerName="registry-server" Dec 05 13:09:41.290394 master-0 kubenswrapper[29936]: I1205 13:09:41.290362 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dhdsz" Dec 05 13:09:41.295335 master-0 kubenswrapper[29936]: I1205 13:09:41.293792 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 05 13:09:41.295335 master-0 kubenswrapper[29936]: I1205 13:09:41.294538 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 05 13:09:41.304527 master-0 kubenswrapper[29936]: I1205 13:09:41.303318 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-host-discover-2pvkn"] Dec 05 13:09:41.306268 master-0 kubenswrapper[29936]: I1205 13:09:41.306230 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-2pvkn" Dec 05 13:09:41.322706 master-0 kubenswrapper[29936]: I1205 13:09:41.320995 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dhdsz"] Dec 05 13:09:41.344343 master-0 kubenswrapper[29936]: I1205 13:09:41.344259 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-2pvkn"] Dec 05 13:09:41.431193 master-0 kubenswrapper[29936]: I1205 13:09:41.431090 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-scripts\") pod \"nova-cell1-host-discover-2pvkn\" (UID: \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\") " pod="openstack/nova-cell1-host-discover-2pvkn" Dec 05 13:09:41.431445 master-0 kubenswrapper[29936]: I1205 13:09:41.431250 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-config-data\") pod \"nova-cell1-cell-mapping-dhdsz\" (UID: \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\") " pod="openstack/nova-cell1-cell-mapping-dhdsz" Dec 05 13:09:41.431490 master-0 kubenswrapper[29936]: I1205 13:09:41.431453 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-combined-ca-bundle\") pod \"nova-cell1-host-discover-2pvkn\" (UID: \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\") " pod="openstack/nova-cell1-host-discover-2pvkn" Dec 05 13:09:41.431572 master-0 kubenswrapper[29936]: I1205 13:09:41.431541 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt2ct\" (UniqueName: 
\"kubernetes.io/projected/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-kube-api-access-rt2ct\") pod \"nova-cell1-host-discover-2pvkn\" (UID: \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\") " pod="openstack/nova-cell1-host-discover-2pvkn" Dec 05 13:09:41.431763 master-0 kubenswrapper[29936]: I1205 13:09:41.431706 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dhdsz\" (UID: \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\") " pod="openstack/nova-cell1-cell-mapping-dhdsz" Dec 05 13:09:41.432292 master-0 kubenswrapper[29936]: I1205 13:09:41.432006 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-config-data\") pod \"nova-cell1-host-discover-2pvkn\" (UID: \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\") " pod="openstack/nova-cell1-host-discover-2pvkn" Dec 05 13:09:41.432292 master-0 kubenswrapper[29936]: I1205 13:09:41.432108 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8hwq\" (UniqueName: \"kubernetes.io/projected/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-kube-api-access-v8hwq\") pod \"nova-cell1-cell-mapping-dhdsz\" (UID: \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\") " pod="openstack/nova-cell1-cell-mapping-dhdsz" Dec 05 13:09:41.432292 master-0 kubenswrapper[29936]: I1205 13:09:41.432215 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-scripts\") pod \"nova-cell1-cell-mapping-dhdsz\" (UID: \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\") " pod="openstack/nova-cell1-cell-mapping-dhdsz" Dec 05 13:09:41.536049 master-0 kubenswrapper[29936]: I1205 13:09:41.535997 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-scripts\") pod \"nova-cell1-host-discover-2pvkn\" (UID: \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\") " pod="openstack/nova-cell1-host-discover-2pvkn" Dec 05 13:09:41.536244 master-0 kubenswrapper[29936]: I1205 13:09:41.536227 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-config-data\") pod \"nova-cell1-cell-mapping-dhdsz\" (UID: \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\") " pod="openstack/nova-cell1-cell-mapping-dhdsz" Dec 05 13:09:41.536415 master-0 kubenswrapper[29936]: I1205 13:09:41.536399 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-combined-ca-bundle\") pod \"nova-cell1-host-discover-2pvkn\" (UID: \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\") " pod="openstack/nova-cell1-host-discover-2pvkn" Dec 05 13:09:41.536528 master-0 kubenswrapper[29936]: I1205 13:09:41.536510 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt2ct\" (UniqueName: \"kubernetes.io/projected/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-kube-api-access-rt2ct\") pod \"nova-cell1-host-discover-2pvkn\" (UID: \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\") " pod="openstack/nova-cell1-host-discover-2pvkn" Dec 05 
13:09:41.536626 master-0 kubenswrapper[29936]: I1205 13:09:41.536610 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dhdsz\" (UID: \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\") " pod="openstack/nova-cell1-cell-mapping-dhdsz" Dec 05 13:09:41.536769 master-0 kubenswrapper[29936]: I1205 13:09:41.536753 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-config-data\") pod \"nova-cell1-host-discover-2pvkn\" (UID: \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\") " pod="openstack/nova-cell1-host-discover-2pvkn" Dec 05 13:09:41.536896 master-0 kubenswrapper[29936]: I1205 13:09:41.536878 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8hwq\" (UniqueName: \"kubernetes.io/projected/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-kube-api-access-v8hwq\") pod \"nova-cell1-cell-mapping-dhdsz\" (UID: \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\") " pod="openstack/nova-cell1-cell-mapping-dhdsz" Dec 05 13:09:41.538054 master-0 kubenswrapper[29936]: I1205 13:09:41.538033 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-scripts\") pod \"nova-cell1-cell-mapping-dhdsz\" (UID: \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\") " pod="openstack/nova-cell1-cell-mapping-dhdsz" Dec 05 13:09:41.546225 master-0 kubenswrapper[29936]: I1205 13:09:41.546153 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-scripts\") pod \"nova-cell1-host-discover-2pvkn\" (UID: \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\") " pod="openstack/nova-cell1-host-discover-2pvkn" Dec 05 13:09:41.546476 master-0 kubenswrapper[29936]: I1205 13:09:41.546153 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-scripts\") pod \"nova-cell1-cell-mapping-dhdsz\" (UID: \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\") " pod="openstack/nova-cell1-cell-mapping-dhdsz" Dec 05 13:09:41.546476 master-0 kubenswrapper[29936]: I1205 13:09:41.546337 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-config-data\") pod \"nova-cell1-cell-mapping-dhdsz\" (UID: \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\") " pod="openstack/nova-cell1-cell-mapping-dhdsz" Dec 05 13:09:41.546476 master-0 kubenswrapper[29936]: I1205 13:09:41.546370 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-config-data\") pod \"nova-cell1-host-discover-2pvkn\" (UID: \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\") " pod="openstack/nova-cell1-host-discover-2pvkn" Dec 05 13:09:41.547056 master-0 kubenswrapper[29936]: I1205 13:09:41.547010 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dhdsz\" (UID: \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\") " pod="openstack/nova-cell1-cell-mapping-dhdsz" Dec 
05 13:09:41.551368 master-0 kubenswrapper[29936]: I1205 13:09:41.551315 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-combined-ca-bundle\") pod \"nova-cell1-host-discover-2pvkn\" (UID: \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\") " pod="openstack/nova-cell1-host-discover-2pvkn" Dec 05 13:09:41.561408 master-0 kubenswrapper[29936]: I1205 13:09:41.561301 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt2ct\" (UniqueName: \"kubernetes.io/projected/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-kube-api-access-rt2ct\") pod \"nova-cell1-host-discover-2pvkn\" (UID: \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\") " pod="openstack/nova-cell1-host-discover-2pvkn" Dec 05 13:09:41.563381 master-0 kubenswrapper[29936]: I1205 13:09:41.563321 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8hwq\" (UniqueName: \"kubernetes.io/projected/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-kube-api-access-v8hwq\") pod \"nova-cell1-cell-mapping-dhdsz\" (UID: \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\") " pod="openstack/nova-cell1-cell-mapping-dhdsz" Dec 05 13:09:41.711485 master-0 kubenswrapper[29936]: I1205 13:09:41.711412 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dhdsz" Dec 05 13:09:41.721891 master-0 kubenswrapper[29936]: I1205 13:09:41.721825 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-2pvkn" Dec 05 13:09:41.994811 master-0 kubenswrapper[29936]: I1205 13:09:41.994054 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4cb64aa-e5c4-45ee-a360-160256df7967","Type":"ContainerStarted","Data":"72c120a721a4769f189f0d15830aa309a3f9c0dc62741f5e2972ac4e2e7718aa"} Dec 05 13:09:41.994811 master-0 kubenswrapper[29936]: I1205 13:09:41.994132 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4cb64aa-e5c4-45ee-a360-160256df7967","Type":"ContainerStarted","Data":"2799feec0e284ffd96af499d04e4ba4630eb006dd9e6984ead7bf7767f7c6c70"} Dec 05 13:09:41.994811 master-0 kubenswrapper[29936]: I1205 13:09:41.994143 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4cb64aa-e5c4-45ee-a360-160256df7967","Type":"ContainerStarted","Data":"7ddb4b4c6d674ab0a49694c134ea1aeaf9c75f626cd23192507fa05d317833c9"} Dec 05 13:09:42.031754 master-0 kubenswrapper[29936]: I1205 13:09:42.031301 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.031240521 podStartE2EDuration="3.031240521s" podCreationTimestamp="2025-12-05 13:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:09:42.026911427 +0000 UTC m=+1179.158991138" watchObservedRunningTime="2025-12-05 13:09:42.031240521 +0000 UTC m=+1179.163320212" Dec 05 13:09:42.302676 master-0 kubenswrapper[29936]: I1205 13:09:42.301954 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dhdsz"] Dec 05 13:09:42.451669 master-0 kubenswrapper[29936]: I1205 13:09:42.451539 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-2pvkn"] Dec 05 13:09:42.473952 master-0 kubenswrapper[29936]: I1205 
13:09:42.473543 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dbb98f75c-dzrtw" Dec 05 13:09:42.576866 master-0 kubenswrapper[29936]: I1205 13:09:42.574321 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6944864c6f-cr675"] Dec 05 13:09:42.576866 master-0 kubenswrapper[29936]: I1205 13:09:42.574647 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6944864c6f-cr675" podUID="df45b5b9-1e68-471c-96e9-fa2906275144" containerName="dnsmasq-dns" containerID="cri-o://2172bd5a894f6f100bfd24ed6f31529fb361c734f767f3fa24c59434d96d049b" gracePeriod=10 Dec 05 13:09:43.016957 master-0 kubenswrapper[29936]: I1205 13:09:43.016866 29936 generic.go:334] "Generic (PLEG): container finished" podID="df45b5b9-1e68-471c-96e9-fa2906275144" containerID="2172bd5a894f6f100bfd24ed6f31529fb361c734f767f3fa24c59434d96d049b" exitCode=0 Dec 05 13:09:43.016957 master-0 kubenswrapper[29936]: I1205 13:09:43.016958 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6944864c6f-cr675" event={"ID":"df45b5b9-1e68-471c-96e9-fa2906275144","Type":"ContainerDied","Data":"2172bd5a894f6f100bfd24ed6f31529fb361c734f767f3fa24c59434d96d049b"} Dec 05 13:09:43.020136 master-0 kubenswrapper[29936]: I1205 13:09:43.020096 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-2pvkn" event={"ID":"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49","Type":"ContainerStarted","Data":"0cce25521ec0474b3003ac709ac458cbdc007bb277128c3f5ecd9ce546329141"} Dec 05 13:09:43.020251 master-0 kubenswrapper[29936]: I1205 13:09:43.020138 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-2pvkn" event={"ID":"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49","Type":"ContainerStarted","Data":"5b013db07e1e1041769086cf68207f4010410e61b9e16a2363ecc6ea964ea522"} Dec 05 13:09:43.027314 master-0 kubenswrapper[29936]: I1205 13:09:43.027150 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dhdsz" event={"ID":"15176a26-f0f3-4bd2-a9b2-6f450e107ae1","Type":"ContainerStarted","Data":"e24a4fcbcff7a0386a8428428e036961f43ecd0e8e57014ff38a86de77a19507"} Dec 05 13:09:43.027404 master-0 kubenswrapper[29936]: I1205 13:09:43.027356 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dhdsz" event={"ID":"15176a26-f0f3-4bd2-a9b2-6f450e107ae1","Type":"ContainerStarted","Data":"308baf53cfc39ecea8b3109256eb34cbb02424a760d020b4eb92177f816f4a5a"} Dec 05 13:09:43.080593 master-0 kubenswrapper[29936]: I1205 13:09:43.069239 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-host-discover-2pvkn" podStartSLOduration=2.069214567 podStartE2EDuration="2.069214567s" podCreationTimestamp="2025-12-05 13:09:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:09:43.060377723 +0000 UTC m=+1180.192457424" watchObservedRunningTime="2025-12-05 13:09:43.069214567 +0000 UTC m=+1180.201294258" Dec 05 13:09:43.084457 master-0 kubenswrapper[29936]: I1205 13:09:43.084339 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-dhdsz" podStartSLOduration=2.08430858 podStartE2EDuration="2.08430858s" podCreationTimestamp="2025-12-05 13:09:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:09:43.078039659 +0000 UTC m=+1180.210119340" watchObservedRunningTime="2025-12-05 13:09:43.08430858 +0000 UTC m=+1180.216388261" Dec 05 13:09:43.223221 master-0 kubenswrapper[29936]: I1205 13:09:43.222916 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:09:43.386604 master-0 kubenswrapper[29936]: I1205 13:09:43.386538 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-dns-svc\") pod \"df45b5b9-1e68-471c-96e9-fa2906275144\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " Dec 05 13:09:43.386829 master-0 kubenswrapper[29936]: I1205 13:09:43.386672 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-ovsdbserver-nb\") pod \"df45b5b9-1e68-471c-96e9-fa2906275144\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " Dec 05 13:09:43.386992 master-0 kubenswrapper[29936]: I1205 13:09:43.386872 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-ovsdbserver-sb\") pod \"df45b5b9-1e68-471c-96e9-fa2906275144\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " Dec 05 13:09:43.386992 master-0 kubenswrapper[29936]: I1205 13:09:43.386925 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvkph\" (UniqueName: \"kubernetes.io/projected/df45b5b9-1e68-471c-96e9-fa2906275144-kube-api-access-jvkph\") pod \"df45b5b9-1e68-471c-96e9-fa2906275144\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " Dec 05 13:09:43.387105 master-0 kubenswrapper[29936]: I1205 13:09:43.387073 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-dns-swift-storage-0\") pod \"df45b5b9-1e68-471c-96e9-fa2906275144\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " Dec 05 13:09:43.387176 master-0 kubenswrapper[29936]: I1205 13:09:43.387119 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-config\") pod \"df45b5b9-1e68-471c-96e9-fa2906275144\" (UID: \"df45b5b9-1e68-471c-96e9-fa2906275144\") " Dec 05 13:09:43.391531 master-0 kubenswrapper[29936]: I1205 13:09:43.391455 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df45b5b9-1e68-471c-96e9-fa2906275144-kube-api-access-jvkph" (OuterVolumeSpecName: "kube-api-access-jvkph") pod "df45b5b9-1e68-471c-96e9-fa2906275144" (UID: "df45b5b9-1e68-471c-96e9-fa2906275144"). InnerVolumeSpecName "kube-api-access-jvkph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:09:43.462025 master-0 kubenswrapper[29936]: I1205 13:09:43.461953 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df45b5b9-1e68-471c-96e9-fa2906275144" (UID: "df45b5b9-1e68-471c-96e9-fa2906275144"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:09:43.464364 master-0 kubenswrapper[29936]: I1205 13:09:43.464241 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "df45b5b9-1e68-471c-96e9-fa2906275144" (UID: "df45b5b9-1e68-471c-96e9-fa2906275144"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:09:43.464520 master-0 kubenswrapper[29936]: I1205 13:09:43.464481 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df45b5b9-1e68-471c-96e9-fa2906275144" (UID: "df45b5b9-1e68-471c-96e9-fa2906275144"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:09:43.472252 master-0 kubenswrapper[29936]: I1205 13:09:43.472168 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-config" (OuterVolumeSpecName: "config") pod "df45b5b9-1e68-471c-96e9-fa2906275144" (UID: "df45b5b9-1e68-471c-96e9-fa2906275144"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:09:43.486223 master-0 kubenswrapper[29936]: I1205 13:09:43.486037 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df45b5b9-1e68-471c-96e9-fa2906275144" (UID: "df45b5b9-1e68-471c-96e9-fa2906275144"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:09:43.492506 master-0 kubenswrapper[29936]: I1205 13:09:43.492341 29936 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:43.492506 master-0 kubenswrapper[29936]: I1205 13:09:43.492420 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:43.492506 master-0 kubenswrapper[29936]: I1205 13:09:43.492440 29936 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:43.492506 master-0 kubenswrapper[29936]: I1205 13:09:43.492453 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvkph\" (UniqueName: \"kubernetes.io/projected/df45b5b9-1e68-471c-96e9-fa2906275144-kube-api-access-jvkph\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:43.492506 master-0 kubenswrapper[29936]: I1205 13:09:43.492466 29936 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:43.492506 master-0 kubenswrapper[29936]: I1205 13:09:43.492477 29936 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df45b5b9-1e68-471c-96e9-fa2906275144-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:44.048686 master-0 kubenswrapper[29936]: I1205 13:09:44.048612 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6944864c6f-cr675" Dec 05 13:09:44.048686 master-0 kubenswrapper[29936]: I1205 13:09:44.048613 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6944864c6f-cr675" event={"ID":"df45b5b9-1e68-471c-96e9-fa2906275144","Type":"ContainerDied","Data":"271191a48769cb61a9c31e06d907a409076fb0f5b5f664d7eb63c91049cc951e"} Dec 05 13:09:44.049671 master-0 kubenswrapper[29936]: I1205 13:09:44.048745 29936 scope.go:117] "RemoveContainer" containerID="2172bd5a894f6f100bfd24ed6f31529fb361c734f767f3fa24c59434d96d049b" Dec 05 13:09:44.087008 master-0 kubenswrapper[29936]: I1205 13:09:44.085052 29936 scope.go:117] "RemoveContainer" containerID="4206e9f62b5ab6541d42114fce6a197dc040814569439ea421700d3f527a2b9a" Dec 05 13:09:44.117890 master-0 kubenswrapper[29936]: I1205 13:09:44.117261 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6944864c6f-cr675"] Dec 05 13:09:44.137190 master-0 kubenswrapper[29936]: I1205 13:09:44.137103 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6944864c6f-cr675"] Dec 05 13:09:45.212908 master-0 kubenswrapper[29936]: I1205 13:09:45.212823 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df45b5b9-1e68-471c-96e9-fa2906275144" path="/var/lib/kubelet/pods/df45b5b9-1e68-471c-96e9-fa2906275144/volumes" Dec 05 13:09:46.083959 master-0 kubenswrapper[29936]: I1205 13:09:46.083799 29936 generic.go:334] "Generic (PLEG): container finished" podID="dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49" containerID="0cce25521ec0474b3003ac709ac458cbdc007bb277128c3f5ecd9ce546329141" exitCode=0 Dec 05 13:09:46.083959 master-0 kubenswrapper[29936]: I1205 13:09:46.083908 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-2pvkn" event={"ID":"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49","Type":"ContainerDied","Data":"0cce25521ec0474b3003ac709ac458cbdc007bb277128c3f5ecd9ce546329141"} Dec 05 13:09:47.612760 master-0 kubenswrapper[29936]: I1205 13:09:47.612638 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-2pvkn" Dec 05 13:09:47.738333 master-0 kubenswrapper[29936]: I1205 13:09:47.737807 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-combined-ca-bundle\") pod \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\" (UID: \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\") " Dec 05 13:09:47.738333 master-0 kubenswrapper[29936]: I1205 13:09:47.737978 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-scripts\") pod \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\" (UID: \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\") " Dec 05 13:09:47.738695 master-0 kubenswrapper[29936]: I1205 13:09:47.738356 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-config-data\") pod \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\" (UID: \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\") " Dec 05 13:09:47.738695 master-0 kubenswrapper[29936]: I1205 13:09:47.738486 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt2ct\" (UniqueName: \"kubernetes.io/projected/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-kube-api-access-rt2ct\") pod \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\" (UID: \"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49\") " Dec 05 13:09:47.745764 master-0 kubenswrapper[29936]: I1205 13:09:47.744717 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-kube-api-access-rt2ct" (OuterVolumeSpecName: "kube-api-access-rt2ct") pod "dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49" (UID: "dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49"). InnerVolumeSpecName "kube-api-access-rt2ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:09:47.746944 master-0 kubenswrapper[29936]: I1205 13:09:47.746582 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-scripts" (OuterVolumeSpecName: "scripts") pod "dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49" (UID: "dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:47.772425 master-0 kubenswrapper[29936]: I1205 13:09:47.772125 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49" (UID: "dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:47.791067 master-0 kubenswrapper[29936]: I1205 13:09:47.790908 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-config-data" (OuterVolumeSpecName: "config-data") pod "dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49" (UID: "dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:47.842556 master-0 kubenswrapper[29936]: I1205 13:09:47.842468 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:47.842556 master-0 kubenswrapper[29936]: I1205 13:09:47.842548 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt2ct\" (UniqueName: \"kubernetes.io/projected/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-kube-api-access-rt2ct\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:47.842556 master-0 kubenswrapper[29936]: I1205 13:09:47.842567 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:47.842556 master-0 kubenswrapper[29936]: I1205 13:09:47.842577 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:48.132935 master-0 kubenswrapper[29936]: I1205 13:09:48.132724 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-2pvkn" Dec 05 13:09:48.132935 master-0 kubenswrapper[29936]: I1205 13:09:48.132719 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-2pvkn" event={"ID":"dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49","Type":"ContainerDied","Data":"5b013db07e1e1041769086cf68207f4010410e61b9e16a2363ecc6ea964ea522"} Dec 05 13:09:48.132935 master-0 kubenswrapper[29936]: I1205 13:09:48.132833 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b013db07e1e1041769086cf68207f4010410e61b9e16a2363ecc6ea964ea522" Dec 05 13:09:49.151746 master-0 kubenswrapper[29936]: I1205 13:09:49.151572 29936 generic.go:334] "Generic (PLEG): container finished" podID="15176a26-f0f3-4bd2-a9b2-6f450e107ae1" containerID="e24a4fcbcff7a0386a8428428e036961f43ecd0e8e57014ff38a86de77a19507" exitCode=0 Dec 05 13:09:49.151746 master-0 kubenswrapper[29936]: I1205 13:09:49.151640 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dhdsz" event={"ID":"15176a26-f0f3-4bd2-a9b2-6f450e107ae1","Type":"ContainerDied","Data":"e24a4fcbcff7a0386a8428428e036961f43ecd0e8e57014ff38a86de77a19507"} Dec 05 13:09:50.442089 master-0 kubenswrapper[29936]: I1205 13:09:50.441938 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 13:09:50.442089 master-0 kubenswrapper[29936]: I1205 13:09:50.442076 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 13:09:50.781259 master-0 kubenswrapper[29936]: I1205 13:09:50.781127 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dhdsz" Dec 05 13:09:50.833779 master-0 kubenswrapper[29936]: I1205 13:09:50.833572 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-config-data\") pod \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\" (UID: \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\") " Dec 05 13:09:50.834134 master-0 kubenswrapper[29936]: I1205 13:09:50.833979 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-scripts\") pod \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\" (UID: \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\") " Dec 05 13:09:50.834134 master-0 kubenswrapper[29936]: I1205 13:09:50.834122 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-combined-ca-bundle\") pod \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\" (UID: \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\") " Dec 05 13:09:50.834282 master-0 kubenswrapper[29936]: I1205 13:09:50.834224 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8hwq\" (UniqueName: \"kubernetes.io/projected/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-kube-api-access-v8hwq\") pod \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\" (UID: \"15176a26-f0f3-4bd2-a9b2-6f450e107ae1\") " Dec 05 13:09:50.838534 master-0 kubenswrapper[29936]: I1205 13:09:50.838487 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-kube-api-access-v8hwq" (OuterVolumeSpecName: "kube-api-access-v8hwq") pod "15176a26-f0f3-4bd2-a9b2-6f450e107ae1" (UID: "15176a26-f0f3-4bd2-a9b2-6f450e107ae1"). InnerVolumeSpecName "kube-api-access-v8hwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:09:50.839481 master-0 kubenswrapper[29936]: I1205 13:09:50.839382 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-scripts" (OuterVolumeSpecName: "scripts") pod "15176a26-f0f3-4bd2-a9b2-6f450e107ae1" (UID: "15176a26-f0f3-4bd2-a9b2-6f450e107ae1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:50.886335 master-0 kubenswrapper[29936]: I1205 13:09:50.886255 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15176a26-f0f3-4bd2-a9b2-6f450e107ae1" (UID: "15176a26-f0f3-4bd2-a9b2-6f450e107ae1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:50.905449 master-0 kubenswrapper[29936]: I1205 13:09:50.905305 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-config-data" (OuterVolumeSpecName: "config-data") pod "15176a26-f0f3-4bd2-a9b2-6f450e107ae1" (UID: "15176a26-f0f3-4bd2-a9b2-6f450e107ae1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:50.938662 master-0 kubenswrapper[29936]: I1205 13:09:50.938581 29936 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-scripts\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:50.938662 master-0 kubenswrapper[29936]: I1205 13:09:50.938650 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:50.938662 master-0 kubenswrapper[29936]: I1205 13:09:50.938666 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8hwq\" (UniqueName: \"kubernetes.io/projected/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-kube-api-access-v8hwq\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:50.938662 master-0 kubenswrapper[29936]: I1205 13:09:50.938682 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15176a26-f0f3-4bd2-a9b2-6f450e107ae1-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:51.182007 master-0 kubenswrapper[29936]: I1205 13:09:51.181923 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dhdsz" event={"ID":"15176a26-f0f3-4bd2-a9b2-6f450e107ae1","Type":"ContainerDied","Data":"308baf53cfc39ecea8b3109256eb34cbb02424a760d020b4eb92177f816f4a5a"} Dec 05 13:09:51.182007 master-0 kubenswrapper[29936]: I1205 13:09:51.181998 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="308baf53cfc39ecea8b3109256eb34cbb02424a760d020b4eb92177f816f4a5a" Dec 05 13:09:51.182385 master-0 kubenswrapper[29936]: I1205 13:09:51.182019 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dhdsz" Dec 05 13:09:51.416093 master-0 kubenswrapper[29936]: I1205 13:09:51.415265 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:51.416093 master-0 kubenswrapper[29936]: I1205 13:09:51.415754 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a4cb64aa-e5c4-45ee-a360-160256df7967" containerName="nova-api-log" containerID="cri-o://2799feec0e284ffd96af499d04e4ba4630eb006dd9e6984ead7bf7767f7c6c70" gracePeriod=30 Dec 05 13:09:51.416093 master-0 kubenswrapper[29936]: I1205 13:09:51.415966 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a4cb64aa-e5c4-45ee-a360-160256df7967" containerName="nova-api-api" containerID="cri-o://72c120a721a4769f189f0d15830aa309a3f9c0dc62741f5e2972ac4e2e7718aa" gracePeriod=30 Dec 05 13:09:51.443273 master-0 kubenswrapper[29936]: I1205 13:09:51.440631 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a4cb64aa-e5c4-45ee-a360-160256df7967" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.128.1.22:8774/\": EOF" Dec 05 13:09:51.443273 master-0 kubenswrapper[29936]: I1205 13:09:51.440893 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a4cb64aa-e5c4-45ee-a360-160256df7967" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.128.1.22:8774/\": EOF" Dec 05 13:09:51.443273 master-0 kubenswrapper[29936]: I1205 13:09:51.440951 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 13:09:51.443273 master-0 kubenswrapper[29936]: I1205 13:09:51.441232 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3d8d3713-fc4a-42f0-8e0c-4cdcb6589027" containerName="nova-scheduler-scheduler" containerID="cri-o://ac0e76af22a910b46e3085e921399cae1f3ec5a05aff26ee7c7784cc7f63dfef" gracePeriod=30 Dec 05 13:09:51.498321 master-0 kubenswrapper[29936]: I1205 13:09:51.497871 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 13:09:51.501640 master-0 kubenswrapper[29936]: I1205 13:09:51.498536 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="884426a8-a4eb-4387-a9ab-546f0844b879" containerName="nova-metadata-log" containerID="cri-o://edaaf002e5a85715d16c07bb939e87620e427ec80567ef22989739166fd43448" gracePeriod=30 Dec 05 13:09:51.501640 master-0 kubenswrapper[29936]: I1205 13:09:51.499342 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="884426a8-a4eb-4387-a9ab-546f0844b879" containerName="nova-metadata-metadata" containerID="cri-o://611789a47c38f1fdc0977a80a51e6e86b0b6450cebc4e4bb4f961783beec151a" gracePeriod=30 Dec 05 13:09:52.201911 master-0 kubenswrapper[29936]: I1205 13:09:52.201715 29936 generic.go:334] "Generic (PLEG): container finished" podID="a4cb64aa-e5c4-45ee-a360-160256df7967" containerID="2799feec0e284ffd96af499d04e4ba4630eb006dd9e6984ead7bf7767f7c6c70" exitCode=143 Dec 05 13:09:52.201911 master-0 kubenswrapper[29936]: I1205 13:09:52.201826 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"a4cb64aa-e5c4-45ee-a360-160256df7967","Type":"ContainerDied","Data":"2799feec0e284ffd96af499d04e4ba4630eb006dd9e6984ead7bf7767f7c6c70"} Dec 05 13:09:52.206125 master-0 kubenswrapper[29936]: I1205 13:09:52.206085 29936 generic.go:334] "Generic (PLEG): container finished" podID="884426a8-a4eb-4387-a9ab-546f0844b879" containerID="edaaf002e5a85715d16c07bb939e87620e427ec80567ef22989739166fd43448" exitCode=143 Dec 05 13:09:52.206410 master-0 kubenswrapper[29936]: I1205 13:09:52.206148 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"884426a8-a4eb-4387-a9ab-546f0844b879","Type":"ContainerDied","Data":"edaaf002e5a85715d16c07bb939e87620e427ec80567ef22989739166fd43448"} Dec 05 13:09:52.878283 master-0 kubenswrapper[29936]: E1205 13:09:52.877831 29936 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0e76af22a910b46e3085e921399cae1f3ec5a05aff26ee7c7784cc7f63dfef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 13:09:52.883083 master-0 kubenswrapper[29936]: E1205 13:09:52.881257 29936 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0e76af22a910b46e3085e921399cae1f3ec5a05aff26ee7c7784cc7f63dfef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 13:09:52.884934 master-0 kubenswrapper[29936]: E1205 13:09:52.884799 29936 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ac0e76af22a910b46e3085e921399cae1f3ec5a05aff26ee7c7784cc7f63dfef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 13:09:52.885093 master-0 kubenswrapper[29936]: E1205 13:09:52.884952 29936 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="3d8d3713-fc4a-42f0-8e0c-4cdcb6589027" containerName="nova-scheduler-scheduler" Dec 05 13:09:54.661734 master-0 kubenswrapper[29936]: I1205 13:09:54.661620 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="884426a8-a4eb-4387-a9ab-546f0844b879" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.15:8775/\": read tcp 10.128.0.2:40650->10.128.1.15:8775: read: connection reset by peer" Dec 05 13:09:54.662493 master-0 kubenswrapper[29936]: I1205 13:09:54.661626 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="884426a8-a4eb-4387-a9ab-546f0844b879" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.15:8775/\": read tcp 10.128.0.2:40662->10.128.1.15:8775: read: connection reset by peer" Dec 05 13:09:55.254285 master-0 kubenswrapper[29936]: I1205 13:09:55.250922 29936 generic.go:334] "Generic (PLEG): container finished" podID="884426a8-a4eb-4387-a9ab-546f0844b879" containerID="611789a47c38f1fdc0977a80a51e6e86b0b6450cebc4e4bb4f961783beec151a" exitCode=0 Dec 05 13:09:55.254285 master-0 kubenswrapper[29936]: I1205 13:09:55.250988 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"884426a8-a4eb-4387-a9ab-546f0844b879","Type":"ContainerDied","Data":"611789a47c38f1fdc0977a80a51e6e86b0b6450cebc4e4bb4f961783beec151a"} Dec 05 13:09:55.254285 master-0 kubenswrapper[29936]: I1205 13:09:55.251021 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"884426a8-a4eb-4387-a9ab-546f0844b879","Type":"ContainerDied","Data":"257e24abc28a345e468534fbf1d2e3f6df66ddb767d21ae3840ab5db87e95d8a"} Dec 05 13:09:55.254285 master-0 kubenswrapper[29936]: I1205 13:09:55.251033 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="257e24abc28a345e468534fbf1d2e3f6df66ddb767d21ae3840ab5db87e95d8a" Dec 05 13:09:55.291474 master-0 kubenswrapper[29936]: I1205 13:09:55.291392 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 13:09:55.377240 master-0 kubenswrapper[29936]: I1205 13:09:55.377029 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/884426a8-a4eb-4387-a9ab-546f0844b879-logs\") pod \"884426a8-a4eb-4387-a9ab-546f0844b879\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " Dec 05 13:09:55.377240 master-0 kubenswrapper[29936]: I1205 13:09:55.377118 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcp7k\" (UniqueName: \"kubernetes.io/projected/884426a8-a4eb-4387-a9ab-546f0844b879-kube-api-access-qcp7k\") pod \"884426a8-a4eb-4387-a9ab-546f0844b879\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " Dec 05 13:09:55.377240 master-0 kubenswrapper[29936]: I1205 13:09:55.377210 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-nova-metadata-tls-certs\") pod \"884426a8-a4eb-4387-a9ab-546f0844b879\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " Dec 05 13:09:55.377678 master-0 kubenswrapper[29936]: I1205 13:09:55.377654 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-config-data\") pod \"884426a8-a4eb-4387-a9ab-546f0844b879\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " Dec 05 13:09:55.377823 master-0 kubenswrapper[29936]: I1205 13:09:55.377763 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-combined-ca-bundle\") pod \"884426a8-a4eb-4387-a9ab-546f0844b879\" (UID: \"884426a8-a4eb-4387-a9ab-546f0844b879\") " Dec 05 13:09:55.381298 master-0 kubenswrapper[29936]: I1205 13:09:55.380906 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/884426a8-a4eb-4387-a9ab-546f0844b879-logs" (OuterVolumeSpecName: "logs") pod "884426a8-a4eb-4387-a9ab-546f0844b879" (UID: "884426a8-a4eb-4387-a9ab-546f0844b879"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:09:55.398556 master-0 kubenswrapper[29936]: I1205 13:09:55.398478 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/884426a8-a4eb-4387-a9ab-546f0844b879-kube-api-access-qcp7k" (OuterVolumeSpecName: "kube-api-access-qcp7k") pod "884426a8-a4eb-4387-a9ab-546f0844b879" (UID: "884426a8-a4eb-4387-a9ab-546f0844b879"). InnerVolumeSpecName "kube-api-access-qcp7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:09:55.421559 master-0 kubenswrapper[29936]: I1205 13:09:55.421441 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "884426a8-a4eb-4387-a9ab-546f0844b879" (UID: "884426a8-a4eb-4387-a9ab-546f0844b879"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:55.436423 master-0 kubenswrapper[29936]: I1205 13:09:55.436366 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-config-data" (OuterVolumeSpecName: "config-data") pod "884426a8-a4eb-4387-a9ab-546f0844b879" (UID: "884426a8-a4eb-4387-a9ab-546f0844b879"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:55.467546 master-0 kubenswrapper[29936]: I1205 13:09:55.467466 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "884426a8-a4eb-4387-a9ab-546f0844b879" (UID: "884426a8-a4eb-4387-a9ab-546f0844b879"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:55.482067 master-0 kubenswrapper[29936]: I1205 13:09:55.481994 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:55.482067 master-0 kubenswrapper[29936]: I1205 13:09:55.482053 29936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/884426a8-a4eb-4387-a9ab-546f0844b879-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:55.482067 master-0 kubenswrapper[29936]: I1205 13:09:55.482067 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcp7k\" (UniqueName: \"kubernetes.io/projected/884426a8-a4eb-4387-a9ab-546f0844b879-kube-api-access-qcp7k\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:55.482067 master-0 kubenswrapper[29936]: I1205 13:09:55.482079 29936 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:55.482438 master-0 kubenswrapper[29936]: I1205 13:09:55.482091 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884426a8-a4eb-4387-a9ab-546f0844b879-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:56.262971 master-0 kubenswrapper[29936]: I1205 13:09:56.262888 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 13:09:56.347939 master-0 kubenswrapper[29936]: I1205 13:09:56.347847 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 13:09:56.374324 master-0 kubenswrapper[29936]: I1205 13:09:56.374216 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 13:09:56.398836 master-0 kubenswrapper[29936]: I1205 13:09:56.398249 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 13:09:56.399230 master-0 kubenswrapper[29936]: E1205 13:09:56.399112 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="884426a8-a4eb-4387-a9ab-546f0844b879" containerName="nova-metadata-metadata" Dec 05 13:09:56.399230 master-0 kubenswrapper[29936]: I1205 13:09:56.399136 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="884426a8-a4eb-4387-a9ab-546f0844b879" containerName="nova-metadata-metadata" Dec 05 13:09:56.399230 master-0 kubenswrapper[29936]: E1205 13:09:56.399204 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df45b5b9-1e68-471c-96e9-fa2906275144" containerName="dnsmasq-dns" Dec 05 13:09:56.399230 master-0 kubenswrapper[29936]: I1205 13:09:56.399213 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="df45b5b9-1e68-471c-96e9-fa2906275144" containerName="dnsmasq-dns" Dec 05 13:09:56.399456 master-0 kubenswrapper[29936]: E1205 13:09:56.399260 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df45b5b9-1e68-471c-96e9-fa2906275144" containerName="init" Dec 05 13:09:56.399456 master-0 kubenswrapper[29936]: I1205 13:09:56.399269 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="df45b5b9-1e68-471c-96e9-fa2906275144" containerName="init" Dec 05 13:09:56.399456 master-0 kubenswrapper[29936]: E1205 13:09:56.399291 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15176a26-f0f3-4bd2-a9b2-6f450e107ae1" containerName="nova-manage" Dec 05 13:09:56.399456 master-0 kubenswrapper[29936]: I1205 13:09:56.399299 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="15176a26-f0f3-4bd2-a9b2-6f450e107ae1" containerName="nova-manage" Dec 05 13:09:56.399456 master-0 kubenswrapper[29936]: E1205 13:09:56.399308 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49" containerName="nova-manage" Dec 05 13:09:56.399456 master-0 kubenswrapper[29936]: I1205 13:09:56.399315 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49" containerName="nova-manage" Dec 05 13:09:56.399456 master-0 kubenswrapper[29936]: E1205 13:09:56.399326 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="884426a8-a4eb-4387-a9ab-546f0844b879" containerName="nova-metadata-log" Dec 05 13:09:56.399456 master-0 kubenswrapper[29936]: I1205 13:09:56.399333 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="884426a8-a4eb-4387-a9ab-546f0844b879" containerName="nova-metadata-log" Dec 05 13:09:56.399715 master-0 kubenswrapper[29936]: I1205 13:09:56.399623 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="df45b5b9-1e68-471c-96e9-fa2906275144" containerName="dnsmasq-dns" Dec 05 13:09:56.399715 master-0 kubenswrapper[29936]: I1205 13:09:56.399657 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49" containerName="nova-manage" Dec 05 13:09:56.399715 master-0 kubenswrapper[29936]: 
I1205 13:09:56.399701 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="15176a26-f0f3-4bd2-a9b2-6f450e107ae1" containerName="nova-manage" Dec 05 13:09:56.399715 master-0 kubenswrapper[29936]: I1205 13:09:56.399715 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="884426a8-a4eb-4387-a9ab-546f0844b879" containerName="nova-metadata-log" Dec 05 13:09:56.399850 master-0 kubenswrapper[29936]: I1205 13:09:56.399743 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="884426a8-a4eb-4387-a9ab-546f0844b879" containerName="nova-metadata-metadata" Dec 05 13:09:56.403508 master-0 kubenswrapper[29936]: I1205 13:09:56.403414 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 13:09:56.408077 master-0 kubenswrapper[29936]: I1205 13:09:56.408014 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 13:09:56.408527 master-0 kubenswrapper[29936]: I1205 13:09:56.408133 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 13:09:56.410814 master-0 kubenswrapper[29936]: I1205 13:09:56.410749 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 13:09:56.420005 master-0 kubenswrapper[29936]: I1205 13:09:56.419300 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v92nm\" (UniqueName: \"kubernetes.io/projected/265df19f-e9c7-41ec-a083-913add4c97ed-kube-api-access-v92nm\") pod \"nova-metadata-0\" (UID: \"265df19f-e9c7-41ec-a083-913add4c97ed\") " pod="openstack/nova-metadata-0" Dec 05 13:09:56.420840 master-0 kubenswrapper[29936]: I1205 13:09:56.420813 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/265df19f-e9c7-41ec-a083-913add4c97ed-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"265df19f-e9c7-41ec-a083-913add4c97ed\") " pod="openstack/nova-metadata-0" Dec 05 13:09:56.421075 master-0 kubenswrapper[29936]: I1205 13:09:56.421051 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/265df19f-e9c7-41ec-a083-913add4c97ed-logs\") pod \"nova-metadata-0\" (UID: \"265df19f-e9c7-41ec-a083-913add4c97ed\") " pod="openstack/nova-metadata-0" Dec 05 13:09:56.421326 master-0 kubenswrapper[29936]: I1205 13:09:56.421307 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265df19f-e9c7-41ec-a083-913add4c97ed-config-data\") pod \"nova-metadata-0\" (UID: \"265df19f-e9c7-41ec-a083-913add4c97ed\") " pod="openstack/nova-metadata-0" Dec 05 13:09:56.421607 master-0 kubenswrapper[29936]: I1205 13:09:56.421592 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265df19f-e9c7-41ec-a083-913add4c97ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"265df19f-e9c7-41ec-a083-913add4c97ed\") " pod="openstack/nova-metadata-0" Dec 05 13:09:56.524784 master-0 kubenswrapper[29936]: I1205 13:09:56.524563 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v92nm\" (UniqueName: 
\"kubernetes.io/projected/265df19f-e9c7-41ec-a083-913add4c97ed-kube-api-access-v92nm\") pod \"nova-metadata-0\" (UID: \"265df19f-e9c7-41ec-a083-913add4c97ed\") " pod="openstack/nova-metadata-0" Dec 05 13:09:56.525296 master-0 kubenswrapper[29936]: I1205 13:09:56.524798 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/265df19f-e9c7-41ec-a083-913add4c97ed-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"265df19f-e9c7-41ec-a083-913add4c97ed\") " pod="openstack/nova-metadata-0" Dec 05 13:09:56.525296 master-0 kubenswrapper[29936]: I1205 13:09:56.525042 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/265df19f-e9c7-41ec-a083-913add4c97ed-logs\") pod \"nova-metadata-0\" (UID: \"265df19f-e9c7-41ec-a083-913add4c97ed\") " pod="openstack/nova-metadata-0" Dec 05 13:09:56.525917 master-0 kubenswrapper[29936]: I1205 13:09:56.525428 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265df19f-e9c7-41ec-a083-913add4c97ed-config-data\") pod \"nova-metadata-0\" (UID: \"265df19f-e9c7-41ec-a083-913add4c97ed\") " pod="openstack/nova-metadata-0" Dec 05 13:09:56.525917 master-0 kubenswrapper[29936]: I1205 13:09:56.525545 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265df19f-e9c7-41ec-a083-913add4c97ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"265df19f-e9c7-41ec-a083-913add4c97ed\") " pod="openstack/nova-metadata-0" Dec 05 13:09:56.526570 master-0 kubenswrapper[29936]: I1205 13:09:56.526052 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/265df19f-e9c7-41ec-a083-913add4c97ed-logs\") pod \"nova-metadata-0\" (UID: \"265df19f-e9c7-41ec-a083-913add4c97ed\") " pod="openstack/nova-metadata-0" Dec 05 13:09:56.539300 master-0 kubenswrapper[29936]: I1205 13:09:56.530149 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265df19f-e9c7-41ec-a083-913add4c97ed-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"265df19f-e9c7-41ec-a083-913add4c97ed\") " pod="openstack/nova-metadata-0" Dec 05 13:09:56.539300 master-0 kubenswrapper[29936]: I1205 13:09:56.531092 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265df19f-e9c7-41ec-a083-913add4c97ed-config-data\") pod \"nova-metadata-0\" (UID: \"265df19f-e9c7-41ec-a083-913add4c97ed\") " pod="openstack/nova-metadata-0" Dec 05 13:09:56.539300 master-0 kubenswrapper[29936]: I1205 13:09:56.533436 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/265df19f-e9c7-41ec-a083-913add4c97ed-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"265df19f-e9c7-41ec-a083-913add4c97ed\") " pod="openstack/nova-metadata-0" Dec 05 13:09:56.544060 master-0 kubenswrapper[29936]: I1205 13:09:56.543992 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v92nm\" (UniqueName: \"kubernetes.io/projected/265df19f-e9c7-41ec-a083-913add4c97ed-kube-api-access-v92nm\") pod \"nova-metadata-0\" (UID: \"265df19f-e9c7-41ec-a083-913add4c97ed\") " pod="openstack/nova-metadata-0" Dec 05 
13:09:56.728540 master-0 kubenswrapper[29936]: I1205 13:09:56.728458 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 13:09:57.208756 master-0 kubenswrapper[29936]: I1205 13:09:57.208681 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="884426a8-a4eb-4387-a9ab-546f0844b879" path="/var/lib/kubelet/pods/884426a8-a4eb-4387-a9ab-546f0844b879/volumes" Dec 05 13:09:57.284491 master-0 kubenswrapper[29936]: I1205 13:09:57.284433 29936 generic.go:334] "Generic (PLEG): container finished" podID="a4cb64aa-e5c4-45ee-a360-160256df7967" containerID="72c120a721a4769f189f0d15830aa309a3f9c0dc62741f5e2972ac4e2e7718aa" exitCode=0 Dec 05 13:09:57.285267 master-0 kubenswrapper[29936]: I1205 13:09:57.284530 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4cb64aa-e5c4-45ee-a360-160256df7967","Type":"ContainerDied","Data":"72c120a721a4769f189f0d15830aa309a3f9c0dc62741f5e2972ac4e2e7718aa"} Dec 05 13:09:57.294500 master-0 kubenswrapper[29936]: I1205 13:09:57.294460 29936 generic.go:334] "Generic (PLEG): container finished" podID="3d8d3713-fc4a-42f0-8e0c-4cdcb6589027" containerID="ac0e76af22a910b46e3085e921399cae1f3ec5a05aff26ee7c7784cc7f63dfef" exitCode=0 Dec 05 13:09:57.294587 master-0 kubenswrapper[29936]: I1205 13:09:57.294508 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027","Type":"ContainerDied","Data":"ac0e76af22a910b46e3085e921399cae1f3ec5a05aff26ee7c7784cc7f63dfef"} Dec 05 13:09:57.515335 master-0 kubenswrapper[29936]: I1205 13:09:57.515264 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 13:09:57.756405 master-0 kubenswrapper[29936]: I1205 13:09:57.756287 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 13:09:57.772596 master-0 kubenswrapper[29936]: I1205 13:09:57.772534 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 13:09:57.868141 master-0 kubenswrapper[29936]: I1205 13:09:57.868059 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnvqb\" (UniqueName: \"kubernetes.io/projected/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-kube-api-access-gnvqb\") pod \"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027\" (UID: \"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027\") " Dec 05 13:09:57.868409 master-0 kubenswrapper[29936]: I1205 13:09:57.868300 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-combined-ca-bundle\") pod \"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027\" (UID: \"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027\") " Dec 05 13:09:57.868693 master-0 kubenswrapper[29936]: I1205 13:09:57.868670 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-config-data\") pod \"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027\" (UID: \"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027\") " Dec 05 13:09:57.872646 master-0 kubenswrapper[29936]: I1205 13:09:57.872566 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-kube-api-access-gnvqb" (OuterVolumeSpecName: "kube-api-access-gnvqb") pod "3d8d3713-fc4a-42f0-8e0c-4cdcb6589027" (UID: "3d8d3713-fc4a-42f0-8e0c-4cdcb6589027"). InnerVolumeSpecName "kube-api-access-gnvqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:09:57.907435 master-0 kubenswrapper[29936]: I1205 13:09:57.907353 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-config-data" (OuterVolumeSpecName: "config-data") pod "3d8d3713-fc4a-42f0-8e0c-4cdcb6589027" (UID: "3d8d3713-fc4a-42f0-8e0c-4cdcb6589027"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:57.922928 master-0 kubenswrapper[29936]: I1205 13:09:57.922868 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d8d3713-fc4a-42f0-8e0c-4cdcb6589027" (UID: "3d8d3713-fc4a-42f0-8e0c-4cdcb6589027"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:57.970427 master-0 kubenswrapper[29936]: I1205 13:09:57.970345 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-internal-tls-certs\") pod \"a4cb64aa-e5c4-45ee-a360-160256df7967\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " Dec 05 13:09:57.970667 master-0 kubenswrapper[29936]: I1205 13:09:57.970513 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-public-tls-certs\") pod \"a4cb64aa-e5c4-45ee-a360-160256df7967\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " Dec 05 13:09:57.970755 master-0 kubenswrapper[29936]: I1205 13:09:57.970697 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxh2g\" (UniqueName: \"kubernetes.io/projected/a4cb64aa-e5c4-45ee-a360-160256df7967-kube-api-access-gxh2g\") pod \"a4cb64aa-e5c4-45ee-a360-160256df7967\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " Dec 05 13:09:57.971080 master-0 kubenswrapper[29936]: I1205 13:09:57.970900 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4cb64aa-e5c4-45ee-a360-160256df7967-logs\") pod \"a4cb64aa-e5c4-45ee-a360-160256df7967\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " Dec 05 13:09:57.971080 master-0 kubenswrapper[29936]: I1205 13:09:57.970952 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-config-data\") pod \"a4cb64aa-e5c4-45ee-a360-160256df7967\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " Dec 05 13:09:57.971080 master-0 kubenswrapper[29936]: I1205 13:09:57.970994 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-combined-ca-bundle\") pod \"a4cb64aa-e5c4-45ee-a360-160256df7967\" (UID: \"a4cb64aa-e5c4-45ee-a360-160256df7967\") " Dec 05 13:09:57.971741 master-0 kubenswrapper[29936]: I1205 13:09:57.971425 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:57.971741 master-0 kubenswrapper[29936]: I1205 13:09:57.971441 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnvqb\" (UniqueName: \"kubernetes.io/projected/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-kube-api-access-gnvqb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:57.971741 master-0 kubenswrapper[29936]: I1205 13:09:57.971453 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:57.971902 master-0 kubenswrapper[29936]: I1205 13:09:57.971836 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4cb64aa-e5c4-45ee-a360-160256df7967-logs" (OuterVolumeSpecName: "logs") pod "a4cb64aa-e5c4-45ee-a360-160256df7967" (UID: "a4cb64aa-e5c4-45ee-a360-160256df7967"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:09:57.978540 master-0 kubenswrapper[29936]: I1205 13:09:57.977857 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4cb64aa-e5c4-45ee-a360-160256df7967-kube-api-access-gxh2g" (OuterVolumeSpecName: "kube-api-access-gxh2g") pod "a4cb64aa-e5c4-45ee-a360-160256df7967" (UID: "a4cb64aa-e5c4-45ee-a360-160256df7967"). InnerVolumeSpecName "kube-api-access-gxh2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:09:58.011388 master-0 kubenswrapper[29936]: I1205 13:09:58.011226 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4cb64aa-e5c4-45ee-a360-160256df7967" (UID: "a4cb64aa-e5c4-45ee-a360-160256df7967"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:58.015920 master-0 kubenswrapper[29936]: I1205 13:09:58.015853 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-config-data" (OuterVolumeSpecName: "config-data") pod "a4cb64aa-e5c4-45ee-a360-160256df7967" (UID: "a4cb64aa-e5c4-45ee-a360-160256df7967"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:58.048822 master-0 kubenswrapper[29936]: I1205 13:09:58.048761 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a4cb64aa-e5c4-45ee-a360-160256df7967" (UID: "a4cb64aa-e5c4-45ee-a360-160256df7967"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:58.051894 master-0 kubenswrapper[29936]: I1205 13:09:58.051848 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a4cb64aa-e5c4-45ee-a360-160256df7967" (UID: "a4cb64aa-e5c4-45ee-a360-160256df7967"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:09:58.073447 master-0 kubenswrapper[29936]: I1205 13:09:58.073368 29936 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:58.073447 master-0 kubenswrapper[29936]: I1205 13:09:58.073443 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxh2g\" (UniqueName: \"kubernetes.io/projected/a4cb64aa-e5c4-45ee-a360-160256df7967-kube-api-access-gxh2g\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:58.073447 master-0 kubenswrapper[29936]: I1205 13:09:58.073457 29936 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4cb64aa-e5c4-45ee-a360-160256df7967-logs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:58.073447 master-0 kubenswrapper[29936]: I1205 13:09:58.073467 29936 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-config-data\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:58.073880 master-0 kubenswrapper[29936]: I1205 13:09:58.073480 29936 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:58.073880 master-0 kubenswrapper[29936]: I1205 13:09:58.073490 29936 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4cb64aa-e5c4-45ee-a360-160256df7967-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 05 13:09:58.317308 master-0 kubenswrapper[29936]: I1205 13:09:58.317065 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"265df19f-e9c7-41ec-a083-913add4c97ed","Type":"ContainerStarted","Data":"83f1615b775a3554ef35c7c6b4aaae664769425ac63214538a019d03d573b089"} Dec 05 13:09:58.317308 master-0 kubenswrapper[29936]: I1205 13:09:58.317135 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"265df19f-e9c7-41ec-a083-913add4c97ed","Type":"ContainerStarted","Data":"144e451874849eb97f474e9aedc553c40de7119a09ea4208b3b2f85bfb40c84e"} Dec 05 13:09:58.317308 master-0 kubenswrapper[29936]: I1205 13:09:58.317147 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"265df19f-e9c7-41ec-a083-913add4c97ed","Type":"ContainerStarted","Data":"94c5f754fdf37abc90770ed445f4009993c966cd826260f2e6cc8a450a948714"} Dec 05 13:09:58.320541 master-0 kubenswrapper[29936]: I1205 13:09:58.320488 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3d8d3713-fc4a-42f0-8e0c-4cdcb6589027","Type":"ContainerDied","Data":"36730ae6e3bd6603dd1d5ca11ddb1514b6969fb888012de2f3826bd1c7395759"} Dec 05 13:09:58.320645 master-0 kubenswrapper[29936]: I1205 13:09:58.320549 29936 scope.go:117] "RemoveContainer" containerID="ac0e76af22a910b46e3085e921399cae1f3ec5a05aff26ee7c7784cc7f63dfef" Dec 05 13:09:58.320762 master-0 kubenswrapper[29936]: I1205 13:09:58.320730 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 13:09:58.333965 master-0 kubenswrapper[29936]: I1205 13:09:58.333910 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a4cb64aa-e5c4-45ee-a360-160256df7967","Type":"ContainerDied","Data":"7ddb4b4c6d674ab0a49694c134ea1aeaf9c75f626cd23192507fa05d317833c9"} Dec 05 13:09:58.334168 master-0 kubenswrapper[29936]: I1205 13:09:58.334061 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 13:09:58.379640 master-0 kubenswrapper[29936]: I1205 13:09:58.379440 29936 scope.go:117] "RemoveContainer" containerID="72c120a721a4769f189f0d15830aa309a3f9c0dc62741f5e2972ac4e2e7718aa" Dec 05 13:09:58.381806 master-0 kubenswrapper[29936]: I1205 13:09:58.381688 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.381655036 podStartE2EDuration="2.381655036s" podCreationTimestamp="2025-12-05 13:09:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:09:58.362015322 +0000 UTC m=+1195.494095033" watchObservedRunningTime="2025-12-05 13:09:58.381655036 +0000 UTC m=+1195.513734737" Dec 05 13:09:58.403946 master-0 kubenswrapper[29936]: I1205 13:09:58.403065 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:58.439916 master-0 kubenswrapper[29936]: I1205 13:09:58.439680 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:58.442103 master-0 kubenswrapper[29936]: I1205 13:09:58.441593 29936 scope.go:117] "RemoveContainer" containerID="2799feec0e284ffd96af499d04e4ba4630eb006dd9e6984ead7bf7767f7c6c70" Dec 05 13:09:58.461559 master-0 kubenswrapper[29936]: I1205 13:09:58.461478 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 13:09:58.477243 master-0 kubenswrapper[29936]: I1205 13:09:58.476347 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:58.477243 master-0 kubenswrapper[29936]: E1205 13:09:58.476956 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d8d3713-fc4a-42f0-8e0c-4cdcb6589027" containerName="nova-scheduler-scheduler" Dec 05 13:09:58.477243 master-0 kubenswrapper[29936]: I1205 13:09:58.476978 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d8d3713-fc4a-42f0-8e0c-4cdcb6589027" containerName="nova-scheduler-scheduler" Dec 05 13:09:58.477243 master-0 kubenswrapper[29936]: E1205 13:09:58.477028 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4cb64aa-e5c4-45ee-a360-160256df7967" containerName="nova-api-log" Dec 05 13:09:58.477243 master-0 kubenswrapper[29936]: I1205 13:09:58.477039 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cb64aa-e5c4-45ee-a360-160256df7967" containerName="nova-api-log" Dec 05 13:09:58.477243 master-0 kubenswrapper[29936]: E1205 13:09:58.477073 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4cb64aa-e5c4-45ee-a360-160256df7967" containerName="nova-api-api" Dec 05 13:09:58.477243 master-0 kubenswrapper[29936]: I1205 13:09:58.477085 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4cb64aa-e5c4-45ee-a360-160256df7967" containerName="nova-api-api" Dec 05 13:09:58.479206 master-0 kubenswrapper[29936]: I1205 13:09:58.477916 29936 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a4cb64aa-e5c4-45ee-a360-160256df7967" containerName="nova-api-api" Dec 05 13:09:58.479206 master-0 kubenswrapper[29936]: I1205 13:09:58.477967 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d8d3713-fc4a-42f0-8e0c-4cdcb6589027" containerName="nova-scheduler-scheduler" Dec 05 13:09:58.479206 master-0 kubenswrapper[29936]: I1205 13:09:58.477984 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4cb64aa-e5c4-45ee-a360-160256df7967" containerName="nova-api-log" Dec 05 13:09:58.486195 master-0 kubenswrapper[29936]: I1205 13:09:58.481502 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 13:09:58.486195 master-0 kubenswrapper[29936]: I1205 13:09:58.484386 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 13:09:58.487326 master-0 kubenswrapper[29936]: I1205 13:09:58.487301 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 13:09:58.487519 master-0 kubenswrapper[29936]: I1205 13:09:58.487495 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 13:09:58.495411 master-0 kubenswrapper[29936]: I1205 13:09:58.495366 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/192c20a6-3906-4cb8-9dcd-f93eb109af2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.495510 master-0 kubenswrapper[29936]: I1205 13:09:58.495444 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/192c20a6-3906-4cb8-9dcd-f93eb109af2d-public-tls-certs\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.495510 master-0 kubenswrapper[29936]: I1205 13:09:58.495495 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/192c20a6-3906-4cb8-9dcd-f93eb109af2d-config-data\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.495579 master-0 kubenswrapper[29936]: I1205 13:09:58.495565 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/192c20a6-3906-4cb8-9dcd-f93eb109af2d-logs\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.495663 master-0 kubenswrapper[29936]: I1205 13:09:58.495640 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/192c20a6-3906-4cb8-9dcd-f93eb109af2d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.495712 master-0 kubenswrapper[29936]: I1205 13:09:58.495675 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4qtn\" (UniqueName: \"kubernetes.io/projected/192c20a6-3906-4cb8-9dcd-f93eb109af2d-kube-api-access-l4qtn\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " 
pod="openstack/nova-api-0" Dec 05 13:09:58.502542 master-0 kubenswrapper[29936]: I1205 13:09:58.502493 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 13:09:58.511971 master-0 kubenswrapper[29936]: I1205 13:09:58.511904 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:58.578821 master-0 kubenswrapper[29936]: I1205 13:09:58.578663 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 13:09:58.582545 master-0 kubenswrapper[29936]: I1205 13:09:58.582474 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 13:09:58.602603 master-0 kubenswrapper[29936]: I1205 13:09:58.602525 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 13:09:58.610099 master-0 kubenswrapper[29936]: I1205 13:09:58.610039 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 13:09:58.629314 master-0 kubenswrapper[29936]: I1205 13:09:58.629268 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/192c20a6-3906-4cb8-9dcd-f93eb109af2d-logs\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.629912 master-0 kubenswrapper[29936]: I1205 13:09:58.629893 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/192c20a6-3906-4cb8-9dcd-f93eb109af2d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.630082 master-0 kubenswrapper[29936]: I1205 13:09:58.630067 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4qtn\" (UniqueName: \"kubernetes.io/projected/192c20a6-3906-4cb8-9dcd-f93eb109af2d-kube-api-access-l4qtn\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.630373 master-0 kubenswrapper[29936]: I1205 13:09:58.630359 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/192c20a6-3906-4cb8-9dcd-f93eb109af2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.630511 master-0 kubenswrapper[29936]: I1205 13:09:58.630498 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/192c20a6-3906-4cb8-9dcd-f93eb109af2d-public-tls-certs\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.630726 master-0 kubenswrapper[29936]: I1205 13:09:58.630713 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/192c20a6-3906-4cb8-9dcd-f93eb109af2d-config-data\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.631352 master-0 kubenswrapper[29936]: I1205 13:09:58.631296 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/192c20a6-3906-4cb8-9dcd-f93eb109af2d-logs\") pod \"nova-api-0\" (UID: 
\"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.635868 master-0 kubenswrapper[29936]: I1205 13:09:58.635075 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/192c20a6-3906-4cb8-9dcd-f93eb109af2d-public-tls-certs\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.635868 master-0 kubenswrapper[29936]: I1205 13:09:58.635830 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/192c20a6-3906-4cb8-9dcd-f93eb109af2d-config-data\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.635868 master-0 kubenswrapper[29936]: I1205 13:09:58.635836 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/192c20a6-3906-4cb8-9dcd-f93eb109af2d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.637590 master-0 kubenswrapper[29936]: I1205 13:09:58.637534 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/192c20a6-3906-4cb8-9dcd-f93eb109af2d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.649270 master-0 kubenswrapper[29936]: I1205 13:09:58.649224 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4qtn\" (UniqueName: \"kubernetes.io/projected/192c20a6-3906-4cb8-9dcd-f93eb109af2d-kube-api-access-l4qtn\") pod \"nova-api-0\" (UID: \"192c20a6-3906-4cb8-9dcd-f93eb109af2d\") " pod="openstack/nova-api-0" Dec 05 13:09:58.734570 master-0 kubenswrapper[29936]: I1205 13:09:58.734457 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84nls\" (UniqueName: \"kubernetes.io/projected/a02665f3-a20d-441d-9644-9254d6ce563d-kube-api-access-84nls\") pod \"nova-scheduler-0\" (UID: \"a02665f3-a20d-441d-9644-9254d6ce563d\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:58.734944 master-0 kubenswrapper[29936]: I1205 13:09:58.734737 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a02665f3-a20d-441d-9644-9254d6ce563d-config-data\") pod \"nova-scheduler-0\" (UID: \"a02665f3-a20d-441d-9644-9254d6ce563d\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:58.735064 master-0 kubenswrapper[29936]: I1205 13:09:58.735026 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02665f3-a20d-441d-9644-9254d6ce563d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a02665f3-a20d-441d-9644-9254d6ce563d\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:58.837097 master-0 kubenswrapper[29936]: I1205 13:09:58.836797 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02665f3-a20d-441d-9644-9254d6ce563d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a02665f3-a20d-441d-9644-9254d6ce563d\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:58.837097 master-0 kubenswrapper[29936]: I1205 13:09:58.837093 29936 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84nls\" (UniqueName: \"kubernetes.io/projected/a02665f3-a20d-441d-9644-9254d6ce563d-kube-api-access-84nls\") pod \"nova-scheduler-0\" (UID: \"a02665f3-a20d-441d-9644-9254d6ce563d\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:58.837820 master-0 kubenswrapper[29936]: I1205 13:09:58.837756 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a02665f3-a20d-441d-9644-9254d6ce563d-config-data\") pod \"nova-scheduler-0\" (UID: \"a02665f3-a20d-441d-9644-9254d6ce563d\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:58.842335 master-0 kubenswrapper[29936]: I1205 13:09:58.842257 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02665f3-a20d-441d-9644-9254d6ce563d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a02665f3-a20d-441d-9644-9254d6ce563d\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:58.844760 master-0 kubenswrapper[29936]: I1205 13:09:58.844696 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a02665f3-a20d-441d-9644-9254d6ce563d-config-data\") pod \"nova-scheduler-0\" (UID: \"a02665f3-a20d-441d-9644-9254d6ce563d\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:58.857300 master-0 kubenswrapper[29936]: I1205 13:09:58.857222 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84nls\" (UniqueName: \"kubernetes.io/projected/a02665f3-a20d-441d-9644-9254d6ce563d-kube-api-access-84nls\") pod \"nova-scheduler-0\" (UID: \"a02665f3-a20d-441d-9644-9254d6ce563d\") " pod="openstack/nova-scheduler-0" Dec 05 13:09:58.877551 master-0 kubenswrapper[29936]: I1205 13:09:58.877444 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 13:09:58.928403 master-0 kubenswrapper[29936]: I1205 13:09:58.928302 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 13:09:59.206457 master-0 kubenswrapper[29936]: I1205 13:09:59.206369 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d8d3713-fc4a-42f0-8e0c-4cdcb6589027" path="/var/lib/kubelet/pods/3d8d3713-fc4a-42f0-8e0c-4cdcb6589027/volumes" Dec 05 13:09:59.207205 master-0 kubenswrapper[29936]: I1205 13:09:59.207156 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4cb64aa-e5c4-45ee-a360-160256df7967" path="/var/lib/kubelet/pods/a4cb64aa-e5c4-45ee-a360-160256df7967/volumes" Dec 05 13:09:59.451612 master-0 kubenswrapper[29936]: I1205 13:09:59.451550 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 13:09:59.454456 master-0 kubenswrapper[29936]: W1205 13:09:59.454410 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod192c20a6_3906_4cb8_9dcd_f93eb109af2d.slice/crio-d604dc084afbaef78e6fef6ed2eaa5292d46b70b5c0995ec81d2793ee7b9f395 WatchSource:0}: Error finding container d604dc084afbaef78e6fef6ed2eaa5292d46b70b5c0995ec81d2793ee7b9f395: Status 404 returned error can't find the container with id d604dc084afbaef78e6fef6ed2eaa5292d46b70b5c0995ec81d2793ee7b9f395 Dec 05 13:09:59.547074 master-0 kubenswrapper[29936]: I1205 13:09:59.546983 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 13:10:00.374649 master-0 kubenswrapper[29936]: I1205 13:10:00.374497 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"192c20a6-3906-4cb8-9dcd-f93eb109af2d","Type":"ContainerStarted","Data":"7e92e7f6aebd8b0861df7ae8f96b250cf6597db1ae0a92127a67047f6bc73df1"} Dec 05 13:10:00.374649 master-0 kubenswrapper[29936]: I1205 13:10:00.374576 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"192c20a6-3906-4cb8-9dcd-f93eb109af2d","Type":"ContainerStarted","Data":"b60b007648265d41eebebc62c2f155fdb6aec7e80e53543d957e9578b7f1a271"} Dec 05 13:10:00.374649 master-0 kubenswrapper[29936]: I1205 13:10:00.374595 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"192c20a6-3906-4cb8-9dcd-f93eb109af2d","Type":"ContainerStarted","Data":"d604dc084afbaef78e6fef6ed2eaa5292d46b70b5c0995ec81d2793ee7b9f395"} Dec 05 13:10:00.379732 master-0 kubenswrapper[29936]: I1205 13:10:00.379547 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a02665f3-a20d-441d-9644-9254d6ce563d","Type":"ContainerStarted","Data":"bd07f604d7f43173206879e6e85d7c82309118857d8aa5380c9e266a63cbd172"} Dec 05 13:10:00.379732 master-0 kubenswrapper[29936]: I1205 13:10:00.379679 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a02665f3-a20d-441d-9644-9254d6ce563d","Type":"ContainerStarted","Data":"52ad1d697bddabf7d16135d8c012159a13ab82d12d4d8399e007fd59b657c708"} Dec 05 13:10:00.407386 master-0 kubenswrapper[29936]: I1205 13:10:00.407263 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.407152679 podStartE2EDuration="2.407152679s" podCreationTimestamp="2025-12-05 13:09:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:10:00.40351161 +0000 UTC m=+1197.535591301" watchObservedRunningTime="2025-12-05 
13:10:00.407152679 +0000 UTC m=+1197.539232360" Dec 05 13:10:00.441907 master-0 kubenswrapper[29936]: I1205 13:10:00.441786 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.441754353 podStartE2EDuration="2.441754353s" podCreationTimestamp="2025-12-05 13:09:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:10:00.425019029 +0000 UTC m=+1197.557098710" watchObservedRunningTime="2025-12-05 13:10:00.441754353 +0000 UTC m=+1197.573834044" Dec 05 13:10:01.730257 master-0 kubenswrapper[29936]: I1205 13:10:01.730132 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 13:10:01.730257 master-0 kubenswrapper[29936]: I1205 13:10:01.730217 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 13:10:03.930054 master-0 kubenswrapper[29936]: I1205 13:10:03.929906 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 13:10:06.730841 master-0 kubenswrapper[29936]: I1205 13:10:06.730766 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 13:10:06.730841 master-0 kubenswrapper[29936]: I1205 13:10:06.730850 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 13:10:07.751935 master-0 kubenswrapper[29936]: I1205 13:10:07.751583 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="265df19f-e9c7-41ec-a083-913add4c97ed" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.25:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 13:10:07.751935 master-0 kubenswrapper[29936]: I1205 13:10:07.751882 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="265df19f-e9c7-41ec-a083-913add4c97ed" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.25:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 13:10:08.880552 master-0 kubenswrapper[29936]: I1205 13:10:08.880361 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 13:10:08.880552 master-0 kubenswrapper[29936]: I1205 13:10:08.880441 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 13:10:08.929788 master-0 kubenswrapper[29936]: I1205 13:10:08.929707 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 13:10:08.967171 master-0 kubenswrapper[29936]: I1205 13:10:08.967090 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 13:10:09.393496 master-0 kubenswrapper[29936]: I1205 13:10:09.389773 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 13:10:09.903102 master-0 kubenswrapper[29936]: I1205 13:10:09.902993 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="192c20a6-3906-4cb8-9dcd-f93eb109af2d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.128.1.26:8774/\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Dec 05 13:10:09.904212 master-0 kubenswrapper[29936]: I1205 13:10:09.903308 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="192c20a6-3906-4cb8-9dcd-f93eb109af2d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.128.1.26:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 13:10:16.737207 master-0 kubenswrapper[29936]: I1205 13:10:16.737126 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 13:10:16.738162 master-0 kubenswrapper[29936]: I1205 13:10:16.738146 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 13:10:16.743481 master-0 kubenswrapper[29936]: I1205 13:10:16.743418 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 13:10:16.746440 master-0 kubenswrapper[29936]: I1205 13:10:16.746406 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 13:10:18.892492 master-0 kubenswrapper[29936]: I1205 13:10:18.892418 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 13:10:18.893279 master-0 kubenswrapper[29936]: I1205 13:10:18.893031 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 13:10:18.895544 master-0 kubenswrapper[29936]: I1205 13:10:18.895499 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 13:10:18.901012 master-0 kubenswrapper[29936]: I1205 13:10:18.900971 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 13:10:19.520132 master-0 kubenswrapper[29936]: I1205 13:10:19.520046 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 13:10:19.535625 master-0 kubenswrapper[29936]: I1205 13:10:19.535164 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 13:10:48.662333 master-0 kubenswrapper[29936]: I1205 13:10:48.662224 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-58f4c9b998-7rmf8"] Dec 05 13:10:48.663157 master-0 kubenswrapper[29936]: I1205 13:10:48.663094 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" podUID="7d53a6f1-102a-40b2-85c0-0c4f34568cfc" containerName="sushy-emulator" containerID="cri-o://3e07889350a9d9eb1276736f41d2d8ad3044e81473d397c8dc23bb90e0f14c5c" gracePeriod=30 Dec 05 13:10:48.968004 master-0 kubenswrapper[29936]: I1205 13:10:48.967929 29936 generic.go:334] "Generic (PLEG): container finished" podID="7d53a6f1-102a-40b2-85c0-0c4f34568cfc" containerID="3e07889350a9d9eb1276736f41d2d8ad3044e81473d397c8dc23bb90e0f14c5c" exitCode=0 Dec 05 13:10:48.968004 master-0 kubenswrapper[29936]: I1205 13:10:48.967996 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" event={"ID":"7d53a6f1-102a-40b2-85c0-0c4f34568cfc","Type":"ContainerDied","Data":"3e07889350a9d9eb1276736f41d2d8ad3044e81473d397c8dc23bb90e0f14c5c"} Dec 05 13:10:49.292313 master-0 kubenswrapper[29936]: I1205 13:10:49.292252 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 13:10:49.342622 master-0 kubenswrapper[29936]: I1205 13:10:49.342504 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4w2x\" (UniqueName: \"kubernetes.io/projected/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-kube-api-access-j4w2x\") pod \"7d53a6f1-102a-40b2-85c0-0c4f34568cfc\" (UID: \"7d53a6f1-102a-40b2-85c0-0c4f34568cfc\") " Dec 05 13:10:49.342622 master-0 kubenswrapper[29936]: I1205 13:10:49.342588 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-os-client-config\") pod \"7d53a6f1-102a-40b2-85c0-0c4f34568cfc\" (UID: \"7d53a6f1-102a-40b2-85c0-0c4f34568cfc\") " Dec 05 13:10:49.342622 master-0 kubenswrapper[29936]: I1205 13:10:49.342631 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-sushy-emulator-config\") pod \"7d53a6f1-102a-40b2-85c0-0c4f34568cfc\" (UID: \"7d53a6f1-102a-40b2-85c0-0c4f34568cfc\") " Dec 05 13:10:49.344678 master-0 kubenswrapper[29936]: I1205 13:10:49.344612 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-sushy-emulator-config" (OuterVolumeSpecName: "sushy-emulator-config") pod "7d53a6f1-102a-40b2-85c0-0c4f34568cfc" (UID: "7d53a6f1-102a-40b2-85c0-0c4f34568cfc"). InnerVolumeSpecName "sushy-emulator-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:10:49.354317 master-0 kubenswrapper[29936]: I1205 13:10:49.347941 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-kube-api-access-j4w2x" (OuterVolumeSpecName: "kube-api-access-j4w2x") pod "7d53a6f1-102a-40b2-85c0-0c4f34568cfc" (UID: "7d53a6f1-102a-40b2-85c0-0c4f34568cfc"). InnerVolumeSpecName "kube-api-access-j4w2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:10:49.354317 master-0 kubenswrapper[29936]: I1205 13:10:49.348226 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-os-client-config" (OuterVolumeSpecName: "os-client-config") pod "7d53a6f1-102a-40b2-85c0-0c4f34568cfc" (UID: "7d53a6f1-102a-40b2-85c0-0c4f34568cfc"). InnerVolumeSpecName "os-client-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:10:49.421599 master-0 kubenswrapper[29936]: I1205 13:10:49.421274 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-64488c485f-nlmtj"] Dec 05 13:10:49.431156 master-0 kubenswrapper[29936]: E1205 13:10:49.428807 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d53a6f1-102a-40b2-85c0-0c4f34568cfc" containerName="sushy-emulator" Dec 05 13:10:49.431156 master-0 kubenswrapper[29936]: I1205 13:10:49.428876 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d53a6f1-102a-40b2-85c0-0c4f34568cfc" containerName="sushy-emulator" Dec 05 13:10:49.431156 master-0 kubenswrapper[29936]: I1205 13:10:49.429530 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d53a6f1-102a-40b2-85c0-0c4f34568cfc" containerName="sushy-emulator" Dec 05 13:10:49.431156 master-0 kubenswrapper[29936]: I1205 13:10:49.430602 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" Dec 05 13:10:49.444384 master-0 kubenswrapper[29936]: I1205 13:10:49.444280 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-64488c485f-nlmtj"] Dec 05 13:10:49.453216 master-0 kubenswrapper[29936]: I1205 13:10:49.453130 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/2570a3c7-2718-4fbc-8ad2-05fe776304b2-sushy-emulator-config\") pod \"sushy-emulator-64488c485f-nlmtj\" (UID: \"2570a3c7-2718-4fbc-8ad2-05fe776304b2\") " pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" Dec 05 13:10:49.453363 master-0 kubenswrapper[29936]: I1205 13:10:49.453290 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/2570a3c7-2718-4fbc-8ad2-05fe776304b2-os-client-config\") pod \"sushy-emulator-64488c485f-nlmtj\" (UID: \"2570a3c7-2718-4fbc-8ad2-05fe776304b2\") " pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" Dec 05 13:10:49.453420 master-0 kubenswrapper[29936]: I1205 13:10:49.453379 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phs9p\" (UniqueName: \"kubernetes.io/projected/2570a3c7-2718-4fbc-8ad2-05fe776304b2-kube-api-access-phs9p\") pod \"sushy-emulator-64488c485f-nlmtj\" (UID: \"2570a3c7-2718-4fbc-8ad2-05fe776304b2\") " pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" Dec 05 13:10:49.453918 master-0 kubenswrapper[29936]: I1205 13:10:49.453890 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4w2x\" (UniqueName: \"kubernetes.io/projected/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-kube-api-access-j4w2x\") on node \"master-0\" DevicePath \"\"" Dec 05 13:10:49.453918 master-0 kubenswrapper[29936]: I1205 13:10:49.453915 29936 reconciler_common.go:293] "Volume detached for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-os-client-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:10:49.454010 master-0 kubenswrapper[29936]: I1205 13:10:49.453933 29936 reconciler_common.go:293] "Volume detached for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/7d53a6f1-102a-40b2-85c0-0c4f34568cfc-sushy-emulator-config\") on node \"master-0\" DevicePath \"\"" Dec 05 13:10:49.556399 master-0 kubenswrapper[29936]: I1205 
13:10:49.556254 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/2570a3c7-2718-4fbc-8ad2-05fe776304b2-sushy-emulator-config\") pod \"sushy-emulator-64488c485f-nlmtj\" (UID: \"2570a3c7-2718-4fbc-8ad2-05fe776304b2\") " pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" Dec 05 13:10:49.556399 master-0 kubenswrapper[29936]: I1205 13:10:49.556373 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/2570a3c7-2718-4fbc-8ad2-05fe776304b2-os-client-config\") pod \"sushy-emulator-64488c485f-nlmtj\" (UID: \"2570a3c7-2718-4fbc-8ad2-05fe776304b2\") " pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" Dec 05 13:10:49.556694 master-0 kubenswrapper[29936]: I1205 13:10:49.556439 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phs9p\" (UniqueName: \"kubernetes.io/projected/2570a3c7-2718-4fbc-8ad2-05fe776304b2-kube-api-access-phs9p\") pod \"sushy-emulator-64488c485f-nlmtj\" (UID: \"2570a3c7-2718-4fbc-8ad2-05fe776304b2\") " pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" Dec 05 13:10:49.558011 master-0 kubenswrapper[29936]: I1205 13:10:49.557911 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/2570a3c7-2718-4fbc-8ad2-05fe776304b2-sushy-emulator-config\") pod \"sushy-emulator-64488c485f-nlmtj\" (UID: \"2570a3c7-2718-4fbc-8ad2-05fe776304b2\") " pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" Dec 05 13:10:49.560979 master-0 kubenswrapper[29936]: I1205 13:10:49.560889 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/2570a3c7-2718-4fbc-8ad2-05fe776304b2-os-client-config\") pod \"sushy-emulator-64488c485f-nlmtj\" (UID: \"2570a3c7-2718-4fbc-8ad2-05fe776304b2\") " pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" Dec 05 13:10:49.595707 master-0 kubenswrapper[29936]: I1205 13:10:49.595295 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phs9p\" (UniqueName: \"kubernetes.io/projected/2570a3c7-2718-4fbc-8ad2-05fe776304b2-kube-api-access-phs9p\") pod \"sushy-emulator-64488c485f-nlmtj\" (UID: \"2570a3c7-2718-4fbc-8ad2-05fe776304b2\") " pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" Dec 05 13:10:49.814801 master-0 kubenswrapper[29936]: I1205 13:10:49.814730 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" Dec 05 13:10:49.991975 master-0 kubenswrapper[29936]: I1205 13:10:49.990735 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" event={"ID":"7d53a6f1-102a-40b2-85c0-0c4f34568cfc","Type":"ContainerDied","Data":"51fd8a98951eec83e59fcd32531abc78a7a7dd434e9f6496929fe1c7d1bd1b1b"} Dec 05 13:10:49.991975 master-0 kubenswrapper[29936]: I1205 13:10:49.990817 29936 scope.go:117] "RemoveContainer" containerID="3e07889350a9d9eb1276736f41d2d8ad3044e81473d397c8dc23bb90e0f14c5c" Dec 05 13:10:49.991975 master-0 kubenswrapper[29936]: I1205 13:10:49.991004 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-58f4c9b998-7rmf8" Dec 05 13:10:50.054262 master-0 kubenswrapper[29936]: I1205 13:10:50.054138 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-58f4c9b998-7rmf8"] Dec 05 13:10:50.070970 master-0 kubenswrapper[29936]: I1205 13:10:50.070684 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["sushy-emulator/sushy-emulator-58f4c9b998-7rmf8"] Dec 05 13:10:50.507974 master-0 kubenswrapper[29936]: I1205 13:10:50.507871 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-64488c485f-nlmtj"] Dec 05 13:10:51.012644 master-0 kubenswrapper[29936]: I1205 13:10:51.012542 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" event={"ID":"2570a3c7-2718-4fbc-8ad2-05fe776304b2","Type":"ContainerStarted","Data":"39dae1c2bad32678a03c0e92db2097ffd105826e05d88fd68c3dad641218b15b"} Dec 05 13:10:51.012644 master-0 kubenswrapper[29936]: I1205 13:10:51.012623 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" event={"ID":"2570a3c7-2718-4fbc-8ad2-05fe776304b2","Type":"ContainerStarted","Data":"69097910c5f24e0e85f7adc8835347cef973046a5b328fb22aa4f1976309be9f"} Dec 05 13:10:51.059854 master-0 kubenswrapper[29936]: I1205 13:10:51.059699 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" podStartSLOduration=2.059659264 podStartE2EDuration="2.059659264s" podCreationTimestamp="2025-12-05 13:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 13:10:51.035216673 +0000 UTC m=+1248.167296374" watchObservedRunningTime="2025-12-05 13:10:51.059659264 +0000 UTC m=+1248.191738965" Dec 05 13:10:51.207857 master-0 kubenswrapper[29936]: I1205 13:10:51.207792 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d53a6f1-102a-40b2-85c0-0c4f34568cfc" path="/var/lib/kubelet/pods/7d53a6f1-102a-40b2-85c0-0c4f34568cfc/volumes" Dec 05 13:10:59.815946 master-0 kubenswrapper[29936]: I1205 13:10:59.815877 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" Dec 05 13:10:59.815946 master-0 kubenswrapper[29936]: I1205 13:10:59.815953 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" Dec 05 13:10:59.828111 master-0 kubenswrapper[29936]: I1205 13:10:59.828043 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" Dec 05 13:11:00.146864 master-0 kubenswrapper[29936]: I1205 13:11:00.146644 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-64488c485f-nlmtj" Dec 05 13:12:15.937916 master-0 kubenswrapper[29936]: I1205 13:12:15.937752 29936 scope.go:117] "RemoveContainer" containerID="c4182bf062e159005d362d853f2c17d8d8e737de8526286a95ead21fc8d8861f" Dec 05 13:12:16.003067 master-0 kubenswrapper[29936]: I1205 13:12:16.002999 29936 scope.go:117] "RemoveContainer" containerID="403f92aa4818c98bd7c94d4469d3117dd62b47997b7a5ff9b7b1859eeb42bc86" Dec 05 13:12:52.373943 master-0 kubenswrapper[29936]: E1205 13:12:52.373812 29936 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 
192.168.32.10:60536->192.168.32.10:35211: read tcp 192.168.32.10:60536->192.168.32.10:35211: read: connection reset by peer Dec 05 13:12:52.378527 master-0 kubenswrapper[29936]: E1205 13:12:52.378445 29936 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:60536->192.168.32.10:35211: write tcp 192.168.32.10:60536->192.168.32.10:35211: write: broken pipe Dec 05 13:13:16.117969 master-0 kubenswrapper[29936]: I1205 13:13:16.117883 29936 scope.go:117] "RemoveContainer" containerID="d37f4922a4c5f2d7041603a8e09aa08447bad4f98e7d1beff4eea2d643f97308" Dec 05 13:13:16.189801 master-0 kubenswrapper[29936]: I1205 13:13:16.189726 29936 scope.go:117] "RemoveContainer" containerID="831e37b16587efc185886f30c74346daf9db35cf41f7b5b2159773c166b4cf09" Dec 05 13:13:16.219171 master-0 kubenswrapper[29936]: I1205 13:13:16.219096 29936 scope.go:117] "RemoveContainer" containerID="41169359ec3c597beeee9a90dcb50a3caab0d83e783731a75c2549bc3df5ee9d" Dec 05 13:13:16.275391 master-0 kubenswrapper[29936]: I1205 13:13:16.275346 29936 scope.go:117] "RemoveContainer" containerID="70524485e77b0f63ff22a28523a8a5fd365423c09029a91b199af308f5ff9cc9" Dec 05 13:13:16.320595 master-0 kubenswrapper[29936]: I1205 13:13:16.320536 29936 scope.go:117] "RemoveContainer" containerID="629a2e1c62ee9742cb533842e08303c5cee1a87130f73b0f0f822f74d7518e8d" Dec 05 13:13:16.358625 master-0 kubenswrapper[29936]: I1205 13:13:16.358568 29936 scope.go:117] "RemoveContainer" containerID="9a3b05f6ff88b9c96448238538c8a34f6caf50d50918bd415887609dfb3fc194" Dec 05 13:13:27.843484 master-0 kubenswrapper[29936]: I1205 13:13:27.843376 29936 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-2fccf" podUID="22d3af20-d89a-46a1-a8cc-82ca1b92e325" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 13:13:32.283980 master-0 kubenswrapper[29936]: I1205 13:13:32.283840 29936 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" podUID="f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.162:8081/healthz\": dial tcp 10.128.0.162:8081: connect: connection refused" Dec 05 13:13:32.284895 master-0 kubenswrapper[29936]: I1205 13:13:32.284111 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" podUID="f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.162:8081/readyz\": dial tcp 10.128.0.162:8081: connect: connection refused" Dec 05 13:13:36.101596 master-0 kubenswrapper[29936]: I1205 13:13:36.101490 29936 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" podUID="ee9e12a6-a899-4e44-b4e1-d975493b6b9c" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.154:8081/healthz\": dial tcp 10.128.0.154:8081: connect: connection refused" Dec 05 13:13:36.102426 master-0 kubenswrapper[29936]: I1205 13:13:36.101563 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" podUID="ee9e12a6-a899-4e44-b4e1-d975493b6b9c" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.154:8081/readyz\": dial tcp 
10.128.0.154:8081: connect: connection refused" Dec 05 13:13:36.661314 master-0 kubenswrapper[29936]: I1205 13:13:36.661140 29936 generic.go:334] "Generic (PLEG): container finished" podID="6e8afa75-0149-45e2-8015-1c519267961c" containerID="15762cebb5368bb9e3aef28eeef55563c81d2c20b15cc1b9953470093d4e003d" exitCode=1 Dec 05 13:13:36.661759 master-0 kubenswrapper[29936]: I1205 13:13:36.661232 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" event={"ID":"6e8afa75-0149-45e2-8015-1c519267961c","Type":"ContainerDied","Data":"15762cebb5368bb9e3aef28eeef55563c81d2c20b15cc1b9953470093d4e003d"} Dec 05 13:13:36.662776 master-0 kubenswrapper[29936]: I1205 13:13:36.662759 29936 scope.go:117] "RemoveContainer" containerID="15762cebb5368bb9e3aef28eeef55563c81d2c20b15cc1b9953470093d4e003d" Dec 05 13:13:36.665328 master-0 kubenswrapper[29936]: I1205 13:13:36.665301 29936 generic.go:334] "Generic (PLEG): container finished" podID="49bd6523-c715-46b2-8112-070019badeed" containerID="45e5fcf585d5814ad5b0cbe7f317814e0dfe349afc792a92ca00c4f55bebf638" exitCode=1 Dec 05 13:13:36.665402 master-0 kubenswrapper[29936]: I1205 13:13:36.665360 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" event={"ID":"49bd6523-c715-46b2-8112-070019badeed","Type":"ContainerDied","Data":"45e5fcf585d5814ad5b0cbe7f317814e0dfe349afc792a92ca00c4f55bebf638"} Dec 05 13:13:36.665771 master-0 kubenswrapper[29936]: I1205 13:13:36.665753 29936 scope.go:117] "RemoveContainer" containerID="45e5fcf585d5814ad5b0cbe7f317814e0dfe349afc792a92ca00c4f55bebf638" Dec 05 13:13:36.668892 master-0 kubenswrapper[29936]: I1205 13:13:36.668869 29936 generic.go:334] "Generic (PLEG): container finished" podID="ee9e12a6-a899-4e44-b4e1-d975493b6b9c" containerID="96b6da81f888d72f7ddf48309b989a8dd4260e98e2589be7969922301c257b92" exitCode=1 Dec 05 13:13:36.668971 master-0 kubenswrapper[29936]: I1205 13:13:36.668930 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" event={"ID":"ee9e12a6-a899-4e44-b4e1-d975493b6b9c","Type":"ContainerDied","Data":"96b6da81f888d72f7ddf48309b989a8dd4260e98e2589be7969922301c257b92"} Dec 05 13:13:36.669372 master-0 kubenswrapper[29936]: I1205 13:13:36.669332 29936 scope.go:117] "RemoveContainer" containerID="96b6da81f888d72f7ddf48309b989a8dd4260e98e2589be7969922301c257b92" Dec 05 13:13:36.673010 master-0 kubenswrapper[29936]: I1205 13:13:36.672952 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_f516c058086fe449b55cd324bd8e0223/kube-controller-manager/0.log" Dec 05 13:13:36.673238 master-0 kubenswrapper[29936]: I1205 13:13:36.673032 29936 generic.go:334] "Generic (PLEG): container finished" podID="f516c058086fe449b55cd324bd8e0223" containerID="1eb3bb2c03aeeaa24ad3f262fd7d7d83b942951736136a2909649e2a3fb4fa9b" exitCode=1 Dec 05 13:13:36.673292 master-0 kubenswrapper[29936]: I1205 13:13:36.673160 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"f516c058086fe449b55cd324bd8e0223","Type":"ContainerDied","Data":"1eb3bb2c03aeeaa24ad3f262fd7d7d83b942951736136a2909649e2a3fb4fa9b"} Dec 05 13:13:36.674946 master-0 kubenswrapper[29936]: I1205 13:13:36.674897 29936 scope.go:117] "RemoveContainer" 
containerID="1eb3bb2c03aeeaa24ad3f262fd7d7d83b942951736136a2909649e2a3fb4fa9b" Dec 05 13:13:36.677234 master-0 kubenswrapper[29936]: I1205 13:13:36.677164 29936 generic.go:334] "Generic (PLEG): container finished" podID="a9859597-6e73-4398-9adb-030bd647faa2" containerID="ab9474d81952a0874267597bc69764d51fcc9e1e5e96fb8a9bcad6dc4216d10d" exitCode=1 Dec 05 13:13:36.677325 master-0 kubenswrapper[29936]: I1205 13:13:36.677227 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" event={"ID":"a9859597-6e73-4398-9adb-030bd647faa2","Type":"ContainerDied","Data":"ab9474d81952a0874267597bc69764d51fcc9e1e5e96fb8a9bcad6dc4216d10d"} Dec 05 13:13:36.678548 master-0 kubenswrapper[29936]: I1205 13:13:36.678495 29936 scope.go:117] "RemoveContainer" containerID="ab9474d81952a0874267597bc69764d51fcc9e1e5e96fb8a9bcad6dc4216d10d" Dec 05 13:13:36.680540 master-0 kubenswrapper[29936]: I1205 13:13:36.680498 29936 generic.go:334] "Generic (PLEG): container finished" podID="f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9" containerID="1bf2e35c67c3218c13bc82b693bd3640510210e6eaed8961cfe29daa9ba8cb73" exitCode=1 Dec 05 13:13:36.680664 master-0 kubenswrapper[29936]: I1205 13:13:36.680538 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" event={"ID":"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9","Type":"ContainerDied","Data":"1bf2e35c67c3218c13bc82b693bd3640510210e6eaed8961cfe29daa9ba8cb73"} Dec 05 13:13:36.681132 master-0 kubenswrapper[29936]: I1205 13:13:36.681092 29936 scope.go:117] "RemoveContainer" containerID="1bf2e35c67c3218c13bc82b693bd3640510210e6eaed8961cfe29daa9ba8cb73" Dec 05 13:13:36.734016 master-0 kubenswrapper[29936]: I1205 13:13:36.733917 29936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" Dec 05 13:13:36.734016 master-0 kubenswrapper[29936]: I1205 13:13:36.733995 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" Dec 05 13:13:36.791012 master-0 kubenswrapper[29936]: I1205 13:13:36.790917 29936 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" Dec 05 13:13:36.791012 master-0 kubenswrapper[29936]: I1205 13:13:36.790977 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" Dec 05 13:13:37.715013 master-0 kubenswrapper[29936]: I1205 13:13:37.714930 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" event={"ID":"ee9e12a6-a899-4e44-b4e1-d975493b6b9c","Type":"ContainerStarted","Data":"6478ca3e4a608a849d9b81d8c96477d5c26519d2b05077c7dffed7ca76e18042"} Dec 05 13:13:37.717055 master-0 kubenswrapper[29936]: I1205 13:13:37.717021 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" Dec 05 13:13:37.726161 master-0 kubenswrapper[29936]: I1205 13:13:37.726078 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" 
event={"ID":"a9859597-6e73-4398-9adb-030bd647faa2","Type":"ContainerStarted","Data":"7ebba346e2905cbca5e5d4bd30c5ca2c622df2d1ed379b749387b6598fb6e596"} Dec 05 13:13:37.727660 master-0 kubenswrapper[29936]: I1205 13:13:37.727618 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" Dec 05 13:13:37.735342 master-0 kubenswrapper[29936]: I1205 13:13:37.733614 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" event={"ID":"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9","Type":"ContainerStarted","Data":"321161d05cd0c512e03cc9c7c5780c21ae2336cde18ecdbcb95e96b72bc3ead8"} Dec 05 13:13:37.735505 master-0 kubenswrapper[29936]: I1205 13:13:37.735432 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:13:37.750503 master-0 kubenswrapper[29936]: I1205 13:13:37.750438 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" event={"ID":"6e8afa75-0149-45e2-8015-1c519267961c","Type":"ContainerStarted","Data":"3fd38880aae6161003b43e52213fa9ca64e765c2f83ed746ca6892842d567d56"} Dec 05 13:13:37.751770 master-0 kubenswrapper[29936]: I1205 13:13:37.751740 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" Dec 05 13:13:37.772704 master-0 kubenswrapper[29936]: I1205 13:13:37.772613 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" event={"ID":"49bd6523-c715-46b2-8112-070019badeed","Type":"ContainerStarted","Data":"ed07333b823e7ca2042f0e70dca1b26b97185ed863a7ae192cee1e6f24ade680"} Dec 05 13:13:37.773865 master-0 kubenswrapper[29936]: I1205 13:13:37.773822 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" Dec 05 13:13:37.785941 master-0 kubenswrapper[29936]: I1205 13:13:37.783986 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_f516c058086fe449b55cd324bd8e0223/kube-controller-manager/0.log" Dec 05 13:13:37.785941 master-0 kubenswrapper[29936]: I1205 13:13:37.784091 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"f516c058086fe449b55cd324bd8e0223","Type":"ContainerStarted","Data":"db78fded0908ab1d893089c2a39c6969ef7e2e9ef72e3e53539f9f059f5a00bc"} Dec 05 13:13:40.291772 master-0 kubenswrapper[29936]: I1205 13:13:40.291691 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 13:13:40.291772 master-0 kubenswrapper[29936]: I1205 13:13:40.291772 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 13:13:40.297130 master-0 kubenswrapper[29936]: I1205 13:13:40.297042 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 13:13:42.287768 master-0 kubenswrapper[29936]: I1205 13:13:42.287641 29936 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:13:46.104320 master-0 kubenswrapper[29936]: I1205 13:13:46.104089 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" Dec 05 13:13:46.737258 master-0 kubenswrapper[29936]: I1205 13:13:46.737156 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" Dec 05 13:13:46.795404 master-0 kubenswrapper[29936]: I1205 13:13:46.795343 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" Dec 05 13:13:50.296275 master-0 kubenswrapper[29936]: I1205 13:13:50.296085 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 05 13:14:10.856995 master-0 kubenswrapper[29936]: I1205 13:14:10.856889 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-57dccff46-h9ncv" Dec 05 13:14:16.515062 master-0 kubenswrapper[29936]: I1205 13:14:16.514957 29936 scope.go:117] "RemoveContainer" containerID="9d6204ffa9c9da109cce9a10c42e3791e0ef6e7d747178a39d437fb322e9776e" Dec 05 13:14:16.558482 master-0 kubenswrapper[29936]: I1205 13:14:16.558386 29936 scope.go:117] "RemoveContainer" containerID="72a75b024f9b496d5f1a701d70ada1f8176955f620482d82708f8692efa5dc7f" Dec 05 13:14:16.596098 master-0 kubenswrapper[29936]: I1205 13:14:16.595991 29936 scope.go:117] "RemoveContainer" containerID="378b55a682e4564b915ba4bef2cf9a5345a60e9b2084e132467dc39d8de40969" Dec 05 13:15:00.212916 master-0 kubenswrapper[29936]: I1205 13:15:00.212840 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k"] Dec 05 13:15:00.214785 master-0 kubenswrapper[29936]: I1205 13:15:00.214751 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k" Dec 05 13:15:00.217150 master-0 kubenswrapper[29936]: I1205 13:15:00.217110 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 13:15:00.217913 master-0 kubenswrapper[29936]: I1205 13:15:00.217857 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-rdxkm" Dec 05 13:15:00.241971 master-0 kubenswrapper[29936]: I1205 13:15:00.241874 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k"] Dec 05 13:15:00.289148 master-0 kubenswrapper[29936]: I1205 13:15:00.289046 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-secret-volume\") pod \"collect-profiles-29415675-7w77k\" (UID: \"d6fb6763-7d6a-4aa9-9cdc-46306eeac490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k" Dec 05 13:15:00.289483 master-0 kubenswrapper[29936]: I1205 13:15:00.289306 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-config-volume\") pod \"collect-profiles-29415675-7w77k\" (UID: \"d6fb6763-7d6a-4aa9-9cdc-46306eeac490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k" Dec 05 13:15:00.289909 master-0 kubenswrapper[29936]: I1205 13:15:00.289839 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcxwz\" (UniqueName: \"kubernetes.io/projected/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-kube-api-access-kcxwz\") pod \"collect-profiles-29415675-7w77k\" (UID: \"d6fb6763-7d6a-4aa9-9cdc-46306eeac490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k" Dec 05 13:15:00.392965 master-0 kubenswrapper[29936]: I1205 13:15:00.392882 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcxwz\" (UniqueName: \"kubernetes.io/projected/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-kube-api-access-kcxwz\") pod \"collect-profiles-29415675-7w77k\" (UID: \"d6fb6763-7d6a-4aa9-9cdc-46306eeac490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k" Dec 05 13:15:00.393341 master-0 kubenswrapper[29936]: I1205 13:15:00.393118 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-secret-volume\") pod \"collect-profiles-29415675-7w77k\" (UID: \"d6fb6763-7d6a-4aa9-9cdc-46306eeac490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k" Dec 05 13:15:00.393341 master-0 kubenswrapper[29936]: I1205 13:15:00.393170 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-config-volume\") pod \"collect-profiles-29415675-7w77k\" (UID: \"d6fb6763-7d6a-4aa9-9cdc-46306eeac490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k" Dec 05 13:15:00.394442 master-0 kubenswrapper[29936]: I1205 13:15:00.394404 29936 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-config-volume\") pod \"collect-profiles-29415675-7w77k\" (UID: \"d6fb6763-7d6a-4aa9-9cdc-46306eeac490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k" Dec 05 13:15:00.397723 master-0 kubenswrapper[29936]: I1205 13:15:00.397661 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-secret-volume\") pod \"collect-profiles-29415675-7w77k\" (UID: \"d6fb6763-7d6a-4aa9-9cdc-46306eeac490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k" Dec 05 13:15:00.416964 master-0 kubenswrapper[29936]: I1205 13:15:00.416875 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcxwz\" (UniqueName: \"kubernetes.io/projected/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-kube-api-access-kcxwz\") pod \"collect-profiles-29415675-7w77k\" (UID: \"d6fb6763-7d6a-4aa9-9cdc-46306eeac490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k" Dec 05 13:15:00.538897 master-0 kubenswrapper[29936]: I1205 13:15:00.538746 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k" Dec 05 13:15:01.000837 master-0 kubenswrapper[29936]: I1205 13:15:01.000754 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k"] Dec 05 13:15:01.006860 master-0 kubenswrapper[29936]: W1205 13:15:01.006803 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6fb6763_7d6a_4aa9_9cdc_46306eeac490.slice/crio-b5fbae5f6c547194c1c9b5e4203643acf037764b70f2895519798d98ebcdebc3 WatchSource:0}: Error finding container b5fbae5f6c547194c1c9b5e4203643acf037764b70f2895519798d98ebcdebc3: Status 404 returned error can't find the container with id b5fbae5f6c547194c1c9b5e4203643acf037764b70f2895519798d98ebcdebc3 Dec 05 13:15:01.109610 master-0 kubenswrapper[29936]: I1205 13:15:01.109538 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k" event={"ID":"d6fb6763-7d6a-4aa9-9cdc-46306eeac490","Type":"ContainerStarted","Data":"b5fbae5f6c547194c1c9b5e4203643acf037764b70f2895519798d98ebcdebc3"} Dec 05 13:15:02.130480 master-0 kubenswrapper[29936]: I1205 13:15:02.130411 29936 generic.go:334] "Generic (PLEG): container finished" podID="d6fb6763-7d6a-4aa9-9cdc-46306eeac490" containerID="75c4fa61711dde0e5f5298be971b497b6d5374b60e5a18ce130d43190d1c6790" exitCode=0 Dec 05 13:15:02.131234 master-0 kubenswrapper[29936]: I1205 13:15:02.130501 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k" event={"ID":"d6fb6763-7d6a-4aa9-9cdc-46306eeac490","Type":"ContainerDied","Data":"75c4fa61711dde0e5f5298be971b497b6d5374b60e5a18ce130d43190d1c6790"} Dec 05 13:15:03.600511 master-0 kubenswrapper[29936]: I1205 13:15:03.600448 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k" Dec 05 13:15:03.783537 master-0 kubenswrapper[29936]: I1205 13:15:03.783325 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcxwz\" (UniqueName: \"kubernetes.io/projected/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-kube-api-access-kcxwz\") pod \"d6fb6763-7d6a-4aa9-9cdc-46306eeac490\" (UID: \"d6fb6763-7d6a-4aa9-9cdc-46306eeac490\") " Dec 05 13:15:03.783537 master-0 kubenswrapper[29936]: I1205 13:15:03.783436 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-secret-volume\") pod \"d6fb6763-7d6a-4aa9-9cdc-46306eeac490\" (UID: \"d6fb6763-7d6a-4aa9-9cdc-46306eeac490\") " Dec 05 13:15:03.783864 master-0 kubenswrapper[29936]: I1205 13:15:03.783654 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-config-volume\") pod \"d6fb6763-7d6a-4aa9-9cdc-46306eeac490\" (UID: \"d6fb6763-7d6a-4aa9-9cdc-46306eeac490\") " Dec 05 13:15:03.784669 master-0 kubenswrapper[29936]: I1205 13:15:03.784618 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-config-volume" (OuterVolumeSpecName: "config-volume") pod "d6fb6763-7d6a-4aa9-9cdc-46306eeac490" (UID: "d6fb6763-7d6a-4aa9-9cdc-46306eeac490"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:15:03.787549 master-0 kubenswrapper[29936]: I1205 13:15:03.787491 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-kube-api-access-kcxwz" (OuterVolumeSpecName: "kube-api-access-kcxwz") pod "d6fb6763-7d6a-4aa9-9cdc-46306eeac490" (UID: "d6fb6763-7d6a-4aa9-9cdc-46306eeac490"). InnerVolumeSpecName "kube-api-access-kcxwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:15:03.806684 master-0 kubenswrapper[29936]: I1205 13:15:03.806612 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d6fb6763-7d6a-4aa9-9cdc-46306eeac490" (UID: "d6fb6763-7d6a-4aa9-9cdc-46306eeac490"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:15:03.887286 master-0 kubenswrapper[29936]: I1205 13:15:03.887199 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcxwz\" (UniqueName: \"kubernetes.io/projected/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-kube-api-access-kcxwz\") on node \"master-0\" DevicePath \"\"" Dec 05 13:15:03.887286 master-0 kubenswrapper[29936]: I1205 13:15:03.887258 29936 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 05 13:15:03.887286 master-0 kubenswrapper[29936]: I1205 13:15:03.887268 29936 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6fb6763-7d6a-4aa9-9cdc-46306eeac490-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 05 13:15:04.155924 master-0 kubenswrapper[29936]: I1205 13:15:04.155736 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k" event={"ID":"d6fb6763-7d6a-4aa9-9cdc-46306eeac490","Type":"ContainerDied","Data":"b5fbae5f6c547194c1c9b5e4203643acf037764b70f2895519798d98ebcdebc3"} Dec 05 13:15:04.155924 master-0 kubenswrapper[29936]: I1205 13:15:04.155796 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5fbae5f6c547194c1c9b5e4203643acf037764b70f2895519798d98ebcdebc3" Dec 05 13:15:04.155924 master-0 kubenswrapper[29936]: I1205 13:15:04.155812 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415675-7w77k" Dec 05 13:15:16.731921 master-0 kubenswrapper[29936]: I1205 13:15:16.731829 29936 scope.go:117] "RemoveContainer" containerID="edaaf002e5a85715d16c07bb939e87620e427ec80567ef22989739166fd43448" Dec 05 13:15:16.786122 master-0 kubenswrapper[29936]: I1205 13:15:16.786048 29936 scope.go:117] "RemoveContainer" containerID="611789a47c38f1fdc0977a80a51e6e86b0b6450cebc4e4bb4f961783beec151a" Dec 05 13:15:36.104772 master-0 kubenswrapper[29936]: I1205 13:15:36.104625 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-53e0-account-create-update-2xtnl"] Dec 05 13:15:36.127597 master-0 kubenswrapper[29936]: I1205 13:15:36.127494 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-53e0-account-create-update-2xtnl"] Dec 05 13:15:37.040725 master-0 kubenswrapper[29936]: I1205 13:15:37.040635 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-2k6sd"] Dec 05 13:15:37.055420 master-0 kubenswrapper[29936]: I1205 13:15:37.055334 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-tlq59"] Dec 05 13:15:37.074491 master-0 kubenswrapper[29936]: I1205 13:15:37.074409 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-80a7-account-create-update-j252l"] Dec 05 13:15:37.091832 master-0 kubenswrapper[29936]: I1205 13:15:37.091632 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-cbcfg"] Dec 05 13:15:37.104580 master-0 kubenswrapper[29936]: I1205 13:15:37.104536 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-2k6sd"] Dec 05 13:15:37.119777 master-0 kubenswrapper[29936]: I1205 13:15:37.119719 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-4269-account-create-update-rmqxq"] Dec 05 13:15:37.135672 master-0 kubenswrapper[29936]: I1205 13:15:37.135585 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-tlq59"] Dec 05 13:15:37.155169 master-0 kubenswrapper[29936]: I1205 13:15:37.155085 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-80a7-account-create-update-j252l"] Dec 05 13:15:37.172707 master-0 kubenswrapper[29936]: I1205 13:15:37.172551 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4269-account-create-update-rmqxq"] Dec 05 13:15:37.204569 master-0 kubenswrapper[29936]: I1205 13:15:37.204485 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44320335-848c-4aa2-b78b-672d29137770" path="/var/lib/kubelet/pods/44320335-848c-4aa2-b78b-672d29137770/volumes" Dec 05 13:15:37.206095 master-0 kubenswrapper[29936]: I1205 13:15:37.205478 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60f33781-dd68-4b4b-8ca7-7b271a1aa195" path="/var/lib/kubelet/pods/60f33781-dd68-4b4b-8ca7-7b271a1aa195/volumes" Dec 05 13:15:37.206296 master-0 kubenswrapper[29936]: I1205 13:15:37.206154 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff" path="/var/lib/kubelet/pods/64e263cb-cbe8-4ab2-ada8-c68ea8ac41ff/volumes" Dec 05 13:15:37.207335 master-0 kubenswrapper[29936]: I1205 13:15:37.206895 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87999eba-73aa-43cf-be9e-1e07b1dc22e0" path="/var/lib/kubelet/pods/87999eba-73aa-43cf-be9e-1e07b1dc22e0/volumes" Dec 05 13:15:37.208173 master-0 kubenswrapper[29936]: I1205 13:15:37.208137 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8c6d8ef-5d6f-475e-8533-e1879fc64f74" path="/var/lib/kubelet/pods/b8c6d8ef-5d6f-475e-8533-e1879fc64f74/volumes" Dec 05 13:15:37.208957 master-0 kubenswrapper[29936]: I1205 13:15:37.208922 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-cbcfg"] Dec 05 13:15:39.203219 master-0 kubenswrapper[29936]: I1205 13:15:39.203142 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35238950-d610-4820-bd1f-2aa4ded2c93b" path="/var/lib/kubelet/pods/35238950-d610-4820-bd1f-2aa4ded2c93b/volumes" Dec 05 13:16:06.272443 master-0 kubenswrapper[29936]: I1205 13:16:06.272271 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c9b1-account-create-update-6mbv8"] Dec 05 13:16:06.304384 master-0 kubenswrapper[29936]: I1205 13:16:06.302530 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-73e4-account-create-update-z9x2z"] Dec 05 13:16:06.316619 master-0 kubenswrapper[29936]: I1205 13:16:06.316498 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-4x574"] Dec 05 13:16:06.330952 master-0 kubenswrapper[29936]: I1205 13:16:06.330839 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-92nh8"] Dec 05 13:16:06.362503 master-0 kubenswrapper[29936]: I1205 13:16:06.362374 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c9b1-account-create-update-6mbv8"] Dec 05 13:16:06.400376 master-0 kubenswrapper[29936]: I1205 13:16:06.400250 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-92nh8"] Dec 05 13:16:06.414033 master-0 kubenswrapper[29936]: I1205 13:16:06.413983 29936 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-73e4-account-create-update-z9x2z"] Dec 05 13:16:06.426533 master-0 kubenswrapper[29936]: I1205 13:16:06.426445 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-4x574"] Dec 05 13:16:07.201282 master-0 kubenswrapper[29936]: I1205 13:16:07.201207 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a61d97-31ab-485e-9f4c-f18097ce33c7" path="/var/lib/kubelet/pods/02a61d97-31ab-485e-9f4c-f18097ce33c7/volumes" Dec 05 13:16:07.201887 master-0 kubenswrapper[29936]: I1205 13:16:07.201853 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="376c0716-8fd1-432f-9e4b-a3b21373a7cc" path="/var/lib/kubelet/pods/376c0716-8fd1-432f-9e4b-a3b21373a7cc/volumes" Dec 05 13:16:07.202669 master-0 kubenswrapper[29936]: I1205 13:16:07.202636 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c67a1ed-b95a-414f-8f7d-972c98a55a88" path="/var/lib/kubelet/pods/4c67a1ed-b95a-414f-8f7d-972c98a55a88/volumes" Dec 05 13:16:07.203492 master-0 kubenswrapper[29936]: I1205 13:16:07.203460 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80a34c12-cc34-47c1-af11-6c935c757db4" path="/var/lib/kubelet/pods/80a34c12-cc34-47c1-af11-6c935c757db4/volumes" Dec 05 13:16:12.046288 master-0 kubenswrapper[29936]: I1205 13:16:12.046211 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-dp8qv"] Dec 05 13:16:12.057755 master-0 kubenswrapper[29936]: I1205 13:16:12.057674 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-dp8qv"] Dec 05 13:16:13.053713 master-0 kubenswrapper[29936]: I1205 13:16:13.053591 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-448s4"] Dec 05 13:16:13.070740 master-0 kubenswrapper[29936]: I1205 13:16:13.070651 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-448s4"] Dec 05 13:16:13.201364 master-0 kubenswrapper[29936]: I1205 13:16:13.201291 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a7bb352-8943-448f-ad3f-a06ebd4b8b30" path="/var/lib/kubelet/pods/5a7bb352-8943-448f-ad3f-a06ebd4b8b30/volumes" Dec 05 13:16:13.202060 master-0 kubenswrapper[29936]: I1205 13:16:13.202019 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1513da7-52be-4f09-8b6d-09e494522e1e" path="/var/lib/kubelet/pods/e1513da7-52be-4f09-8b6d-09e494522e1e/volumes" Dec 05 13:16:16.861507 master-0 kubenswrapper[29936]: I1205 13:16:16.861425 29936 scope.go:117] "RemoveContainer" containerID="7bcfd006685c204c32486a1fb0c7e6bfcbc4d18da7525eb75adb1ea9379958fd" Dec 05 13:16:16.922012 master-0 kubenswrapper[29936]: I1205 13:16:16.921884 29936 scope.go:117] "RemoveContainer" containerID="0301efa77391c1e2edf5f153ae70ca5bb69dea80043c28ef95573231dea6dbbf" Dec 05 13:16:16.978722 master-0 kubenswrapper[29936]: I1205 13:16:16.978660 29936 scope.go:117] "RemoveContainer" containerID="05d63a82c97a5d89a61d6472e6852f05c37c42d4e10026ed2d27d0e960357d7d" Dec 05 13:16:17.039116 master-0 kubenswrapper[29936]: I1205 13:16:17.039007 29936 scope.go:117] "RemoveContainer" containerID="96c4c247cf4c258d3fbcffc77932796c618ab7cc8392fe3caae8f89e52bb86cb" Dec 05 13:16:17.107560 master-0 kubenswrapper[29936]: I1205 13:16:17.107474 29936 scope.go:117] "RemoveContainer" containerID="096fabf329918aaa19f0dd01ada94924b6d9ef114f7c552fe8a648c12cd8583b" Dec 05 13:16:17.152038 master-0 
kubenswrapper[29936]: I1205 13:16:17.151972 29936 scope.go:117] "RemoveContainer" containerID="d9dffbcd65d9fd967677c7e200645f9179512261eecadecfb2c00821dffdcb61" Dec 05 13:16:17.234665 master-0 kubenswrapper[29936]: I1205 13:16:17.234548 29936 scope.go:117] "RemoveContainer" containerID="c35f4ecfc3027159b7f91cbca16cb55907b08161f9b5b6aa2204b60d73b48bd6" Dec 05 13:16:17.260551 master-0 kubenswrapper[29936]: I1205 13:16:17.260488 29936 scope.go:117] "RemoveContainer" containerID="46d0bd9a939bd564c8f213558390311e1364aa3f24da86fb352ef1405f8ab17d" Dec 05 13:16:17.290733 master-0 kubenswrapper[29936]: I1205 13:16:17.290671 29936 scope.go:117] "RemoveContainer" containerID="82a7d9539ee911401290ec1f18bd564c6fe1dba97b22f92a893c7bd203032802" Dec 05 13:16:17.316813 master-0 kubenswrapper[29936]: I1205 13:16:17.316764 29936 scope.go:117] "RemoveContainer" containerID="5526388e8dcb2c46558dfa38fb13dcfba8dace91895cb24abc65d005040f0351" Dec 05 13:16:17.351416 master-0 kubenswrapper[29936]: I1205 13:16:17.351367 29936 scope.go:117] "RemoveContainer" containerID="3ca7a76e0f17797de05b21839a6aa098bc369697e96ab92c0e53276e210ee4d6" Dec 05 13:16:17.393439 master-0 kubenswrapper[29936]: I1205 13:16:17.393375 29936 scope.go:117] "RemoveContainer" containerID="aad82ce45003a366b65f76f7465e5da00d59d6cb694847622cd837271d0bd783" Dec 05 13:16:22.060936 master-0 kubenswrapper[29936]: I1205 13:16:22.060840 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-0279-account-create-update-wclm4"] Dec 05 13:16:22.073965 master-0 kubenswrapper[29936]: I1205 13:16:22.073885 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-create-9fz4t"] Dec 05 13:16:22.086993 master-0 kubenswrapper[29936]: I1205 13:16:22.086937 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-0279-account-create-update-wclm4"] Dec 05 13:16:22.098768 master-0 kubenswrapper[29936]: I1205 13:16:22.098663 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-create-9fz4t"] Dec 05 13:16:23.205327 master-0 kubenswrapper[29936]: I1205 13:16:23.205237 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72d08438-77de-4a6e-81d7-a9f76078a1b6" path="/var/lib/kubelet/pods/72d08438-77de-4a6e-81d7-a9f76078a1b6/volumes" Dec 05 13:16:23.206160 master-0 kubenswrapper[29936]: I1205 13:16:23.206130 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d76a2c1-3728-4104-b208-67b329e52d70" path="/var/lib/kubelet/pods/9d76a2c1-3728-4104-b208-67b329e52d70/volumes" Dec 05 13:16:37.071959 master-0 kubenswrapper[29936]: I1205 13:16:37.071845 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-fqqdf"] Dec 05 13:16:37.088921 master-0 kubenswrapper[29936]: I1205 13:16:37.088821 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-fqqdf"] Dec 05 13:16:37.205502 master-0 kubenswrapper[29936]: I1205 13:16:37.205385 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21fcc891-90c5-47e7-97e9-e852adfae2bb" path="/var/lib/kubelet/pods/21fcc891-90c5-47e7-97e9-e852adfae2bb/volumes" Dec 05 13:16:45.041805 master-0 kubenswrapper[29936]: I1205 13:16:45.041690 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kcsv8"] Dec 05 13:16:45.056334 master-0 kubenswrapper[29936]: I1205 13:16:45.056255 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kcsv8"] Dec 05 13:16:45.200531 
master-0 kubenswrapper[29936]: I1205 13:16:45.200460 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d7d168-a010-44ef-b2cb-1ec979fb38c6" path="/var/lib/kubelet/pods/d4d7d168-a010-44ef-b2cb-1ec979fb38c6/volumes" Dec 05 13:16:49.055762 master-0 kubenswrapper[29936]: I1205 13:16:49.055669 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b46d8-db-sync-6scb5"] Dec 05 13:16:49.070934 master-0 kubenswrapper[29936]: I1205 13:16:49.070856 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b46d8-db-sync-6scb5"] Dec 05 13:16:49.200999 master-0 kubenswrapper[29936]: I1205 13:16:49.200916 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b82bd984-5f64-433b-a41a-f5186287a0f7" path="/var/lib/kubelet/pods/b82bd984-5f64-433b-a41a-f5186287a0f7/volumes" Dec 05 13:16:51.050776 master-0 kubenswrapper[29936]: I1205 13:16:51.050671 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-d6bnq"] Dec 05 13:16:51.061973 master-0 kubenswrapper[29936]: I1205 13:16:51.061904 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-d6bnq"] Dec 05 13:16:51.203255 master-0 kubenswrapper[29936]: I1205 13:16:51.203138 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="549e5366-4e6e-4d97-aeb2-25f74ce81b4b" path="/var/lib/kubelet/pods/549e5366-4e6e-4d97-aeb2-25f74ce81b4b/volumes" Dec 05 13:17:03.049094 master-0 kubenswrapper[29936]: I1205 13:17:03.049021 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-sync-h9l4n"] Dec 05 13:17:03.064127 master-0 kubenswrapper[29936]: I1205 13:17:03.064032 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-sync-h9l4n"] Dec 05 13:17:03.202767 master-0 kubenswrapper[29936]: I1205 13:17:03.202678 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1690e553-8b77-483f-9f31-4f3968e6bd28" path="/var/lib/kubelet/pods/1690e553-8b77-483f-9f31-4f3968e6bd28/volumes" Dec 05 13:17:10.050133 master-0 kubenswrapper[29936]: I1205 13:17:10.050054 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-create-m94x4"] Dec 05 13:17:10.065046 master-0 kubenswrapper[29936]: I1205 13:17:10.064950 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-create-m94x4"] Dec 05 13:17:11.213770 master-0 kubenswrapper[29936]: I1205 13:17:11.213652 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7777eb1-fa7a-4f4b-8887-da54c42cff61" path="/var/lib/kubelet/pods/e7777eb1-fa7a-4f4b-8887-da54c42cff61/volumes" Dec 05 13:17:13.050321 master-0 kubenswrapper[29936]: I1205 13:17:13.050170 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-2c1e-account-create-update-6qdwp"] Dec 05 13:17:13.063985 master-0 kubenswrapper[29936]: I1205 13:17:13.063893 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-2c1e-account-create-update-6qdwp"] Dec 05 13:17:13.202341 master-0 kubenswrapper[29936]: I1205 13:17:13.202264 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="188a05a6-3a61-4472-b383-f44f2d022d08" path="/var/lib/kubelet/pods/188a05a6-3a61-4472-b383-f44f2d022d08/volumes" Dec 05 13:17:17.729418 master-0 kubenswrapper[29936]: I1205 13:17:17.729279 29936 scope.go:117] "RemoveContainer" containerID="d1a6313a656db2f5677f30dc51d1c451284d4f2dcdeffbdb49fd9efd782328c9" Dec 
05 13:17:17.759327 master-0 kubenswrapper[29936]: I1205 13:17:17.759126 29936 scope.go:117] "RemoveContainer" containerID="141ab7a29ac7f33ed8f5ff26052076988c5f1c4aeaf98072d1684f5c36ae252d" Dec 05 13:17:17.859788 master-0 kubenswrapper[29936]: I1205 13:17:17.859728 29936 scope.go:117] "RemoveContainer" containerID="8740745c142be01a2afc231b874b9b9e2b8c88cef19049d5e0abe9d2173d2a0f" Dec 05 13:17:17.899382 master-0 kubenswrapper[29936]: I1205 13:17:17.899326 29936 scope.go:117] "RemoveContainer" containerID="8c9fdd2f2167aecbf9110e725bed0a387f56041962376d469528fcc3bbcc45a3" Dec 05 13:17:17.977691 master-0 kubenswrapper[29936]: I1205 13:17:17.977630 29936 scope.go:117] "RemoveContainer" containerID="5e5f4ceda6016dd2c94bbad5e06f1aed8177635a7cbbb6f34898b4dcbe61a46d" Dec 05 13:17:18.018978 master-0 kubenswrapper[29936]: I1205 13:17:18.018823 29936 scope.go:117] "RemoveContainer" containerID="2ba41f0827deea62e1400136039f07960c0733677767da6118aec92285f52a1d" Dec 05 13:17:18.093708 master-0 kubenswrapper[29936]: I1205 13:17:18.093627 29936 scope.go:117] "RemoveContainer" containerID="159c7b8004572d99bb7e1d0ce96a95f4f155c990d66f4df41aef77d1abfb8361" Dec 05 13:17:18.148130 master-0 kubenswrapper[29936]: I1205 13:17:18.148063 29936 scope.go:117] "RemoveContainer" containerID="32577dcea0dff0a1e68d827f23c3807f29c2ad394ed4eaa10c7f407c6fac643a" Dec 05 13:17:18.183846 master-0 kubenswrapper[29936]: I1205 13:17:18.183745 29936 scope.go:117] "RemoveContainer" containerID="87c7ae63b5c1f83dc43365b898e62fdbd922a603605747e0cef5ff0cc54f165a" Dec 05 13:17:18.224007 master-0 kubenswrapper[29936]: I1205 13:17:18.223944 29936 scope.go:117] "RemoveContainer" containerID="254d7dc18016a8d8ae26387e0351914c2065365423cc1ce28ac2831343ee5f6c" Dec 05 13:17:31.061814 master-0 kubenswrapper[29936]: I1205 13:17:31.061746 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-sync-8czvf"] Dec 05 13:17:31.076556 master-0 kubenswrapper[29936]: I1205 13:17:31.076459 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-sync-8czvf"] Dec 05 13:17:31.215447 master-0 kubenswrapper[29936]: I1205 13:17:31.215350 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32628c1d-e798-427e-97ca-322d9af2971e" path="/var/lib/kubelet/pods/32628c1d-e798-427e-97ca-322d9af2971e/volumes" Dec 05 13:17:54.043527 master-0 kubenswrapper[29936]: I1205 13:17:54.043445 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-mh6s6"] Dec 05 13:17:54.055442 master-0 kubenswrapper[29936]: I1205 13:17:54.055358 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-fkxvl"] Dec 05 13:17:54.068605 master-0 kubenswrapper[29936]: I1205 13:17:54.068527 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-dc7b-account-create-update-6mmc6"] Dec 05 13:17:54.080916 master-0 kubenswrapper[29936]: I1205 13:17:54.080866 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-fkxvl"] Dec 05 13:17:54.092479 master-0 kubenswrapper[29936]: I1205 13:17:54.092407 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-mh6s6"] Dec 05 13:17:54.103203 master-0 kubenswrapper[29936]: I1205 13:17:54.103084 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-dc7b-account-create-update-6mmc6"] Dec 05 13:17:55.203474 master-0 kubenswrapper[29936]: I1205 13:17:55.202948 29936 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="6a056507-477d-4d77-9905-c7c6344e92ec" path="/var/lib/kubelet/pods/6a056507-477d-4d77-9905-c7c6344e92ec/volumes" Dec 05 13:17:55.204129 master-0 kubenswrapper[29936]: I1205 13:17:55.203764 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8eb8d4a-1ed8-445b-a455-e79b6659317f" path="/var/lib/kubelet/pods/a8eb8d4a-1ed8-445b-a455-e79b6659317f/volumes" Dec 05 13:17:55.204666 master-0 kubenswrapper[29936]: I1205 13:17:55.204637 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d934457f-86b7-4ccf-b0da-8625268d2a56" path="/var/lib/kubelet/pods/d934457f-86b7-4ccf-b0da-8625268d2a56/volumes" Dec 05 13:17:56.058803 master-0 kubenswrapper[29936]: I1205 13:17:56.058588 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b154-account-create-update-9nrbh"] Dec 05 13:17:56.077152 master-0 kubenswrapper[29936]: I1205 13:17:56.077074 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b154-account-create-update-9nrbh"] Dec 05 13:17:56.091820 master-0 kubenswrapper[29936]: I1205 13:17:56.091740 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-cw997"] Dec 05 13:17:56.105206 master-0 kubenswrapper[29936]: I1205 13:17:56.105102 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-cw997"] Dec 05 13:17:57.041996 master-0 kubenswrapper[29936]: I1205 13:17:57.041924 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-0961-account-create-update-krxs2"] Dec 05 13:17:57.055845 master-0 kubenswrapper[29936]: I1205 13:17:57.055757 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-0961-account-create-update-krxs2"] Dec 05 13:17:57.201260 master-0 kubenswrapper[29936]: I1205 13:17:57.201197 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03ed197d-e0d9-4970-8d1e-f79fa7c70697" path="/var/lib/kubelet/pods/03ed197d-e0d9-4970-8d1e-f79fa7c70697/volumes" Dec 05 13:17:57.202438 master-0 kubenswrapper[29936]: I1205 13:17:57.202408 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="060bcbc7-c502-431d-8a7d-1f566a91f953" path="/var/lib/kubelet/pods/060bcbc7-c502-431d-8a7d-1f566a91f953/volumes" Dec 05 13:17:57.203158 master-0 kubenswrapper[29936]: I1205 13:17:57.203124 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60288a03-cb34-45c0-a727-0d822e01d9e8" path="/var/lib/kubelet/pods/60288a03-cb34-45c0-a727-0d822e01d9e8/volumes" Dec 05 13:18:14.946094 master-0 kubenswrapper[29936]: I1205 13:18:14.945148 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s5pf8"] Dec 05 13:18:14.946094 master-0 kubenswrapper[29936]: E1205 13:18:14.945882 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fb6763-7d6a-4aa9-9cdc-46306eeac490" containerName="collect-profiles" Dec 05 13:18:14.946094 master-0 kubenswrapper[29936]: I1205 13:18:14.945905 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fb6763-7d6a-4aa9-9cdc-46306eeac490" containerName="collect-profiles" Dec 05 13:18:14.947160 master-0 kubenswrapper[29936]: I1205 13:18:14.946288 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6fb6763-7d6a-4aa9-9cdc-46306eeac490" containerName="collect-profiles" Dec 05 13:18:14.948712 master-0 kubenswrapper[29936]: I1205 13:18:14.948683 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:14.967413 master-0 kubenswrapper[29936]: I1205 13:18:14.967338 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrf79\" (UniqueName: \"kubernetes.io/projected/bc5208bc-09f1-4efc-b293-3f989ca1b97b-kube-api-access-nrf79\") pod \"redhat-marketplace-s5pf8\" (UID: \"bc5208bc-09f1-4efc-b293-3f989ca1b97b\") " pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:14.967701 master-0 kubenswrapper[29936]: I1205 13:18:14.967618 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc5208bc-09f1-4efc-b293-3f989ca1b97b-utilities\") pod \"redhat-marketplace-s5pf8\" (UID: \"bc5208bc-09f1-4efc-b293-3f989ca1b97b\") " pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:14.967774 master-0 kubenswrapper[29936]: I1205 13:18:14.967758 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc5208bc-09f1-4efc-b293-3f989ca1b97b-catalog-content\") pod \"redhat-marketplace-s5pf8\" (UID: \"bc5208bc-09f1-4efc-b293-3f989ca1b97b\") " pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:14.978969 master-0 kubenswrapper[29936]: I1205 13:18:14.978872 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s5pf8"] Dec 05 13:18:15.069411 master-0 kubenswrapper[29936]: I1205 13:18:15.069300 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrf79\" (UniqueName: \"kubernetes.io/projected/bc5208bc-09f1-4efc-b293-3f989ca1b97b-kube-api-access-nrf79\") pod \"redhat-marketplace-s5pf8\" (UID: \"bc5208bc-09f1-4efc-b293-3f989ca1b97b\") " pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:15.069707 master-0 kubenswrapper[29936]: I1205 13:18:15.069458 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc5208bc-09f1-4efc-b293-3f989ca1b97b-utilities\") pod \"redhat-marketplace-s5pf8\" (UID: \"bc5208bc-09f1-4efc-b293-3f989ca1b97b\") " pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:15.069707 master-0 kubenswrapper[29936]: I1205 13:18:15.069554 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc5208bc-09f1-4efc-b293-3f989ca1b97b-catalog-content\") pod \"redhat-marketplace-s5pf8\" (UID: \"bc5208bc-09f1-4efc-b293-3f989ca1b97b\") " pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:15.070082 master-0 kubenswrapper[29936]: I1205 13:18:15.070034 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc5208bc-09f1-4efc-b293-3f989ca1b97b-utilities\") pod \"redhat-marketplace-s5pf8\" (UID: \"bc5208bc-09f1-4efc-b293-3f989ca1b97b\") " pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:15.070296 master-0 kubenswrapper[29936]: I1205 13:18:15.070245 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc5208bc-09f1-4efc-b293-3f989ca1b97b-catalog-content\") pod \"redhat-marketplace-s5pf8\" (UID: \"bc5208bc-09f1-4efc-b293-3f989ca1b97b\") " 
pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:15.085587 master-0 kubenswrapper[29936]: I1205 13:18:15.085535 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrf79\" (UniqueName: \"kubernetes.io/projected/bc5208bc-09f1-4efc-b293-3f989ca1b97b-kube-api-access-nrf79\") pod \"redhat-marketplace-s5pf8\" (UID: \"bc5208bc-09f1-4efc-b293-3f989ca1b97b\") " pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:15.286322 master-0 kubenswrapper[29936]: I1205 13:18:15.286137 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:15.783201 master-0 kubenswrapper[29936]: I1205 13:18:15.782991 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s5pf8"] Dec 05 13:18:15.795660 master-0 kubenswrapper[29936]: W1205 13:18:15.795572 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc5208bc_09f1_4efc_b293_3f989ca1b97b.slice/crio-a9c5e54e5afc69944d8b7eea7c0feca21adcb73c4fbd78ba49da00480c7f8ada WatchSource:0}: Error finding container a9c5e54e5afc69944d8b7eea7c0feca21adcb73c4fbd78ba49da00480c7f8ada: Status 404 returned error can't find the container with id a9c5e54e5afc69944d8b7eea7c0feca21adcb73c4fbd78ba49da00480c7f8ada Dec 05 13:18:16.035510 master-0 kubenswrapper[29936]: I1205 13:18:16.035312 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s5pf8" event={"ID":"bc5208bc-09f1-4efc-b293-3f989ca1b97b","Type":"ContainerStarted","Data":"a9c5e54e5afc69944d8b7eea7c0feca21adcb73c4fbd78ba49da00480c7f8ada"} Dec 05 13:18:17.050419 master-0 kubenswrapper[29936]: I1205 13:18:17.050348 29936 generic.go:334] "Generic (PLEG): container finished" podID="bc5208bc-09f1-4efc-b293-3f989ca1b97b" containerID="7b1a259216ef7ea1e8da35814594a23e332faa1194949b0e73634e6c0055bf36" exitCode=0 Dec 05 13:18:17.050419 master-0 kubenswrapper[29936]: I1205 13:18:17.050407 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s5pf8" event={"ID":"bc5208bc-09f1-4efc-b293-3f989ca1b97b","Type":"ContainerDied","Data":"7b1a259216ef7ea1e8da35814594a23e332faa1194949b0e73634e6c0055bf36"} Dec 05 13:18:17.052378 master-0 kubenswrapper[29936]: I1205 13:18:17.052343 29936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 13:18:18.473009 master-0 kubenswrapper[29936]: I1205 13:18:18.472891 29936 scope.go:117] "RemoveContainer" containerID="85250c2847b34fb5877f76749df89d7236a0c4712b66a06086954f5d107a490f" Dec 05 13:18:18.504672 master-0 kubenswrapper[29936]: I1205 13:18:18.504616 29936 scope.go:117] "RemoveContainer" containerID="e9fd626327750923d02fea0a7f9307cf165b248f43281931e05023f17de4dfed" Dec 05 13:18:18.566220 master-0 kubenswrapper[29936]: I1205 13:18:18.566133 29936 scope.go:117] "RemoveContainer" containerID="d85dbcc812d584e3a0a4ecf780f8f03295299680afe6ef9c172753a77e815f75" Dec 05 13:18:18.627783 master-0 kubenswrapper[29936]: I1205 13:18:18.627726 29936 scope.go:117] "RemoveContainer" containerID="ab0b4a7481da1a51a4001eaa1507a74ccee97a6a809b0bf807d5a3d863b56523" Dec 05 13:18:18.700546 master-0 kubenswrapper[29936]: I1205 13:18:18.698305 29936 scope.go:117] "RemoveContainer" containerID="e7c38d0df0df18cd7211f2a820ebdcdadb9f5f3b626e516a401a0a7252c4cd9e" Dec 05 13:18:18.743030 master-0 kubenswrapper[29936]: I1205 
13:18:18.742956 29936 scope.go:117] "RemoveContainer" containerID="e86e7007a55ed024279e5d4b913060ea7f3399573acd4d84012481b69b00aef8" Dec 05 13:18:18.783534 master-0 kubenswrapper[29936]: I1205 13:18:18.783403 29936 scope.go:117] "RemoveContainer" containerID="2210c31023e83097945b619fe81b1a41648b8e0f5c97d20c0d6fd95a1eb5d0a3" Dec 05 13:18:19.082290 master-0 kubenswrapper[29936]: I1205 13:18:19.082210 29936 generic.go:334] "Generic (PLEG): container finished" podID="bc5208bc-09f1-4efc-b293-3f989ca1b97b" containerID="6711a8c9c14073b4e55aaec0923a98b6adc6ab6e2014bdeff8b6ede2f9a6845a" exitCode=0 Dec 05 13:18:19.082599 master-0 kubenswrapper[29936]: I1205 13:18:19.082314 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s5pf8" event={"ID":"bc5208bc-09f1-4efc-b293-3f989ca1b97b","Type":"ContainerDied","Data":"6711a8c9c14073b4e55aaec0923a98b6adc6ab6e2014bdeff8b6ede2f9a6845a"} Dec 05 13:18:20.110435 master-0 kubenswrapper[29936]: I1205 13:18:20.110220 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s5pf8" event={"ID":"bc5208bc-09f1-4efc-b293-3f989ca1b97b","Type":"ContainerStarted","Data":"148e08d869f41d27a7b0794d7a407a48c4f7926a1f1e81330b478c38ed5a1b5a"} Dec 05 13:18:20.170832 master-0 kubenswrapper[29936]: I1205 13:18:20.170691 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s5pf8" podStartSLOduration=3.484327631 podStartE2EDuration="6.170657372s" podCreationTimestamp="2025-12-05 13:18:14 +0000 UTC" firstStartedPulling="2025-12-05 13:18:17.052290543 +0000 UTC m=+1694.184370224" lastFinishedPulling="2025-12-05 13:18:19.738620284 +0000 UTC m=+1696.870699965" observedRunningTime="2025-12-05 13:18:20.153653769 +0000 UTC m=+1697.285733470" watchObservedRunningTime="2025-12-05 13:18:20.170657372 +0000 UTC m=+1697.302737053" Dec 05 13:18:25.287585 master-0 kubenswrapper[29936]: I1205 13:18:25.287403 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:25.287585 master-0 kubenswrapper[29936]: I1205 13:18:25.287475 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:25.343056 master-0 kubenswrapper[29936]: I1205 13:18:25.343002 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:26.241611 master-0 kubenswrapper[29936]: I1205 13:18:26.241475 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:26.311823 master-0 kubenswrapper[29936]: I1205 13:18:26.311741 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s5pf8"] Dec 05 13:18:28.217531 master-0 kubenswrapper[29936]: I1205 13:18:28.217394 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s5pf8" podUID="bc5208bc-09f1-4efc-b293-3f989ca1b97b" containerName="registry-server" containerID="cri-o://148e08d869f41d27a7b0794d7a407a48c4f7926a1f1e81330b478c38ed5a1b5a" gracePeriod=2 Dec 05 13:18:28.857231 master-0 kubenswrapper[29936]: I1205 13:18:28.857146 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:28.903195 master-0 kubenswrapper[29936]: I1205 13:18:28.903023 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrf79\" (UniqueName: \"kubernetes.io/projected/bc5208bc-09f1-4efc-b293-3f989ca1b97b-kube-api-access-nrf79\") pod \"bc5208bc-09f1-4efc-b293-3f989ca1b97b\" (UID: \"bc5208bc-09f1-4efc-b293-3f989ca1b97b\") " Dec 05 13:18:28.903482 master-0 kubenswrapper[29936]: I1205 13:18:28.903251 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc5208bc-09f1-4efc-b293-3f989ca1b97b-utilities\") pod \"bc5208bc-09f1-4efc-b293-3f989ca1b97b\" (UID: \"bc5208bc-09f1-4efc-b293-3f989ca1b97b\") " Dec 05 13:18:28.903820 master-0 kubenswrapper[29936]: I1205 13:18:28.903784 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc5208bc-09f1-4efc-b293-3f989ca1b97b-catalog-content\") pod \"bc5208bc-09f1-4efc-b293-3f989ca1b97b\" (UID: \"bc5208bc-09f1-4efc-b293-3f989ca1b97b\") " Dec 05 13:18:28.904030 master-0 kubenswrapper[29936]: I1205 13:18:28.903977 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5208bc-09f1-4efc-b293-3f989ca1b97b-utilities" (OuterVolumeSpecName: "utilities") pod "bc5208bc-09f1-4efc-b293-3f989ca1b97b" (UID: "bc5208bc-09f1-4efc-b293-3f989ca1b97b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:18:28.904666 master-0 kubenswrapper[29936]: I1205 13:18:28.904631 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc5208bc-09f1-4efc-b293-3f989ca1b97b-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 13:18:28.905965 master-0 kubenswrapper[29936]: I1205 13:18:28.905923 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5208bc-09f1-4efc-b293-3f989ca1b97b-kube-api-access-nrf79" (OuterVolumeSpecName: "kube-api-access-nrf79") pod "bc5208bc-09f1-4efc-b293-3f989ca1b97b" (UID: "bc5208bc-09f1-4efc-b293-3f989ca1b97b"). InnerVolumeSpecName "kube-api-access-nrf79". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:18:28.922428 master-0 kubenswrapper[29936]: I1205 13:18:28.922384 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5208bc-09f1-4efc-b293-3f989ca1b97b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc5208bc-09f1-4efc-b293-3f989ca1b97b" (UID: "bc5208bc-09f1-4efc-b293-3f989ca1b97b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:18:29.006779 master-0 kubenswrapper[29936]: I1205 13:18:29.006595 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc5208bc-09f1-4efc-b293-3f989ca1b97b-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 13:18:29.006779 master-0 kubenswrapper[29936]: I1205 13:18:29.006659 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrf79\" (UniqueName: \"kubernetes.io/projected/bc5208bc-09f1-4efc-b293-3f989ca1b97b-kube-api-access-nrf79\") on node \"master-0\" DevicePath \"\"" Dec 05 13:18:29.242140 master-0 kubenswrapper[29936]: I1205 13:18:29.242016 29936 generic.go:334] "Generic (PLEG): container finished" podID="bc5208bc-09f1-4efc-b293-3f989ca1b97b" containerID="148e08d869f41d27a7b0794d7a407a48c4f7926a1f1e81330b478c38ed5a1b5a" exitCode=0 Dec 05 13:18:29.242140 master-0 kubenswrapper[29936]: I1205 13:18:29.242074 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s5pf8" event={"ID":"bc5208bc-09f1-4efc-b293-3f989ca1b97b","Type":"ContainerDied","Data":"148e08d869f41d27a7b0794d7a407a48c4f7926a1f1e81330b478c38ed5a1b5a"} Dec 05 13:18:29.242140 master-0 kubenswrapper[29936]: I1205 13:18:29.242157 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s5pf8" event={"ID":"bc5208bc-09f1-4efc-b293-3f989ca1b97b","Type":"ContainerDied","Data":"a9c5e54e5afc69944d8b7eea7c0feca21adcb73c4fbd78ba49da00480c7f8ada"} Dec 05 13:18:29.242987 master-0 kubenswrapper[29936]: I1205 13:18:29.242101 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s5pf8" Dec 05 13:18:29.242987 master-0 kubenswrapper[29936]: I1205 13:18:29.242215 29936 scope.go:117] "RemoveContainer" containerID="148e08d869f41d27a7b0794d7a407a48c4f7926a1f1e81330b478c38ed5a1b5a" Dec 05 13:18:29.291718 master-0 kubenswrapper[29936]: I1205 13:18:29.291169 29936 scope.go:117] "RemoveContainer" containerID="6711a8c9c14073b4e55aaec0923a98b6adc6ab6e2014bdeff8b6ede2f9a6845a" Dec 05 13:18:29.292154 master-0 kubenswrapper[29936]: I1205 13:18:29.291979 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s5pf8"] Dec 05 13:18:29.307146 master-0 kubenswrapper[29936]: I1205 13:18:29.307042 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s5pf8"] Dec 05 13:18:29.322930 master-0 kubenswrapper[29936]: I1205 13:18:29.322836 29936 scope.go:117] "RemoveContainer" containerID="7b1a259216ef7ea1e8da35814594a23e332faa1194949b0e73634e6c0055bf36" Dec 05 13:18:29.380209 master-0 kubenswrapper[29936]: I1205 13:18:29.380133 29936 scope.go:117] "RemoveContainer" containerID="148e08d869f41d27a7b0794d7a407a48c4f7926a1f1e81330b478c38ed5a1b5a" Dec 05 13:18:29.381929 master-0 kubenswrapper[29936]: E1205 13:18:29.381868 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"148e08d869f41d27a7b0794d7a407a48c4f7926a1f1e81330b478c38ed5a1b5a\": container with ID starting with 148e08d869f41d27a7b0794d7a407a48c4f7926a1f1e81330b478c38ed5a1b5a not found: ID does not exist" containerID="148e08d869f41d27a7b0794d7a407a48c4f7926a1f1e81330b478c38ed5a1b5a" Dec 05 13:18:29.381929 master-0 kubenswrapper[29936]: I1205 13:18:29.381922 29936 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"148e08d869f41d27a7b0794d7a407a48c4f7926a1f1e81330b478c38ed5a1b5a"} err="failed to get container status \"148e08d869f41d27a7b0794d7a407a48c4f7926a1f1e81330b478c38ed5a1b5a\": rpc error: code = NotFound desc = could not find container \"148e08d869f41d27a7b0794d7a407a48c4f7926a1f1e81330b478c38ed5a1b5a\": container with ID starting with 148e08d869f41d27a7b0794d7a407a48c4f7926a1f1e81330b478c38ed5a1b5a not found: ID does not exist" Dec 05 13:18:29.381929 master-0 kubenswrapper[29936]: I1205 13:18:29.381952 29936 scope.go:117] "RemoveContainer" containerID="6711a8c9c14073b4e55aaec0923a98b6adc6ab6e2014bdeff8b6ede2f9a6845a" Dec 05 13:18:29.382905 master-0 kubenswrapper[29936]: E1205 13:18:29.382864 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6711a8c9c14073b4e55aaec0923a98b6adc6ab6e2014bdeff8b6ede2f9a6845a\": container with ID starting with 6711a8c9c14073b4e55aaec0923a98b6adc6ab6e2014bdeff8b6ede2f9a6845a not found: ID does not exist" containerID="6711a8c9c14073b4e55aaec0923a98b6adc6ab6e2014bdeff8b6ede2f9a6845a" Dec 05 13:18:29.382985 master-0 kubenswrapper[29936]: I1205 13:18:29.382902 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6711a8c9c14073b4e55aaec0923a98b6adc6ab6e2014bdeff8b6ede2f9a6845a"} err="failed to get container status \"6711a8c9c14073b4e55aaec0923a98b6adc6ab6e2014bdeff8b6ede2f9a6845a\": rpc error: code = NotFound desc = could not find container \"6711a8c9c14073b4e55aaec0923a98b6adc6ab6e2014bdeff8b6ede2f9a6845a\": container with ID starting with 6711a8c9c14073b4e55aaec0923a98b6adc6ab6e2014bdeff8b6ede2f9a6845a not found: ID does not exist" Dec 05 13:18:29.382985 master-0 kubenswrapper[29936]: I1205 13:18:29.382931 29936 scope.go:117] "RemoveContainer" containerID="7b1a259216ef7ea1e8da35814594a23e332faa1194949b0e73634e6c0055bf36" Dec 05 13:18:29.383540 master-0 kubenswrapper[29936]: E1205 13:18:29.383503 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b1a259216ef7ea1e8da35814594a23e332faa1194949b0e73634e6c0055bf36\": container with ID starting with 7b1a259216ef7ea1e8da35814594a23e332faa1194949b0e73634e6c0055bf36 not found: ID does not exist" containerID="7b1a259216ef7ea1e8da35814594a23e332faa1194949b0e73634e6c0055bf36" Dec 05 13:18:29.383540 master-0 kubenswrapper[29936]: I1205 13:18:29.383531 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b1a259216ef7ea1e8da35814594a23e332faa1194949b0e73634e6c0055bf36"} err="failed to get container status \"7b1a259216ef7ea1e8da35814594a23e332faa1194949b0e73634e6c0055bf36\": rpc error: code = NotFound desc = could not find container \"7b1a259216ef7ea1e8da35814594a23e332faa1194949b0e73634e6c0055bf36\": container with ID starting with 7b1a259216ef7ea1e8da35814594a23e332faa1194949b0e73634e6c0055bf36 not found: ID does not exist" Dec 05 13:18:31.201040 master-0 kubenswrapper[29936]: I1205 13:18:31.200954 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5208bc-09f1-4efc-b293-3f989ca1b97b" path="/var/lib/kubelet/pods/bc5208bc-09f1-4efc-b293-3f989ca1b97b/volumes" Dec 05 13:18:36.056597 master-0 kubenswrapper[29936]: I1205 13:18:36.056490 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qjcct"] Dec 05 13:18:36.072536 master-0 kubenswrapper[29936]: I1205 
13:18:36.072430 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qjcct"] Dec 05 13:18:37.202417 master-0 kubenswrapper[29936]: I1205 13:18:37.202283 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e234efb-4fe5-4c70-a992-4bfb95d2fc4c" path="/var/lib/kubelet/pods/6e234efb-4fe5-4c70-a992-4bfb95d2fc4c/volumes" Dec 05 13:19:05.891755 master-0 kubenswrapper[29936]: I1205 13:19:05.890662 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-bmgf2"] Dec 05 13:19:05.901627 master-0 kubenswrapper[29936]: I1205 13:19:05.901542 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-bmgf2"] Dec 05 13:19:07.202253 master-0 kubenswrapper[29936]: I1205 13:19:07.202175 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ebb85b0-cfc1-4d5b-af96-01798e97b809" path="/var/lib/kubelet/pods/3ebb85b0-cfc1-4d5b-af96-01798e97b809/volumes" Dec 05 13:19:14.057540 master-0 kubenswrapper[29936]: I1205 13:19:14.057427 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xjmxk"] Dec 05 13:19:14.076280 master-0 kubenswrapper[29936]: I1205 13:19:14.075631 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xjmxk"] Dec 05 13:19:15.203218 master-0 kubenswrapper[29936]: I1205 13:19:15.203138 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbe106b9-254b-49b9-98dd-b1cfd4697210" path="/var/lib/kubelet/pods/fbe106b9-254b-49b9-98dd-b1cfd4697210/volumes" Dec 05 13:19:18.983935 master-0 kubenswrapper[29936]: I1205 13:19:18.983827 29936 scope.go:117] "RemoveContainer" containerID="f0c072fdde7ca0593dd0758f937ba470cfca639b521f4b345cda700caba5363f" Dec 05 13:19:19.037701 master-0 kubenswrapper[29936]: I1205 13:19:19.037631 29936 scope.go:117] "RemoveContainer" containerID="660f5b16f43c93494f587bab2753f20ba1d3fa89ccf34c2349bf9f0651d2848e" Dec 05 13:19:19.096443 master-0 kubenswrapper[29936]: I1205 13:19:19.096311 29936 scope.go:117] "RemoveContainer" containerID="76b7953df4c4e4e5ab492f23b5832f9e96dfc878f6c084b9ba459cac0ec279f3" Dec 05 13:19:48.067981 master-0 kubenswrapper[29936]: I1205 13:19:48.067555 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-host-discover-2pvkn"] Dec 05 13:19:48.104039 master-0 kubenswrapper[29936]: I1205 13:19:48.103910 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-host-discover-2pvkn"] Dec 05 13:19:49.204818 master-0 kubenswrapper[29936]: I1205 13:19:49.204741 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49" path="/var/lib/kubelet/pods/dc8e7dc4-e072-47f4-8b71-b9e3c6afdd49/volumes" Dec 05 13:19:51.040396 master-0 kubenswrapper[29936]: I1205 13:19:51.040314 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-dhdsz"] Dec 05 13:19:51.055401 master-0 kubenswrapper[29936]: I1205 13:19:51.054962 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-dhdsz"] Dec 05 13:19:51.203254 master-0 kubenswrapper[29936]: I1205 13:19:51.203135 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15176a26-f0f3-4bd2-a9b2-6f450e107ae1" path="/var/lib/kubelet/pods/15176a26-f0f3-4bd2-a9b2-6f450e107ae1/volumes" Dec 05 13:19:57.993233 master-0 kubenswrapper[29936]: I1205 13:19:57.993113 29936 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c5tfv"] Dec 05 13:19:57.994086 master-0 kubenswrapper[29936]: E1205 13:19:57.993852 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc5208bc-09f1-4efc-b293-3f989ca1b97b" containerName="registry-server" Dec 05 13:19:57.994086 master-0 kubenswrapper[29936]: I1205 13:19:57.993873 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5208bc-09f1-4efc-b293-3f989ca1b97b" containerName="registry-server" Dec 05 13:19:57.994086 master-0 kubenswrapper[29936]: E1205 13:19:57.993931 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc5208bc-09f1-4efc-b293-3f989ca1b97b" containerName="extract-content" Dec 05 13:19:57.994086 master-0 kubenswrapper[29936]: I1205 13:19:57.993938 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5208bc-09f1-4efc-b293-3f989ca1b97b" containerName="extract-content" Dec 05 13:19:57.994086 master-0 kubenswrapper[29936]: E1205 13:19:57.993969 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc5208bc-09f1-4efc-b293-3f989ca1b97b" containerName="extract-utilities" Dec 05 13:19:57.994086 master-0 kubenswrapper[29936]: I1205 13:19:57.993976 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5208bc-09f1-4efc-b293-3f989ca1b97b" containerName="extract-utilities" Dec 05 13:19:57.994409 master-0 kubenswrapper[29936]: I1205 13:19:57.994342 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc5208bc-09f1-4efc-b293-3f989ca1b97b" containerName="registry-server" Dec 05 13:19:58.007270 master-0 kubenswrapper[29936]: I1205 13:19:58.006138 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:19:58.015217 master-0 kubenswrapper[29936]: I1205 13:19:58.012323 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c5tfv"] Dec 05 13:19:58.094223 master-0 kubenswrapper[29936]: I1205 13:19:58.093681 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-utilities\") pod \"certified-operators-c5tfv\" (UID: \"63f0a543-5c3e-45c5-aa43-33f7e3addbe2\") " pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:19:58.094223 master-0 kubenswrapper[29936]: I1205 13:19:58.094013 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zj54\" (UniqueName: \"kubernetes.io/projected/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-kube-api-access-7zj54\") pod \"certified-operators-c5tfv\" (UID: \"63f0a543-5c3e-45c5-aa43-33f7e3addbe2\") " pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:19:58.094223 master-0 kubenswrapper[29936]: I1205 13:19:58.094119 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-catalog-content\") pod \"certified-operators-c5tfv\" (UID: \"63f0a543-5c3e-45c5-aa43-33f7e3addbe2\") " pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:19:58.198220 master-0 kubenswrapper[29936]: I1205 13:19:58.197934 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-catalog-content\") 
pod \"certified-operators-c5tfv\" (UID: \"63f0a543-5c3e-45c5-aa43-33f7e3addbe2\") " pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:19:58.198506 master-0 kubenswrapper[29936]: I1205 13:19:58.198270 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-utilities\") pod \"certified-operators-c5tfv\" (UID: \"63f0a543-5c3e-45c5-aa43-33f7e3addbe2\") " pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:19:58.198506 master-0 kubenswrapper[29936]: I1205 13:19:58.198330 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zj54\" (UniqueName: \"kubernetes.io/projected/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-kube-api-access-7zj54\") pod \"certified-operators-c5tfv\" (UID: \"63f0a543-5c3e-45c5-aa43-33f7e3addbe2\") " pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:19:58.200349 master-0 kubenswrapper[29936]: I1205 13:19:58.199431 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-catalog-content\") pod \"certified-operators-c5tfv\" (UID: \"63f0a543-5c3e-45c5-aa43-33f7e3addbe2\") " pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:19:58.201235 master-0 kubenswrapper[29936]: I1205 13:19:58.201140 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-utilities\") pod \"certified-operators-c5tfv\" (UID: \"63f0a543-5c3e-45c5-aa43-33f7e3addbe2\") " pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:19:58.216228 master-0 kubenswrapper[29936]: I1205 13:19:58.216025 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zj54\" (UniqueName: \"kubernetes.io/projected/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-kube-api-access-7zj54\") pod \"certified-operators-c5tfv\" (UID: \"63f0a543-5c3e-45c5-aa43-33f7e3addbe2\") " pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:19:58.360212 master-0 kubenswrapper[29936]: I1205 13:19:58.359896 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:19:58.960041 master-0 kubenswrapper[29936]: I1205 13:19:58.959938 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c5tfv"] Dec 05 13:19:59.000256 master-0 kubenswrapper[29936]: W1205 13:19:58.998827 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63f0a543_5c3e_45c5_aa43_33f7e3addbe2.slice/crio-c6826e363dba8d497e7f7751f9f31d06eaa09981546832227a34846cf958f123 WatchSource:0}: Error finding container c6826e363dba8d497e7f7751f9f31d06eaa09981546832227a34846cf958f123: Status 404 returned error can't find the container with id c6826e363dba8d497e7f7751f9f31d06eaa09981546832227a34846cf958f123 Dec 05 13:19:59.633098 master-0 kubenswrapper[29936]: I1205 13:19:59.632897 29936 generic.go:334] "Generic (PLEG): container finished" podID="63f0a543-5c3e-45c5-aa43-33f7e3addbe2" containerID="fa3e7e3e53b4f96d0511b49cb48556b4de414d205c46252024db2ba6531b9f8e" exitCode=0 Dec 05 13:19:59.633098 master-0 kubenswrapper[29936]: I1205 13:19:59.632961 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5tfv" event={"ID":"63f0a543-5c3e-45c5-aa43-33f7e3addbe2","Type":"ContainerDied","Data":"fa3e7e3e53b4f96d0511b49cb48556b4de414d205c46252024db2ba6531b9f8e"} Dec 05 13:19:59.633098 master-0 kubenswrapper[29936]: I1205 13:19:59.632995 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5tfv" event={"ID":"63f0a543-5c3e-45c5-aa43-33f7e3addbe2","Type":"ContainerStarted","Data":"c6826e363dba8d497e7f7751f9f31d06eaa09981546832227a34846cf958f123"} Dec 05 13:20:00.647901 master-0 kubenswrapper[29936]: I1205 13:20:00.647742 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5tfv" event={"ID":"63f0a543-5c3e-45c5-aa43-33f7e3addbe2","Type":"ContainerStarted","Data":"460e6b8687a4c5c04e105eaa796b2b4b4c3501d86bf62a773fd014d113a82745"} Dec 05 13:20:01.668792 master-0 kubenswrapper[29936]: I1205 13:20:01.668647 29936 generic.go:334] "Generic (PLEG): container finished" podID="63f0a543-5c3e-45c5-aa43-33f7e3addbe2" containerID="460e6b8687a4c5c04e105eaa796b2b4b4c3501d86bf62a773fd014d113a82745" exitCode=0 Dec 05 13:20:01.668792 master-0 kubenswrapper[29936]: I1205 13:20:01.668728 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5tfv" event={"ID":"63f0a543-5c3e-45c5-aa43-33f7e3addbe2","Type":"ContainerDied","Data":"460e6b8687a4c5c04e105eaa796b2b4b4c3501d86bf62a773fd014d113a82745"} Dec 05 13:20:02.685961 master-0 kubenswrapper[29936]: I1205 13:20:02.685818 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5tfv" event={"ID":"63f0a543-5c3e-45c5-aa43-33f7e3addbe2","Type":"ContainerStarted","Data":"07063facf675f62a3e9d09b325805e15f09b78eab1a76badd6861aef8a35afa5"} Dec 05 13:20:02.715251 master-0 kubenswrapper[29936]: I1205 13:20:02.713253 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c5tfv" podStartSLOduration=3.192559665 podStartE2EDuration="5.713227708s" podCreationTimestamp="2025-12-05 13:19:57 +0000 UTC" firstStartedPulling="2025-12-05 13:19:59.635132226 +0000 UTC m=+1796.767211917" lastFinishedPulling="2025-12-05 13:20:02.155800269 +0000 UTC m=+1799.287879960" 
observedRunningTime="2025-12-05 13:20:02.706002471 +0000 UTC m=+1799.838082182" watchObservedRunningTime="2025-12-05 13:20:02.713227708 +0000 UTC m=+1799.845307409" Dec 05 13:20:08.361405 master-0 kubenswrapper[29936]: I1205 13:20:08.361312 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:20:08.361405 master-0 kubenswrapper[29936]: I1205 13:20:08.361403 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:20:08.433351 master-0 kubenswrapper[29936]: I1205 13:20:08.433273 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:20:08.809800 master-0 kubenswrapper[29936]: I1205 13:20:08.809736 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:20:08.884537 master-0 kubenswrapper[29936]: I1205 13:20:08.884449 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c5tfv"] Dec 05 13:20:10.783837 master-0 kubenswrapper[29936]: I1205 13:20:10.783741 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c5tfv" podUID="63f0a543-5c3e-45c5-aa43-33f7e3addbe2" containerName="registry-server" containerID="cri-o://07063facf675f62a3e9d09b325805e15f09b78eab1a76badd6861aef8a35afa5" gracePeriod=2 Dec 05 13:20:11.806831 master-0 kubenswrapper[29936]: I1205 13:20:11.806760 29936 generic.go:334] "Generic (PLEG): container finished" podID="63f0a543-5c3e-45c5-aa43-33f7e3addbe2" containerID="07063facf675f62a3e9d09b325805e15f09b78eab1a76badd6861aef8a35afa5" exitCode=0 Dec 05 13:20:11.806831 master-0 kubenswrapper[29936]: I1205 13:20:11.806820 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5tfv" event={"ID":"63f0a543-5c3e-45c5-aa43-33f7e3addbe2","Type":"ContainerDied","Data":"07063facf675f62a3e9d09b325805e15f09b78eab1a76badd6861aef8a35afa5"} Dec 05 13:20:12.452430 master-0 kubenswrapper[29936]: I1205 13:20:12.452344 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:20:12.608235 master-0 kubenswrapper[29936]: I1205 13:20:12.608160 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-utilities\") pod \"63f0a543-5c3e-45c5-aa43-33f7e3addbe2\" (UID: \"63f0a543-5c3e-45c5-aa43-33f7e3addbe2\") " Dec 05 13:20:12.608500 master-0 kubenswrapper[29936]: I1205 13:20:12.608390 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zj54\" (UniqueName: \"kubernetes.io/projected/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-kube-api-access-7zj54\") pod \"63f0a543-5c3e-45c5-aa43-33f7e3addbe2\" (UID: \"63f0a543-5c3e-45c5-aa43-33f7e3addbe2\") " Dec 05 13:20:12.608500 master-0 kubenswrapper[29936]: I1205 13:20:12.608467 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-catalog-content\") pod \"63f0a543-5c3e-45c5-aa43-33f7e3addbe2\" (UID: \"63f0a543-5c3e-45c5-aa43-33f7e3addbe2\") " Dec 05 13:20:12.608976 master-0 kubenswrapper[29936]: I1205 13:20:12.608943 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-utilities" (OuterVolumeSpecName: "utilities") pod "63f0a543-5c3e-45c5-aa43-33f7e3addbe2" (UID: "63f0a543-5c3e-45c5-aa43-33f7e3addbe2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:20:12.609388 master-0 kubenswrapper[29936]: I1205 13:20:12.609358 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 13:20:12.612844 master-0 kubenswrapper[29936]: I1205 13:20:12.612695 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-kube-api-access-7zj54" (OuterVolumeSpecName: "kube-api-access-7zj54") pod "63f0a543-5c3e-45c5-aa43-33f7e3addbe2" (UID: "63f0a543-5c3e-45c5-aa43-33f7e3addbe2"). InnerVolumeSpecName "kube-api-access-7zj54". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:20:12.661349 master-0 kubenswrapper[29936]: I1205 13:20:12.661132 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63f0a543-5c3e-45c5-aa43-33f7e3addbe2" (UID: "63f0a543-5c3e-45c5-aa43-33f7e3addbe2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:20:12.711120 master-0 kubenswrapper[29936]: I1205 13:20:12.711036 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zj54\" (UniqueName: \"kubernetes.io/projected/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-kube-api-access-7zj54\") on node \"master-0\" DevicePath \"\"" Dec 05 13:20:12.712622 master-0 kubenswrapper[29936]: I1205 13:20:12.712564 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63f0a543-5c3e-45c5-aa43-33f7e3addbe2-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 13:20:12.824344 master-0 kubenswrapper[29936]: I1205 13:20:12.824235 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c5tfv" event={"ID":"63f0a543-5c3e-45c5-aa43-33f7e3addbe2","Type":"ContainerDied","Data":"c6826e363dba8d497e7f7751f9f31d06eaa09981546832227a34846cf958f123"} Dec 05 13:20:12.824344 master-0 kubenswrapper[29936]: I1205 13:20:12.824315 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c5tfv" Dec 05 13:20:12.825229 master-0 kubenswrapper[29936]: I1205 13:20:12.824367 29936 scope.go:117] "RemoveContainer" containerID="07063facf675f62a3e9d09b325805e15f09b78eab1a76badd6861aef8a35afa5" Dec 05 13:20:12.856056 master-0 kubenswrapper[29936]: I1205 13:20:12.855566 29936 scope.go:117] "RemoveContainer" containerID="460e6b8687a4c5c04e105eaa796b2b4b4c3501d86bf62a773fd014d113a82745" Dec 05 13:20:12.880419 master-0 kubenswrapper[29936]: I1205 13:20:12.880345 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c5tfv"] Dec 05 13:20:12.897272 master-0 kubenswrapper[29936]: I1205 13:20:12.897166 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c5tfv"] Dec 05 13:20:12.899549 master-0 kubenswrapper[29936]: I1205 13:20:12.899514 29936 scope.go:117] "RemoveContainer" containerID="fa3e7e3e53b4f96d0511b49cb48556b4de414d205c46252024db2ba6531b9f8e" Dec 05 13:20:13.204143 master-0 kubenswrapper[29936]: I1205 13:20:13.204077 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63f0a543-5c3e-45c5-aa43-33f7e3addbe2" path="/var/lib/kubelet/pods/63f0a543-5c3e-45c5-aa43-33f7e3addbe2/volumes" Dec 05 13:20:19.260542 master-0 kubenswrapper[29936]: I1205 13:20:19.260404 29936 scope.go:117] "RemoveContainer" containerID="e24a4fcbcff7a0386a8428428e036961f43ecd0e8e57014ff38a86de77a19507" Dec 05 13:20:19.314368 master-0 kubenswrapper[29936]: I1205 13:20:19.314288 29936 scope.go:117] "RemoveContainer" containerID="0cce25521ec0474b3003ac709ac458cbdc007bb277128c3f5ecd9ce546329141" Dec 05 13:20:22.558282 master-0 kubenswrapper[29936]: I1205 13:20:22.554520 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sgsln"] Dec 05 13:20:22.558282 master-0 kubenswrapper[29936]: E1205 13:20:22.555127 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f0a543-5c3e-45c5-aa43-33f7e3addbe2" containerName="extract-utilities" Dec 05 13:20:22.558282 master-0 kubenswrapper[29936]: I1205 13:20:22.555144 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f0a543-5c3e-45c5-aa43-33f7e3addbe2" containerName="extract-utilities" Dec 05 13:20:22.558282 master-0 kubenswrapper[29936]: E1205 13:20:22.555218 29936 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="63f0a543-5c3e-45c5-aa43-33f7e3addbe2" containerName="registry-server" Dec 05 13:20:22.558282 master-0 kubenswrapper[29936]: I1205 13:20:22.555228 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f0a543-5c3e-45c5-aa43-33f7e3addbe2" containerName="registry-server" Dec 05 13:20:22.558282 master-0 kubenswrapper[29936]: E1205 13:20:22.555264 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f0a543-5c3e-45c5-aa43-33f7e3addbe2" containerName="extract-content" Dec 05 13:20:22.558282 master-0 kubenswrapper[29936]: I1205 13:20:22.555272 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f0a543-5c3e-45c5-aa43-33f7e3addbe2" containerName="extract-content" Dec 05 13:20:22.558282 master-0 kubenswrapper[29936]: I1205 13:20:22.555524 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f0a543-5c3e-45c5-aa43-33f7e3addbe2" containerName="registry-server" Dec 05 13:20:22.559129 master-0 kubenswrapper[29936]: I1205 13:20:22.559093 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:22.627789 master-0 kubenswrapper[29936]: I1205 13:20:22.627718 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sgsln"] Dec 05 13:20:22.685696 master-0 kubenswrapper[29936]: I1205 13:20:22.685602 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc37800-10d8-4634-a1f3-dff200dbf986-utilities\") pod \"community-operators-sgsln\" (UID: \"9cc37800-10d8-4634-a1f3-dff200dbf986\") " pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:22.686035 master-0 kubenswrapper[29936]: I1205 13:20:22.685791 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc37800-10d8-4634-a1f3-dff200dbf986-catalog-content\") pod \"community-operators-sgsln\" (UID: \"9cc37800-10d8-4634-a1f3-dff200dbf986\") " pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:22.686206 master-0 kubenswrapper[29936]: I1205 13:20:22.686155 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7crq\" (UniqueName: \"kubernetes.io/projected/9cc37800-10d8-4634-a1f3-dff200dbf986-kube-api-access-c7crq\") pod \"community-operators-sgsln\" (UID: \"9cc37800-10d8-4634-a1f3-dff200dbf986\") " pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:22.795560 master-0 kubenswrapper[29936]: I1205 13:20:22.795456 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc37800-10d8-4634-a1f3-dff200dbf986-utilities\") pod \"community-operators-sgsln\" (UID: \"9cc37800-10d8-4634-a1f3-dff200dbf986\") " pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:22.795560 master-0 kubenswrapper[29936]: I1205 13:20:22.795546 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc37800-10d8-4634-a1f3-dff200dbf986-catalog-content\") pod \"community-operators-sgsln\" (UID: \"9cc37800-10d8-4634-a1f3-dff200dbf986\") " pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:22.795906 master-0 kubenswrapper[29936]: I1205 13:20:22.795862 29936 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c7crq\" (UniqueName: \"kubernetes.io/projected/9cc37800-10d8-4634-a1f3-dff200dbf986-kube-api-access-c7crq\") pod \"community-operators-sgsln\" (UID: \"9cc37800-10d8-4634-a1f3-dff200dbf986\") " pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:22.796263 master-0 kubenswrapper[29936]: I1205 13:20:22.796147 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc37800-10d8-4634-a1f3-dff200dbf986-utilities\") pod \"community-operators-sgsln\" (UID: \"9cc37800-10d8-4634-a1f3-dff200dbf986\") " pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:22.796263 master-0 kubenswrapper[29936]: I1205 13:20:22.796147 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc37800-10d8-4634-a1f3-dff200dbf986-catalog-content\") pod \"community-operators-sgsln\" (UID: \"9cc37800-10d8-4634-a1f3-dff200dbf986\") " pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:22.814723 master-0 kubenswrapper[29936]: I1205 13:20:22.814422 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7crq\" (UniqueName: \"kubernetes.io/projected/9cc37800-10d8-4634-a1f3-dff200dbf986-kube-api-access-c7crq\") pod \"community-operators-sgsln\" (UID: \"9cc37800-10d8-4634-a1f3-dff200dbf986\") " pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:22.884333 master-0 kubenswrapper[29936]: I1205 13:20:22.884157 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:23.446802 master-0 kubenswrapper[29936]: W1205 13:20:23.446718 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cc37800_10d8_4634_a1f3_dff200dbf986.slice/crio-1a16490f0a22053aa263a23aad445fc5da6b631087898fc7d91c7f7f413312d8 WatchSource:0}: Error finding container 1a16490f0a22053aa263a23aad445fc5da6b631087898fc7d91c7f7f413312d8: Status 404 returned error can't find the container with id 1a16490f0a22053aa263a23aad445fc5da6b631087898fc7d91c7f7f413312d8 Dec 05 13:20:23.450721 master-0 kubenswrapper[29936]: I1205 13:20:23.450655 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sgsln"] Dec 05 13:20:23.974916 master-0 kubenswrapper[29936]: I1205 13:20:23.974708 29936 generic.go:334] "Generic (PLEG): container finished" podID="9cc37800-10d8-4634-a1f3-dff200dbf986" containerID="7eb46c15ca5e58c0f30883fd1d2af5f7daa07d5bc0f0559275409884d7b3d228" exitCode=0 Dec 05 13:20:23.974916 master-0 kubenswrapper[29936]: I1205 13:20:23.974818 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sgsln" event={"ID":"9cc37800-10d8-4634-a1f3-dff200dbf986","Type":"ContainerDied","Data":"7eb46c15ca5e58c0f30883fd1d2af5f7daa07d5bc0f0559275409884d7b3d228"} Dec 05 13:20:23.975668 master-0 kubenswrapper[29936]: I1205 13:20:23.974925 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sgsln" event={"ID":"9cc37800-10d8-4634-a1f3-dff200dbf986","Type":"ContainerStarted","Data":"1a16490f0a22053aa263a23aad445fc5da6b631087898fc7d91c7f7f413312d8"} Dec 05 13:20:24.997500 master-0 kubenswrapper[29936]: I1205 13:20:24.997411 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-sgsln" event={"ID":"9cc37800-10d8-4634-a1f3-dff200dbf986","Type":"ContainerStarted","Data":"122e0668cf35dab8bc0d84d91960444043461e891873979937b887fe46621df8"} Dec 05 13:20:26.016343 master-0 kubenswrapper[29936]: I1205 13:20:26.016239 29936 generic.go:334] "Generic (PLEG): container finished" podID="9cc37800-10d8-4634-a1f3-dff200dbf986" containerID="122e0668cf35dab8bc0d84d91960444043461e891873979937b887fe46621df8" exitCode=0 Dec 05 13:20:26.016343 master-0 kubenswrapper[29936]: I1205 13:20:26.016339 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sgsln" event={"ID":"9cc37800-10d8-4634-a1f3-dff200dbf986","Type":"ContainerDied","Data":"122e0668cf35dab8bc0d84d91960444043461e891873979937b887fe46621df8"} Dec 05 13:20:27.032524 master-0 kubenswrapper[29936]: I1205 13:20:27.032445 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sgsln" event={"ID":"9cc37800-10d8-4634-a1f3-dff200dbf986","Type":"ContainerStarted","Data":"ae3e7cb8de2d895b12e433f067a0819959fc4edd5cc1b6b7473c56b5ee07ebda"} Dec 05 13:20:27.082235 master-0 kubenswrapper[29936]: I1205 13:20:27.082061 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sgsln" podStartSLOduration=2.642970038 podStartE2EDuration="5.082037755s" podCreationTimestamp="2025-12-05 13:20:22 +0000 UTC" firstStartedPulling="2025-12-05 13:20:23.977227755 +0000 UTC m=+1821.109307436" lastFinishedPulling="2025-12-05 13:20:26.416295462 +0000 UTC m=+1823.548375153" observedRunningTime="2025-12-05 13:20:27.065975327 +0000 UTC m=+1824.198055028" watchObservedRunningTime="2025-12-05 13:20:27.082037755 +0000 UTC m=+1824.214117436" Dec 05 13:20:32.885443 master-0 kubenswrapper[29936]: I1205 13:20:32.885304 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:32.885443 master-0 kubenswrapper[29936]: I1205 13:20:32.885390 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:32.936275 master-0 kubenswrapper[29936]: I1205 13:20:32.936212 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:33.225091 master-0 kubenswrapper[29936]: I1205 13:20:33.225034 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:37.253529 master-0 kubenswrapper[29936]: I1205 13:20:37.253440 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sgsln"] Dec 05 13:20:37.254350 master-0 kubenswrapper[29936]: I1205 13:20:37.253753 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sgsln" podUID="9cc37800-10d8-4634-a1f3-dff200dbf986" containerName="registry-server" containerID="cri-o://ae3e7cb8de2d895b12e433f067a0819959fc4edd5cc1b6b7473c56b5ee07ebda" gracePeriod=2 Dec 05 13:20:37.886159 master-0 kubenswrapper[29936]: I1205 13:20:37.886075 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:37.954917 master-0 kubenswrapper[29936]: I1205 13:20:37.954785 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc37800-10d8-4634-a1f3-dff200dbf986-utilities\") pod \"9cc37800-10d8-4634-a1f3-dff200dbf986\" (UID: \"9cc37800-10d8-4634-a1f3-dff200dbf986\") " Dec 05 13:20:37.954917 master-0 kubenswrapper[29936]: I1205 13:20:37.954882 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc37800-10d8-4634-a1f3-dff200dbf986-catalog-content\") pod \"9cc37800-10d8-4634-a1f3-dff200dbf986\" (UID: \"9cc37800-10d8-4634-a1f3-dff200dbf986\") " Dec 05 13:20:37.954917 master-0 kubenswrapper[29936]: I1205 13:20:37.954916 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7crq\" (UniqueName: \"kubernetes.io/projected/9cc37800-10d8-4634-a1f3-dff200dbf986-kube-api-access-c7crq\") pod \"9cc37800-10d8-4634-a1f3-dff200dbf986\" (UID: \"9cc37800-10d8-4634-a1f3-dff200dbf986\") " Dec 05 13:20:37.956225 master-0 kubenswrapper[29936]: I1205 13:20:37.956137 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cc37800-10d8-4634-a1f3-dff200dbf986-utilities" (OuterVolumeSpecName: "utilities") pod "9cc37800-10d8-4634-a1f3-dff200dbf986" (UID: "9cc37800-10d8-4634-a1f3-dff200dbf986"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:20:37.965689 master-0 kubenswrapper[29936]: I1205 13:20:37.965621 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc37800-10d8-4634-a1f3-dff200dbf986-kube-api-access-c7crq" (OuterVolumeSpecName: "kube-api-access-c7crq") pod "9cc37800-10d8-4634-a1f3-dff200dbf986" (UID: "9cc37800-10d8-4634-a1f3-dff200dbf986"). InnerVolumeSpecName "kube-api-access-c7crq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:20:38.034774 master-0 kubenswrapper[29936]: I1205 13:20:38.034710 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cc37800-10d8-4634-a1f3-dff200dbf986-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9cc37800-10d8-4634-a1f3-dff200dbf986" (UID: "9cc37800-10d8-4634-a1f3-dff200dbf986"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:20:38.058915 master-0 kubenswrapper[29936]: I1205 13:20:38.058761 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7crq\" (UniqueName: \"kubernetes.io/projected/9cc37800-10d8-4634-a1f3-dff200dbf986-kube-api-access-c7crq\") on node \"master-0\" DevicePath \"\"" Dec 05 13:20:38.058915 master-0 kubenswrapper[29936]: I1205 13:20:38.058840 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc37800-10d8-4634-a1f3-dff200dbf986-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 13:20:38.058915 master-0 kubenswrapper[29936]: I1205 13:20:38.058854 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc37800-10d8-4634-a1f3-dff200dbf986-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 13:20:38.204613 master-0 kubenswrapper[29936]: I1205 13:20:38.204522 29936 generic.go:334] "Generic (PLEG): container finished" podID="9cc37800-10d8-4634-a1f3-dff200dbf986" containerID="ae3e7cb8de2d895b12e433f067a0819959fc4edd5cc1b6b7473c56b5ee07ebda" exitCode=0 Dec 05 13:20:38.204613 master-0 kubenswrapper[29936]: I1205 13:20:38.204583 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sgsln" event={"ID":"9cc37800-10d8-4634-a1f3-dff200dbf986","Type":"ContainerDied","Data":"ae3e7cb8de2d895b12e433f067a0819959fc4edd5cc1b6b7473c56b5ee07ebda"} Dec 05 13:20:38.204613 master-0 kubenswrapper[29936]: I1205 13:20:38.204628 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sgsln" event={"ID":"9cc37800-10d8-4634-a1f3-dff200dbf986","Type":"ContainerDied","Data":"1a16490f0a22053aa263a23aad445fc5da6b631087898fc7d91c7f7f413312d8"} Dec 05 13:20:38.204992 master-0 kubenswrapper[29936]: I1205 13:20:38.204651 29936 scope.go:117] "RemoveContainer" containerID="ae3e7cb8de2d895b12e433f067a0819959fc4edd5cc1b6b7473c56b5ee07ebda" Dec 05 13:20:38.204992 master-0 kubenswrapper[29936]: I1205 13:20:38.204597 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sgsln" Dec 05 13:20:38.227457 master-0 kubenswrapper[29936]: I1205 13:20:38.227412 29936 scope.go:117] "RemoveContainer" containerID="122e0668cf35dab8bc0d84d91960444043461e891873979937b887fe46621df8" Dec 05 13:20:38.272354 master-0 kubenswrapper[29936]: I1205 13:20:38.272069 29936 scope.go:117] "RemoveContainer" containerID="7eb46c15ca5e58c0f30883fd1d2af5f7daa07d5bc0f0559275409884d7b3d228" Dec 05 13:20:38.302911 master-0 kubenswrapper[29936]: I1205 13:20:38.302851 29936 scope.go:117] "RemoveContainer" containerID="ae3e7cb8de2d895b12e433f067a0819959fc4edd5cc1b6b7473c56b5ee07ebda" Dec 05 13:20:38.303661 master-0 kubenswrapper[29936]: E1205 13:20:38.303624 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae3e7cb8de2d895b12e433f067a0819959fc4edd5cc1b6b7473c56b5ee07ebda\": container with ID starting with ae3e7cb8de2d895b12e433f067a0819959fc4edd5cc1b6b7473c56b5ee07ebda not found: ID does not exist" containerID="ae3e7cb8de2d895b12e433f067a0819959fc4edd5cc1b6b7473c56b5ee07ebda" Dec 05 13:20:38.303753 master-0 kubenswrapper[29936]: I1205 13:20:38.303661 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae3e7cb8de2d895b12e433f067a0819959fc4edd5cc1b6b7473c56b5ee07ebda"} err="failed to get container status \"ae3e7cb8de2d895b12e433f067a0819959fc4edd5cc1b6b7473c56b5ee07ebda\": rpc error: code = NotFound desc = could not find container \"ae3e7cb8de2d895b12e433f067a0819959fc4edd5cc1b6b7473c56b5ee07ebda\": container with ID starting with ae3e7cb8de2d895b12e433f067a0819959fc4edd5cc1b6b7473c56b5ee07ebda not found: ID does not exist" Dec 05 13:20:38.303753 master-0 kubenswrapper[29936]: I1205 13:20:38.303687 29936 scope.go:117] "RemoveContainer" containerID="122e0668cf35dab8bc0d84d91960444043461e891873979937b887fe46621df8" Dec 05 13:20:38.304047 master-0 kubenswrapper[29936]: E1205 13:20:38.304016 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"122e0668cf35dab8bc0d84d91960444043461e891873979937b887fe46621df8\": container with ID starting with 122e0668cf35dab8bc0d84d91960444043461e891873979937b887fe46621df8 not found: ID does not exist" containerID="122e0668cf35dab8bc0d84d91960444043461e891873979937b887fe46621df8" Dec 05 13:20:38.304047 master-0 kubenswrapper[29936]: I1205 13:20:38.304038 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"122e0668cf35dab8bc0d84d91960444043461e891873979937b887fe46621df8"} err="failed to get container status \"122e0668cf35dab8bc0d84d91960444043461e891873979937b887fe46621df8\": rpc error: code = NotFound desc = could not find container \"122e0668cf35dab8bc0d84d91960444043461e891873979937b887fe46621df8\": container with ID starting with 122e0668cf35dab8bc0d84d91960444043461e891873979937b887fe46621df8 not found: ID does not exist" Dec 05 13:20:38.304308 master-0 kubenswrapper[29936]: I1205 13:20:38.304052 29936 scope.go:117] "RemoveContainer" containerID="7eb46c15ca5e58c0f30883fd1d2af5f7daa07d5bc0f0559275409884d7b3d228" Dec 05 13:20:38.304510 master-0 kubenswrapper[29936]: E1205 13:20:38.304476 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eb46c15ca5e58c0f30883fd1d2af5f7daa07d5bc0f0559275409884d7b3d228\": container with ID starting with 
7eb46c15ca5e58c0f30883fd1d2af5f7daa07d5bc0f0559275409884d7b3d228 not found: ID does not exist" containerID="7eb46c15ca5e58c0f30883fd1d2af5f7daa07d5bc0f0559275409884d7b3d228" Dec 05 13:20:38.304510 master-0 kubenswrapper[29936]: I1205 13:20:38.304499 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb46c15ca5e58c0f30883fd1d2af5f7daa07d5bc0f0559275409884d7b3d228"} err="failed to get container status \"7eb46c15ca5e58c0f30883fd1d2af5f7daa07d5bc0f0559275409884d7b3d228\": rpc error: code = NotFound desc = could not find container \"7eb46c15ca5e58c0f30883fd1d2af5f7daa07d5bc0f0559275409884d7b3d228\": container with ID starting with 7eb46c15ca5e58c0f30883fd1d2af5f7daa07d5bc0f0559275409884d7b3d228 not found: ID does not exist" Dec 05 13:20:40.811209 master-0 kubenswrapper[29936]: I1205 13:20:40.807303 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sgsln"] Dec 05 13:20:40.829207 master-0 kubenswrapper[29936]: I1205 13:20:40.825815 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sgsln"] Dec 05 13:20:41.200892 master-0 kubenswrapper[29936]: I1205 13:20:41.200805 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cc37800-10d8-4634-a1f3-dff200dbf986" path="/var/lib/kubelet/pods/9cc37800-10d8-4634-a1f3-dff200dbf986/volumes" Dec 05 13:21:06.314964 master-0 kubenswrapper[29936]: I1205 13:21:06.314852 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rnc29"] Dec 05 13:21:06.315781 master-0 kubenswrapper[29936]: E1205 13:21:06.315653 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc37800-10d8-4634-a1f3-dff200dbf986" containerName="extract-content" Dec 05 13:21:06.315781 master-0 kubenswrapper[29936]: I1205 13:21:06.315683 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc37800-10d8-4634-a1f3-dff200dbf986" containerName="extract-content" Dec 05 13:21:06.315781 master-0 kubenswrapper[29936]: E1205 13:21:06.315733 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc37800-10d8-4634-a1f3-dff200dbf986" containerName="registry-server" Dec 05 13:21:06.315781 master-0 kubenswrapper[29936]: I1205 13:21:06.315747 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc37800-10d8-4634-a1f3-dff200dbf986" containerName="registry-server" Dec 05 13:21:06.315938 master-0 kubenswrapper[29936]: E1205 13:21:06.315815 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc37800-10d8-4634-a1f3-dff200dbf986" containerName="extract-utilities" Dec 05 13:21:06.315938 master-0 kubenswrapper[29936]: I1205 13:21:06.315830 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc37800-10d8-4634-a1f3-dff200dbf986" containerName="extract-utilities" Dec 05 13:21:06.316375 master-0 kubenswrapper[29936]: I1205 13:21:06.316338 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc37800-10d8-4634-a1f3-dff200dbf986" containerName="registry-server" Dec 05 13:21:06.319408 master-0 kubenswrapper[29936]: I1205 13:21:06.319368 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:06.356681 master-0 kubenswrapper[29936]: I1205 13:21:06.356592 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rnc29"] Dec 05 13:21:06.462539 master-0 kubenswrapper[29936]: I1205 13:21:06.462442 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-catalog-content\") pod \"redhat-operators-rnc29\" (UID: \"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd\") " pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:06.462852 master-0 kubenswrapper[29936]: I1205 13:21:06.462730 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lgwp\" (UniqueName: \"kubernetes.io/projected/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-kube-api-access-4lgwp\") pod \"redhat-operators-rnc29\" (UID: \"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd\") " pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:06.462852 master-0 kubenswrapper[29936]: I1205 13:21:06.462820 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-utilities\") pod \"redhat-operators-rnc29\" (UID: \"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd\") " pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:06.574831 master-0 kubenswrapper[29936]: I1205 13:21:06.574632 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-catalog-content\") pod \"redhat-operators-rnc29\" (UID: \"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd\") " pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:06.574831 master-0 kubenswrapper[29936]: I1205 13:21:06.574803 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lgwp\" (UniqueName: \"kubernetes.io/projected/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-kube-api-access-4lgwp\") pod \"redhat-operators-rnc29\" (UID: \"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd\") " pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:06.575208 master-0 kubenswrapper[29936]: I1205 13:21:06.574872 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-utilities\") pod \"redhat-operators-rnc29\" (UID: \"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd\") " pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:06.575960 master-0 kubenswrapper[29936]: I1205 13:21:06.575918 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-utilities\") pod \"redhat-operators-rnc29\" (UID: \"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd\") " pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:06.576216 master-0 kubenswrapper[29936]: I1205 13:21:06.576151 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-catalog-content\") pod \"redhat-operators-rnc29\" (UID: \"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd\") " pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:06.594554 
master-0 kubenswrapper[29936]: I1205 13:21:06.594481 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lgwp\" (UniqueName: \"kubernetes.io/projected/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-kube-api-access-4lgwp\") pod \"redhat-operators-rnc29\" (UID: \"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd\") " pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:06.647332 master-0 kubenswrapper[29936]: I1205 13:21:06.647252 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:07.217138 master-0 kubenswrapper[29936]: I1205 13:21:07.215875 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rnc29"] Dec 05 13:21:07.221159 master-0 kubenswrapper[29936]: W1205 13:21:07.221115 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9ec16f2_c53a_4611_a1d1_7d3e9718ecbd.slice/crio-af79ff7fcf5717af168de74f4e82b96a3347d8521eba5b2752a65c391810508a WatchSource:0}: Error finding container af79ff7fcf5717af168de74f4e82b96a3347d8521eba5b2752a65c391810508a: Status 404 returned error can't find the container with id af79ff7fcf5717af168de74f4e82b96a3347d8521eba5b2752a65c391810508a Dec 05 13:21:07.620848 master-0 kubenswrapper[29936]: I1205 13:21:07.620749 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnc29" event={"ID":"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd","Type":"ContainerDied","Data":"5c07ce980e9924dc73fbdef7eeee14e1c57177a16a8f37781de675fd3e892600"} Dec 05 13:21:07.621600 master-0 kubenswrapper[29936]: I1205 13:21:07.620545 29936 generic.go:334] "Generic (PLEG): container finished" podID="c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd" containerID="5c07ce980e9924dc73fbdef7eeee14e1c57177a16a8f37781de675fd3e892600" exitCode=0 Dec 05 13:21:07.621600 master-0 kubenswrapper[29936]: I1205 13:21:07.621483 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnc29" event={"ID":"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd","Type":"ContainerStarted","Data":"af79ff7fcf5717af168de74f4e82b96a3347d8521eba5b2752a65c391810508a"} Dec 05 13:21:08.637453 master-0 kubenswrapper[29936]: I1205 13:21:08.637285 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnc29" event={"ID":"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd","Type":"ContainerStarted","Data":"7dccf81db5053edc192949bfd38f4f54f2dd8cc19c7a278df2e255c38c12d8cd"} Dec 05 13:21:09.666990 master-0 kubenswrapper[29936]: I1205 13:21:09.666892 29936 generic.go:334] "Generic (PLEG): container finished" podID="c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd" containerID="7dccf81db5053edc192949bfd38f4f54f2dd8cc19c7a278df2e255c38c12d8cd" exitCode=0 Dec 05 13:21:09.667896 master-0 kubenswrapper[29936]: I1205 13:21:09.666992 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnc29" event={"ID":"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd","Type":"ContainerDied","Data":"7dccf81db5053edc192949bfd38f4f54f2dd8cc19c7a278df2e255c38c12d8cd"} Dec 05 13:21:10.683749 master-0 kubenswrapper[29936]: I1205 13:21:10.683553 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnc29" event={"ID":"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd","Type":"ContainerStarted","Data":"0c0a22dc24504a173dcd97904d6bbb632524dc32bf4298265127f023c03ea69b"} Dec 05 
13:21:12.851203 master-0 kubenswrapper[29936]: I1205 13:21:12.851071 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rnc29" podStartSLOduration=6.411209376 podStartE2EDuration="8.851049256s" podCreationTimestamp="2025-12-05 13:21:04 +0000 UTC" firstStartedPulling="2025-12-05 13:21:07.623246122 +0000 UTC m=+1864.755325803" lastFinishedPulling="2025-12-05 13:21:10.063086002 +0000 UTC m=+1867.195165683" observedRunningTime="2025-12-05 13:21:12.845154516 +0000 UTC m=+1869.977234197" watchObservedRunningTime="2025-12-05 13:21:12.851049256 +0000 UTC m=+1869.983128937" Dec 05 13:21:16.648549 master-0 kubenswrapper[29936]: I1205 13:21:16.648407 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:16.648549 master-0 kubenswrapper[29936]: I1205 13:21:16.648474 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:16.708961 master-0 kubenswrapper[29936]: I1205 13:21:16.708895 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:16.815812 master-0 kubenswrapper[29936]: I1205 13:21:16.815660 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:17.000044 master-0 kubenswrapper[29936]: I1205 13:21:16.999920 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rnc29"] Dec 05 13:21:18.788993 master-0 kubenswrapper[29936]: I1205 13:21:18.788558 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rnc29" podUID="c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd" containerName="registry-server" containerID="cri-o://0c0a22dc24504a173dcd97904d6bbb632524dc32bf4298265127f023c03ea69b" gracePeriod=2 Dec 05 13:21:23.864914 master-0 kubenswrapper[29936]: I1205 13:21:23.864807 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rnc29_c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd/registry-server/0.log" Dec 05 13:21:23.866079 master-0 kubenswrapper[29936]: I1205 13:21:23.866029 29936 generic.go:334] "Generic (PLEG): container finished" podID="c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd" containerID="0c0a22dc24504a173dcd97904d6bbb632524dc32bf4298265127f023c03ea69b" exitCode=137 Dec 05 13:21:23.866226 master-0 kubenswrapper[29936]: I1205 13:21:23.866119 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnc29" event={"ID":"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd","Type":"ContainerDied","Data":"0c0a22dc24504a173dcd97904d6bbb632524dc32bf4298265127f023c03ea69b"} Dec 05 13:21:25.299653 master-0 kubenswrapper[29936]: I1205 13:21:25.299607 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rnc29_c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd/registry-server/0.log" Dec 05 13:21:25.301407 master-0 kubenswrapper[29936]: I1205 13:21:25.301376 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:25.754085 master-0 kubenswrapper[29936]: I1205 13:21:25.754014 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lgwp\" (UniqueName: \"kubernetes.io/projected/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-kube-api-access-4lgwp\") pod \"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd\" (UID: \"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd\") " Dec 05 13:21:25.754953 master-0 kubenswrapper[29936]: I1205 13:21:25.754919 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-utilities\") pod \"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd\" (UID: \"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd\") " Dec 05 13:21:25.755467 master-0 kubenswrapper[29936]: I1205 13:21:25.755439 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-catalog-content\") pod \"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd\" (UID: \"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd\") " Dec 05 13:21:25.757140 master-0 kubenswrapper[29936]: I1205 13:21:25.756376 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-utilities" (OuterVolumeSpecName: "utilities") pod "c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd" (UID: "c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:21:25.758300 master-0 kubenswrapper[29936]: I1205 13:21:25.758255 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 13:21:25.789706 master-0 kubenswrapper[29936]: I1205 13:21:25.789614 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-kube-api-access-4lgwp" (OuterVolumeSpecName: "kube-api-access-4lgwp") pod "c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd" (UID: "c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd"). InnerVolumeSpecName "kube-api-access-4lgwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:21:25.864550 master-0 kubenswrapper[29936]: I1205 13:21:25.862937 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lgwp\" (UniqueName: \"kubernetes.io/projected/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-kube-api-access-4lgwp\") on node \"master-0\" DevicePath \"\"" Dec 05 13:21:25.899301 master-0 kubenswrapper[29936]: I1205 13:21:25.899141 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd" (UID: "c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:21:25.915100 master-0 kubenswrapper[29936]: I1205 13:21:25.915034 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rnc29_c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd/registry-server/0.log" Dec 05 13:21:25.918016 master-0 kubenswrapper[29936]: I1205 13:21:25.917162 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnc29" event={"ID":"c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd","Type":"ContainerDied","Data":"af79ff7fcf5717af168de74f4e82b96a3347d8521eba5b2752a65c391810508a"} Dec 05 13:21:25.918016 master-0 kubenswrapper[29936]: I1205 13:21:25.917311 29936 scope.go:117] "RemoveContainer" containerID="0c0a22dc24504a173dcd97904d6bbb632524dc32bf4298265127f023c03ea69b" Dec 05 13:21:25.918016 master-0 kubenswrapper[29936]: I1205 13:21:25.917636 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rnc29" Dec 05 13:21:25.990224 master-0 kubenswrapper[29936]: I1205 13:21:25.987745 29936 scope.go:117] "RemoveContainer" containerID="7dccf81db5053edc192949bfd38f4f54f2dd8cc19c7a278df2e255c38c12d8cd" Dec 05 13:21:25.990224 master-0 kubenswrapper[29936]: I1205 13:21:25.988634 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 13:21:26.011339 master-0 kubenswrapper[29936]: I1205 13:21:26.006100 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rnc29"] Dec 05 13:21:26.016423 master-0 kubenswrapper[29936]: I1205 13:21:26.016359 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rnc29"] Dec 05 13:21:26.036828 master-0 kubenswrapper[29936]: I1205 13:21:26.036754 29936 scope.go:117] "RemoveContainer" containerID="5c07ce980e9924dc73fbdef7eeee14e1c57177a16a8f37781de675fd3e892600" Dec 05 13:21:27.203666 master-0 kubenswrapper[29936]: I1205 13:21:27.203596 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd" path="/var/lib/kubelet/pods/c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd/volumes" Dec 05 13:30:00.188494 master-0 kubenswrapper[29936]: I1205 13:30:00.188400 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9"] Dec 05 13:30:00.189290 master-0 kubenswrapper[29936]: E1205 13:30:00.188875 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd" containerName="registry-server" Dec 05 13:30:00.189290 master-0 kubenswrapper[29936]: I1205 13:30:00.188901 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd" containerName="registry-server" Dec 05 13:30:00.189290 master-0 kubenswrapper[29936]: E1205 13:30:00.188917 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd" containerName="extract-utilities" Dec 05 13:30:00.189290 master-0 kubenswrapper[29936]: I1205 13:30:00.188923 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd" containerName="extract-utilities" Dec 05 13:30:00.189290 master-0 kubenswrapper[29936]: E1205 13:30:00.188944 29936 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd" containerName="extract-content" Dec 05 13:30:00.189290 master-0 kubenswrapper[29936]: I1205 13:30:00.188951 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd" containerName="extract-content" Dec 05 13:30:00.189290 master-0 kubenswrapper[29936]: I1205 13:30:00.189232 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ec16f2-c53a-4611-a1d1-7d3e9718ecbd" containerName="registry-server" Dec 05 13:30:00.189999 master-0 kubenswrapper[29936]: I1205 13:30:00.189971 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9" Dec 05 13:30:00.193491 master-0 kubenswrapper[29936]: I1205 13:30:00.193429 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 13:30:00.194703 master-0 kubenswrapper[29936]: I1205 13:30:00.194649 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-rdxkm" Dec 05 13:30:00.209036 master-0 kubenswrapper[29936]: I1205 13:30:00.208987 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9"] Dec 05 13:30:00.305420 master-0 kubenswrapper[29936]: I1205 13:30:00.305362 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4272d907-6b7b-41c3-a6e8-709c79e4ebab-config-volume\") pod \"collect-profiles-29415690-dmcq9\" (UID: \"4272d907-6b7b-41c3-a6e8-709c79e4ebab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9" Dec 05 13:30:00.305856 master-0 kubenswrapper[29936]: I1205 13:30:00.305837 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4272d907-6b7b-41c3-a6e8-709c79e4ebab-secret-volume\") pod \"collect-profiles-29415690-dmcq9\" (UID: \"4272d907-6b7b-41c3-a6e8-709c79e4ebab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9" Dec 05 13:30:00.305998 master-0 kubenswrapper[29936]: I1205 13:30:00.305983 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vlgb\" (UniqueName: \"kubernetes.io/projected/4272d907-6b7b-41c3-a6e8-709c79e4ebab-kube-api-access-8vlgb\") pod \"collect-profiles-29415690-dmcq9\" (UID: \"4272d907-6b7b-41c3-a6e8-709c79e4ebab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9" Dec 05 13:30:00.410789 master-0 kubenswrapper[29936]: I1205 13:30:00.410666 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4272d907-6b7b-41c3-a6e8-709c79e4ebab-secret-volume\") pod \"collect-profiles-29415690-dmcq9\" (UID: \"4272d907-6b7b-41c3-a6e8-709c79e4ebab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9" Dec 05 13:30:00.411055 master-0 kubenswrapper[29936]: I1205 13:30:00.410895 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vlgb\" (UniqueName: \"kubernetes.io/projected/4272d907-6b7b-41c3-a6e8-709c79e4ebab-kube-api-access-8vlgb\") pod \"collect-profiles-29415690-dmcq9\" (UID: \"4272d907-6b7b-41c3-a6e8-709c79e4ebab\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9" Dec 05 13:30:00.411112 master-0 kubenswrapper[29936]: I1205 13:30:00.411079 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4272d907-6b7b-41c3-a6e8-709c79e4ebab-config-volume\") pod \"collect-profiles-29415690-dmcq9\" (UID: \"4272d907-6b7b-41c3-a6e8-709c79e4ebab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9" Dec 05 13:30:00.414295 master-0 kubenswrapper[29936]: I1205 13:30:00.412576 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4272d907-6b7b-41c3-a6e8-709c79e4ebab-config-volume\") pod \"collect-profiles-29415690-dmcq9\" (UID: \"4272d907-6b7b-41c3-a6e8-709c79e4ebab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9" Dec 05 13:30:00.416352 master-0 kubenswrapper[29936]: I1205 13:30:00.416301 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4272d907-6b7b-41c3-a6e8-709c79e4ebab-secret-volume\") pod \"collect-profiles-29415690-dmcq9\" (UID: \"4272d907-6b7b-41c3-a6e8-709c79e4ebab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9" Dec 05 13:30:00.445864 master-0 kubenswrapper[29936]: I1205 13:30:00.445783 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vlgb\" (UniqueName: \"kubernetes.io/projected/4272d907-6b7b-41c3-a6e8-709c79e4ebab-kube-api-access-8vlgb\") pod \"collect-profiles-29415690-dmcq9\" (UID: \"4272d907-6b7b-41c3-a6e8-709c79e4ebab\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9" Dec 05 13:30:00.554583 master-0 kubenswrapper[29936]: I1205 13:30:00.554470 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9" Dec 05 13:30:01.049211 master-0 kubenswrapper[29936]: I1205 13:30:01.049062 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9"] Dec 05 13:30:01.878436 master-0 kubenswrapper[29936]: I1205 13:30:01.878332 29936 generic.go:334] "Generic (PLEG): container finished" podID="4272d907-6b7b-41c3-a6e8-709c79e4ebab" containerID="f40cac3ff6f31d614f566c039628f27eb546d6fff7d51b208d738f82ef527759" exitCode=0 Dec 05 13:30:01.879146 master-0 kubenswrapper[29936]: I1205 13:30:01.878444 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9" event={"ID":"4272d907-6b7b-41c3-a6e8-709c79e4ebab","Type":"ContainerDied","Data":"f40cac3ff6f31d614f566c039628f27eb546d6fff7d51b208d738f82ef527759"} Dec 05 13:30:01.879146 master-0 kubenswrapper[29936]: I1205 13:30:01.878486 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9" event={"ID":"4272d907-6b7b-41c3-a6e8-709c79e4ebab","Type":"ContainerStarted","Data":"27bc57ff44b64ff1a9f9350fd93ab0d1b03878f9510d72902790bc6519dda74c"} Dec 05 13:30:03.324900 master-0 kubenswrapper[29936]: I1205 13:30:03.324816 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9" Dec 05 13:30:03.503740 master-0 kubenswrapper[29936]: I1205 13:30:03.503550 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4272d907-6b7b-41c3-a6e8-709c79e4ebab-config-volume\") pod \"4272d907-6b7b-41c3-a6e8-709c79e4ebab\" (UID: \"4272d907-6b7b-41c3-a6e8-709c79e4ebab\") " Dec 05 13:30:03.503740 master-0 kubenswrapper[29936]: I1205 13:30:03.503721 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vlgb\" (UniqueName: \"kubernetes.io/projected/4272d907-6b7b-41c3-a6e8-709c79e4ebab-kube-api-access-8vlgb\") pod \"4272d907-6b7b-41c3-a6e8-709c79e4ebab\" (UID: \"4272d907-6b7b-41c3-a6e8-709c79e4ebab\") " Dec 05 13:30:03.504042 master-0 kubenswrapper[29936]: I1205 13:30:03.503769 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4272d907-6b7b-41c3-a6e8-709c79e4ebab-secret-volume\") pod \"4272d907-6b7b-41c3-a6e8-709c79e4ebab\" (UID: \"4272d907-6b7b-41c3-a6e8-709c79e4ebab\") " Dec 05 13:30:03.504581 master-0 kubenswrapper[29936]: I1205 13:30:03.504491 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4272d907-6b7b-41c3-a6e8-709c79e4ebab-config-volume" (OuterVolumeSpecName: "config-volume") pod "4272d907-6b7b-41c3-a6e8-709c79e4ebab" (UID: "4272d907-6b7b-41c3-a6e8-709c79e4ebab"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:30:03.507431 master-0 kubenswrapper[29936]: I1205 13:30:03.507321 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4272d907-6b7b-41c3-a6e8-709c79e4ebab-kube-api-access-8vlgb" (OuterVolumeSpecName: "kube-api-access-8vlgb") pod "4272d907-6b7b-41c3-a6e8-709c79e4ebab" (UID: "4272d907-6b7b-41c3-a6e8-709c79e4ebab"). InnerVolumeSpecName "kube-api-access-8vlgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:30:03.507662 master-0 kubenswrapper[29936]: I1205 13:30:03.507599 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4272d907-6b7b-41c3-a6e8-709c79e4ebab-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4272d907-6b7b-41c3-a6e8-709c79e4ebab" (UID: "4272d907-6b7b-41c3-a6e8-709c79e4ebab"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:30:03.606895 master-0 kubenswrapper[29936]: I1205 13:30:03.606828 29936 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4272d907-6b7b-41c3-a6e8-709c79e4ebab-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 05 13:30:03.606895 master-0 kubenswrapper[29936]: I1205 13:30:03.606885 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vlgb\" (UniqueName: \"kubernetes.io/projected/4272d907-6b7b-41c3-a6e8-709c79e4ebab-kube-api-access-8vlgb\") on node \"master-0\" DevicePath \"\"" Dec 05 13:30:03.606895 master-0 kubenswrapper[29936]: I1205 13:30:03.606899 29936 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4272d907-6b7b-41c3-a6e8-709c79e4ebab-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 05 13:30:03.902288 master-0 kubenswrapper[29936]: I1205 13:30:03.902101 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9" event={"ID":"4272d907-6b7b-41c3-a6e8-709c79e4ebab","Type":"ContainerDied","Data":"27bc57ff44b64ff1a9f9350fd93ab0d1b03878f9510d72902790bc6519dda74c"} Dec 05 13:30:03.902288 master-0 kubenswrapper[29936]: I1205 13:30:03.902152 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27bc57ff44b64ff1a9f9350fd93ab0d1b03878f9510d72902790bc6519dda74c" Dec 05 13:30:03.902288 master-0 kubenswrapper[29936]: I1205 13:30:03.902209 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415690-dmcq9" Dec 05 13:30:04.442812 master-0 kubenswrapper[29936]: I1205 13:30:04.442721 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv"] Dec 05 13:30:04.463089 master-0 kubenswrapper[29936]: I1205 13:30:04.462990 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415645-h72bv"] Dec 05 13:30:05.202021 master-0 kubenswrapper[29936]: I1205 13:30:05.201372 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="954c5c79-a96c-4c47-a4bc-024aaf4dc789" path="/var/lib/kubelet/pods/954c5c79-a96c-4c47-a4bc-024aaf4dc789/volumes" Dec 05 13:30:19.785278 master-0 kubenswrapper[29936]: I1205 13:30:19.785064 29936 scope.go:117] "RemoveContainer" containerID="6a64d74f0d5ef7e0f5020ef79722fa9a1cfa622ec3d5ca7d9169d099609498b7" Dec 05 13:31:39.741057 master-0 kubenswrapper[29936]: I1205 13:31:39.740947 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x9knp"] Dec 05 13:31:39.742659 master-0 kubenswrapper[29936]: E1205 13:31:39.742631 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4272d907-6b7b-41c3-a6e8-709c79e4ebab" containerName="collect-profiles" Dec 05 13:31:39.742730 master-0 kubenswrapper[29936]: I1205 13:31:39.742679 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="4272d907-6b7b-41c3-a6e8-709c79e4ebab" containerName="collect-profiles" Dec 05 13:31:39.743364 master-0 kubenswrapper[29936]: I1205 13:31:39.743341 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="4272d907-6b7b-41c3-a6e8-709c79e4ebab" containerName="collect-profiles" Dec 05 13:31:39.748467 master-0 kubenswrapper[29936]: I1205 13:31:39.748437 29936 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:39.820614 master-0 kubenswrapper[29936]: I1205 13:31:39.820475 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x9knp"] Dec 05 13:31:39.844744 master-0 kubenswrapper[29936]: I1205 13:31:39.844643 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks769\" (UniqueName: \"kubernetes.io/projected/3ba59227-47c3-4611-9668-aa723fc93972-kube-api-access-ks769\") pod \"community-operators-x9knp\" (UID: \"3ba59227-47c3-4611-9668-aa723fc93972\") " pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:39.846940 master-0 kubenswrapper[29936]: I1205 13:31:39.846840 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ba59227-47c3-4611-9668-aa723fc93972-catalog-content\") pod \"community-operators-x9knp\" (UID: \"3ba59227-47c3-4611-9668-aa723fc93972\") " pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:39.848220 master-0 kubenswrapper[29936]: I1205 13:31:39.847381 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ba59227-47c3-4611-9668-aa723fc93972-utilities\") pod \"community-operators-x9knp\" (UID: \"3ba59227-47c3-4611-9668-aa723fc93972\") " pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:39.950161 master-0 kubenswrapper[29936]: I1205 13:31:39.950063 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ba59227-47c3-4611-9668-aa723fc93972-utilities\") pod \"community-operators-x9knp\" (UID: \"3ba59227-47c3-4611-9668-aa723fc93972\") " pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:39.950579 master-0 kubenswrapper[29936]: I1205 13:31:39.950298 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks769\" (UniqueName: \"kubernetes.io/projected/3ba59227-47c3-4611-9668-aa723fc93972-kube-api-access-ks769\") pod \"community-operators-x9knp\" (UID: \"3ba59227-47c3-4611-9668-aa723fc93972\") " pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:39.950579 master-0 kubenswrapper[29936]: I1205 13:31:39.950403 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ba59227-47c3-4611-9668-aa723fc93972-catalog-content\") pod \"community-operators-x9knp\" (UID: \"3ba59227-47c3-4611-9668-aa723fc93972\") " pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:39.953019 master-0 kubenswrapper[29936]: I1205 13:31:39.951311 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ba59227-47c3-4611-9668-aa723fc93972-utilities\") pod \"community-operators-x9knp\" (UID: \"3ba59227-47c3-4611-9668-aa723fc93972\") " pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:39.953019 master-0 kubenswrapper[29936]: I1205 13:31:39.951409 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ba59227-47c3-4611-9668-aa723fc93972-catalog-content\") pod \"community-operators-x9knp\" (UID: \"3ba59227-47c3-4611-9668-aa723fc93972\") " 
pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:39.969699 master-0 kubenswrapper[29936]: I1205 13:31:39.969609 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks769\" (UniqueName: \"kubernetes.io/projected/3ba59227-47c3-4611-9668-aa723fc93972-kube-api-access-ks769\") pod \"community-operators-x9knp\" (UID: \"3ba59227-47c3-4611-9668-aa723fc93972\") " pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:40.116172 master-0 kubenswrapper[29936]: I1205 13:31:40.115994 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:40.690445 master-0 kubenswrapper[29936]: I1205 13:31:40.690375 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x9knp"] Dec 05 13:31:40.692451 master-0 kubenswrapper[29936]: W1205 13:31:40.692266 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ba59227_47c3_4611_9668_aa723fc93972.slice/crio-a732d725fdf64091075770715d4ff1cec57cadac593421b46c354692eab7342c WatchSource:0}: Error finding container a732d725fdf64091075770715d4ff1cec57cadac593421b46c354692eab7342c: Status 404 returned error can't find the container with id a732d725fdf64091075770715d4ff1cec57cadac593421b46c354692eab7342c Dec 05 13:31:41.123452 master-0 kubenswrapper[29936]: I1205 13:31:41.123385 29936 generic.go:334] "Generic (PLEG): container finished" podID="3ba59227-47c3-4611-9668-aa723fc93972" containerID="76b444b50fcade46d5d5bde4b9ee31a028489262ff2d003d230024c5f50cf736" exitCode=0 Dec 05 13:31:41.123452 master-0 kubenswrapper[29936]: I1205 13:31:41.123448 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9knp" event={"ID":"3ba59227-47c3-4611-9668-aa723fc93972","Type":"ContainerDied","Data":"76b444b50fcade46d5d5bde4b9ee31a028489262ff2d003d230024c5f50cf736"} Dec 05 13:31:41.124121 master-0 kubenswrapper[29936]: I1205 13:31:41.123493 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9knp" event={"ID":"3ba59227-47c3-4611-9668-aa723fc93972","Type":"ContainerStarted","Data":"a732d725fdf64091075770715d4ff1cec57cadac593421b46c354692eab7342c"} Dec 05 13:31:41.125923 master-0 kubenswrapper[29936]: I1205 13:31:41.125881 29936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 13:31:42.137996 master-0 kubenswrapper[29936]: I1205 13:31:42.137926 29936 generic.go:334] "Generic (PLEG): container finished" podID="3ba59227-47c3-4611-9668-aa723fc93972" containerID="573e040a9c7d46bc075a5d23e429db503d9e689474e3d3a849cbcafe563fd154" exitCode=0 Dec 05 13:31:42.138953 master-0 kubenswrapper[29936]: I1205 13:31:42.137992 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9knp" event={"ID":"3ba59227-47c3-4611-9668-aa723fc93972","Type":"ContainerDied","Data":"573e040a9c7d46bc075a5d23e429db503d9e689474e3d3a849cbcafe563fd154"} Dec 05 13:31:43.155945 master-0 kubenswrapper[29936]: I1205 13:31:43.155784 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9knp" event={"ID":"3ba59227-47c3-4611-9668-aa723fc93972","Type":"ContainerStarted","Data":"7f61868f29667202cd43fdcca073040ba013215ecc16e2580e79529b750dbf21"} Dec 05 13:31:43.189572 master-0 kubenswrapper[29936]: I1205 
13:31:43.185163 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x9knp" podStartSLOduration=2.766247282 podStartE2EDuration="4.185136078s" podCreationTimestamp="2025-12-05 13:31:39 +0000 UTC" firstStartedPulling="2025-12-05 13:31:41.125814765 +0000 UTC m=+2498.257894446" lastFinishedPulling="2025-12-05 13:31:42.544703561 +0000 UTC m=+2499.676783242" observedRunningTime="2025-12-05 13:31:43.178338064 +0000 UTC m=+2500.310417765" watchObservedRunningTime="2025-12-05 13:31:43.185136078 +0000 UTC m=+2500.317215759" Dec 05 13:31:50.117278 master-0 kubenswrapper[29936]: I1205 13:31:50.117127 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:50.117278 master-0 kubenswrapper[29936]: I1205 13:31:50.117229 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:50.189172 master-0 kubenswrapper[29936]: I1205 13:31:50.188882 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:50.330201 master-0 kubenswrapper[29936]: I1205 13:31:50.330090 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:50.436764 master-0 kubenswrapper[29936]: I1205 13:31:50.436679 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x9knp"] Dec 05 13:31:52.278521 master-0 kubenswrapper[29936]: I1205 13:31:52.278407 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x9knp" podUID="3ba59227-47c3-4611-9668-aa723fc93972" containerName="registry-server" containerID="cri-o://7f61868f29667202cd43fdcca073040ba013215ecc16e2580e79529b750dbf21" gracePeriod=2 Dec 05 13:31:55.337051 master-0 kubenswrapper[29936]: I1205 13:31:55.336953 29936 generic.go:334] "Generic (PLEG): container finished" podID="3ba59227-47c3-4611-9668-aa723fc93972" containerID="7f61868f29667202cd43fdcca073040ba013215ecc16e2580e79529b750dbf21" exitCode=0 Dec 05 13:31:55.337051 master-0 kubenswrapper[29936]: I1205 13:31:55.337011 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9knp" event={"ID":"3ba59227-47c3-4611-9668-aa723fc93972","Type":"ContainerDied","Data":"7f61868f29667202cd43fdcca073040ba013215ecc16e2580e79529b750dbf21"} Dec 05 13:31:56.828498 master-0 kubenswrapper[29936]: I1205 13:31:56.828447 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:56.939451 master-0 kubenswrapper[29936]: I1205 13:31:56.939192 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ba59227-47c3-4611-9668-aa723fc93972-utilities\") pod \"3ba59227-47c3-4611-9668-aa723fc93972\" (UID: \"3ba59227-47c3-4611-9668-aa723fc93972\") " Dec 05 13:31:56.940094 master-0 kubenswrapper[29936]: I1205 13:31:56.940036 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ba59227-47c3-4611-9668-aa723fc93972-utilities" (OuterVolumeSpecName: "utilities") pod "3ba59227-47c3-4611-9668-aa723fc93972" (UID: "3ba59227-47c3-4611-9668-aa723fc93972"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:31:56.942303 master-0 kubenswrapper[29936]: I1205 13:31:56.942261 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ba59227-47c3-4611-9668-aa723fc93972-catalog-content\") pod \"3ba59227-47c3-4611-9668-aa723fc93972\" (UID: \"3ba59227-47c3-4611-9668-aa723fc93972\") " Dec 05 13:31:56.942531 master-0 kubenswrapper[29936]: I1205 13:31:56.942503 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks769\" (UniqueName: \"kubernetes.io/projected/3ba59227-47c3-4611-9668-aa723fc93972-kube-api-access-ks769\") pod \"3ba59227-47c3-4611-9668-aa723fc93972\" (UID: \"3ba59227-47c3-4611-9668-aa723fc93972\") " Dec 05 13:31:56.943363 master-0 kubenswrapper[29936]: I1205 13:31:56.943331 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ba59227-47c3-4611-9668-aa723fc93972-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 13:31:56.945479 master-0 kubenswrapper[29936]: I1205 13:31:56.945439 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ba59227-47c3-4611-9668-aa723fc93972-kube-api-access-ks769" (OuterVolumeSpecName: "kube-api-access-ks769") pod "3ba59227-47c3-4611-9668-aa723fc93972" (UID: "3ba59227-47c3-4611-9668-aa723fc93972"). InnerVolumeSpecName "kube-api-access-ks769". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:31:57.002308 master-0 kubenswrapper[29936]: I1205 13:31:57.002227 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ba59227-47c3-4611-9668-aa723fc93972-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ba59227-47c3-4611-9668-aa723fc93972" (UID: "3ba59227-47c3-4611-9668-aa723fc93972"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:31:57.045853 master-0 kubenswrapper[29936]: I1205 13:31:57.045784 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks769\" (UniqueName: \"kubernetes.io/projected/3ba59227-47c3-4611-9668-aa723fc93972-kube-api-access-ks769\") on node \"master-0\" DevicePath \"\"" Dec 05 13:31:57.045853 master-0 kubenswrapper[29936]: I1205 13:31:57.045835 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ba59227-47c3-4611-9668-aa723fc93972-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 13:31:57.390809 master-0 kubenswrapper[29936]: I1205 13:31:57.390623 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x9knp" event={"ID":"3ba59227-47c3-4611-9668-aa723fc93972","Type":"ContainerDied","Data":"a732d725fdf64091075770715d4ff1cec57cadac593421b46c354692eab7342c"} Dec 05 13:31:57.390809 master-0 kubenswrapper[29936]: I1205 13:31:57.390691 29936 scope.go:117] "RemoveContainer" containerID="7f61868f29667202cd43fdcca073040ba013215ecc16e2580e79529b750dbf21" Dec 05 13:31:57.391162 master-0 kubenswrapper[29936]: I1205 13:31:57.390838 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x9knp" Dec 05 13:31:57.433018 master-0 kubenswrapper[29936]: I1205 13:31:57.432202 29936 scope.go:117] "RemoveContainer" containerID="573e040a9c7d46bc075a5d23e429db503d9e689474e3d3a849cbcafe563fd154" Dec 05 13:31:57.446354 master-0 kubenswrapper[29936]: I1205 13:31:57.446242 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x9knp"] Dec 05 13:31:57.456977 master-0 kubenswrapper[29936]: I1205 13:31:57.456724 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x9knp"] Dec 05 13:31:57.479867 master-0 kubenswrapper[29936]: I1205 13:31:57.479786 29936 scope.go:117] "RemoveContainer" containerID="76b444b50fcade46d5d5bde4b9ee31a028489262ff2d003d230024c5f50cf736" Dec 05 13:31:59.206543 master-0 kubenswrapper[29936]: I1205 13:31:59.206494 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ba59227-47c3-4611-9668-aa723fc93972" path="/var/lib/kubelet/pods/3ba59227-47c3-4611-9668-aa723fc93972/volumes" Dec 05 13:34:19.607510 master-0 kubenswrapper[29936]: I1205 13:34:19.607384 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h6cw9"] Dec 05 13:34:19.608312 master-0 kubenswrapper[29936]: E1205 13:34:19.607978 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ba59227-47c3-4611-9668-aa723fc93972" containerName="registry-server" Dec 05 13:34:19.608312 master-0 kubenswrapper[29936]: I1205 13:34:19.607995 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba59227-47c3-4611-9668-aa723fc93972" containerName="registry-server" Dec 05 13:34:19.608312 master-0 kubenswrapper[29936]: E1205 13:34:19.608012 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ba59227-47c3-4611-9668-aa723fc93972" containerName="extract-utilities" Dec 05 13:34:19.608312 master-0 kubenswrapper[29936]: I1205 13:34:19.608021 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba59227-47c3-4611-9668-aa723fc93972" containerName="extract-utilities" Dec 05 13:34:19.608312 master-0 kubenswrapper[29936]: E1205 13:34:19.608095 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ba59227-47c3-4611-9668-aa723fc93972" containerName="extract-content" Dec 05 13:34:19.608312 master-0 kubenswrapper[29936]: I1205 13:34:19.608101 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ba59227-47c3-4611-9668-aa723fc93972" containerName="extract-content" Dec 05 13:34:19.608943 master-0 kubenswrapper[29936]: I1205 13:34:19.608890 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ba59227-47c3-4611-9668-aa723fc93972" containerName="registry-server" Dec 05 13:34:19.611397 master-0 kubenswrapper[29936]: I1205 13:34:19.611365 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:19.615894 master-0 kubenswrapper[29936]: I1205 13:34:19.615747 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81030cc2-a049-482c-be0c-a707851abf43-catalog-content\") pod \"redhat-marketplace-h6cw9\" (UID: \"81030cc2-a049-482c-be0c-a707851abf43\") " pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:19.616634 master-0 kubenswrapper[29936]: I1205 13:34:19.616585 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4mtn\" (UniqueName: \"kubernetes.io/projected/81030cc2-a049-482c-be0c-a707851abf43-kube-api-access-n4mtn\") pod \"redhat-marketplace-h6cw9\" (UID: \"81030cc2-a049-482c-be0c-a707851abf43\") " pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:19.616713 master-0 kubenswrapper[29936]: I1205 13:34:19.616663 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81030cc2-a049-482c-be0c-a707851abf43-utilities\") pod \"redhat-marketplace-h6cw9\" (UID: \"81030cc2-a049-482c-be0c-a707851abf43\") " pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:19.680826 master-0 kubenswrapper[29936]: I1205 13:34:19.680754 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6cw9"] Dec 05 13:34:19.726146 master-0 kubenswrapper[29936]: I1205 13:34:19.726044 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4mtn\" (UniqueName: \"kubernetes.io/projected/81030cc2-a049-482c-be0c-a707851abf43-kube-api-access-n4mtn\") pod \"redhat-marketplace-h6cw9\" (UID: \"81030cc2-a049-482c-be0c-a707851abf43\") " pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:19.726146 master-0 kubenswrapper[29936]: I1205 13:34:19.726127 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81030cc2-a049-482c-be0c-a707851abf43-utilities\") pod \"redhat-marketplace-h6cw9\" (UID: \"81030cc2-a049-482c-be0c-a707851abf43\") " pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:19.726560 master-0 kubenswrapper[29936]: I1205 13:34:19.726224 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81030cc2-a049-482c-be0c-a707851abf43-catalog-content\") pod \"redhat-marketplace-h6cw9\" (UID: \"81030cc2-a049-482c-be0c-a707851abf43\") " pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:19.726843 master-0 kubenswrapper[29936]: I1205 13:34:19.726814 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81030cc2-a049-482c-be0c-a707851abf43-catalog-content\") pod \"redhat-marketplace-h6cw9\" (UID: \"81030cc2-a049-482c-be0c-a707851abf43\") " pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:19.727374 master-0 kubenswrapper[29936]: I1205 13:34:19.727343 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81030cc2-a049-482c-be0c-a707851abf43-utilities\") pod \"redhat-marketplace-h6cw9\" (UID: \"81030cc2-a049-482c-be0c-a707851abf43\") " 
pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:20.249556 master-0 kubenswrapper[29936]: I1205 13:34:20.249434 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fmfqd"] Dec 05 13:34:20.252596 master-0 kubenswrapper[29936]: I1205 13:34:20.252547 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:20.441992 master-0 kubenswrapper[29936]: I1205 13:34:20.441876 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9c335c-671c-4875-8e16-bb7c4db198d5-utilities\") pod \"redhat-operators-fmfqd\" (UID: \"8b9c335c-671c-4875-8e16-bb7c4db198d5\") " pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:20.442380 master-0 kubenswrapper[29936]: I1205 13:34:20.442052 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmc87\" (UniqueName: \"kubernetes.io/projected/8b9c335c-671c-4875-8e16-bb7c4db198d5-kube-api-access-rmc87\") pod \"redhat-operators-fmfqd\" (UID: \"8b9c335c-671c-4875-8e16-bb7c4db198d5\") " pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:20.442380 master-0 kubenswrapper[29936]: I1205 13:34:20.442218 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9c335c-671c-4875-8e16-bb7c4db198d5-catalog-content\") pod \"redhat-operators-fmfqd\" (UID: \"8b9c335c-671c-4875-8e16-bb7c4db198d5\") " pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:20.545402 master-0 kubenswrapper[29936]: I1205 13:34:20.545201 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9c335c-671c-4875-8e16-bb7c4db198d5-catalog-content\") pod \"redhat-operators-fmfqd\" (UID: \"8b9c335c-671c-4875-8e16-bb7c4db198d5\") " pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:20.545916 master-0 kubenswrapper[29936]: I1205 13:34:20.545897 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9c335c-671c-4875-8e16-bb7c4db198d5-utilities\") pod \"redhat-operators-fmfqd\" (UID: \"8b9c335c-671c-4875-8e16-bb7c4db198d5\") " pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:20.546104 master-0 kubenswrapper[29936]: I1205 13:34:20.546081 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmc87\" (UniqueName: \"kubernetes.io/projected/8b9c335c-671c-4875-8e16-bb7c4db198d5-kube-api-access-rmc87\") pod \"redhat-operators-fmfqd\" (UID: \"8b9c335c-671c-4875-8e16-bb7c4db198d5\") " pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:20.546304 master-0 kubenswrapper[29936]: I1205 13:34:20.545974 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9c335c-671c-4875-8e16-bb7c4db198d5-catalog-content\") pod \"redhat-operators-fmfqd\" (UID: \"8b9c335c-671c-4875-8e16-bb7c4db198d5\") " pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:20.546418 master-0 kubenswrapper[29936]: I1205 13:34:20.546347 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8b9c335c-671c-4875-8e16-bb7c4db198d5-utilities\") pod \"redhat-operators-fmfqd\" (UID: \"8b9c335c-671c-4875-8e16-bb7c4db198d5\") " pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:21.179223 master-0 kubenswrapper[29936]: I1205 13:34:21.179133 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fmfqd"] Dec 05 13:34:21.211279 master-0 kubenswrapper[29936]: I1205 13:34:21.209752 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmc87\" (UniqueName: \"kubernetes.io/projected/8b9c335c-671c-4875-8e16-bb7c4db198d5-kube-api-access-rmc87\") pod \"redhat-operators-fmfqd\" (UID: \"8b9c335c-671c-4875-8e16-bb7c4db198d5\") " pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:21.223246 master-0 kubenswrapper[29936]: I1205 13:34:21.221017 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4mtn\" (UniqueName: \"kubernetes.io/projected/81030cc2-a049-482c-be0c-a707851abf43-kube-api-access-n4mtn\") pod \"redhat-marketplace-h6cw9\" (UID: \"81030cc2-a049-482c-be0c-a707851abf43\") " pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:21.440863 master-0 kubenswrapper[29936]: I1205 13:34:21.440788 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:21.486156 master-0 kubenswrapper[29936]: I1205 13:34:21.486076 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:22.604288 master-0 kubenswrapper[29936]: I1205 13:34:22.601907 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6cw9"] Dec 05 13:34:22.638194 master-0 kubenswrapper[29936]: I1205 13:34:22.638015 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fmfqd"] Dec 05 13:34:22.772544 master-0 kubenswrapper[29936]: I1205 13:34:22.772411 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s9s7h"] Dec 05 13:34:22.776675 master-0 kubenswrapper[29936]: I1205 13:34:22.776586 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:22.845296 master-0 kubenswrapper[29936]: I1205 13:34:22.843602 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9s7h"] Dec 05 13:34:22.879796 master-0 kubenswrapper[29936]: I1205 13:34:22.879717 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfr54\" (UniqueName: \"kubernetes.io/projected/259e8b75-6590-4765-b974-2f736623be89-kube-api-access-bfr54\") pod \"certified-operators-s9s7h\" (UID: \"259e8b75-6590-4765-b974-2f736623be89\") " pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:22.880118 master-0 kubenswrapper[29936]: I1205 13:34:22.879975 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259e8b75-6590-4765-b974-2f736623be89-utilities\") pod \"certified-operators-s9s7h\" (UID: \"259e8b75-6590-4765-b974-2f736623be89\") " pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:22.880343 master-0 kubenswrapper[29936]: I1205 13:34:22.880318 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259e8b75-6590-4765-b974-2f736623be89-catalog-content\") pod \"certified-operators-s9s7h\" (UID: \"259e8b75-6590-4765-b974-2f736623be89\") " pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:22.986221 master-0 kubenswrapper[29936]: I1205 13:34:22.984134 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259e8b75-6590-4765-b974-2f736623be89-catalog-content\") pod \"certified-operators-s9s7h\" (UID: \"259e8b75-6590-4765-b974-2f736623be89\") " pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:22.986221 master-0 kubenswrapper[29936]: I1205 13:34:22.984320 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfr54\" (UniqueName: \"kubernetes.io/projected/259e8b75-6590-4765-b974-2f736623be89-kube-api-access-bfr54\") pod \"certified-operators-s9s7h\" (UID: \"259e8b75-6590-4765-b974-2f736623be89\") " pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:22.986221 master-0 kubenswrapper[29936]: I1205 13:34:22.984410 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259e8b75-6590-4765-b974-2f736623be89-utilities\") pod \"certified-operators-s9s7h\" (UID: \"259e8b75-6590-4765-b974-2f736623be89\") " pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:22.986221 master-0 kubenswrapper[29936]: I1205 13:34:22.984836 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259e8b75-6590-4765-b974-2f736623be89-catalog-content\") pod \"certified-operators-s9s7h\" (UID: \"259e8b75-6590-4765-b974-2f736623be89\") " pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:22.986221 master-0 kubenswrapper[29936]: I1205 13:34:22.985285 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259e8b75-6590-4765-b974-2f736623be89-utilities\") pod \"certified-operators-s9s7h\" (UID: \"259e8b75-6590-4765-b974-2f736623be89\") " 
pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:23.093137 master-0 kubenswrapper[29936]: I1205 13:34:23.093066 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfr54\" (UniqueName: \"kubernetes.io/projected/259e8b75-6590-4765-b974-2f736623be89-kube-api-access-bfr54\") pod \"certified-operators-s9s7h\" (UID: \"259e8b75-6590-4765-b974-2f736623be89\") " pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:23.100424 master-0 kubenswrapper[29936]: I1205 13:34:23.100310 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:23.608194 master-0 kubenswrapper[29936]: I1205 13:34:23.608084 29936 generic.go:334] "Generic (PLEG): container finished" podID="81030cc2-a049-482c-be0c-a707851abf43" containerID="24551a347446e1a07d83557b7d0794eb677ed35a09756517c3549b8596becc03" exitCode=0 Dec 05 13:34:23.608194 master-0 kubenswrapper[29936]: I1205 13:34:23.608202 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6cw9" event={"ID":"81030cc2-a049-482c-be0c-a707851abf43","Type":"ContainerDied","Data":"24551a347446e1a07d83557b7d0794eb677ed35a09756517c3549b8596becc03"} Dec 05 13:34:23.608834 master-0 kubenswrapper[29936]: I1205 13:34:23.608244 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6cw9" event={"ID":"81030cc2-a049-482c-be0c-a707851abf43","Type":"ContainerStarted","Data":"5dd1f6455e15407c23115ed8f7f2b4b07ce960a18e4236a08c85842dddabbdfc"} Dec 05 13:34:23.619317 master-0 kubenswrapper[29936]: I1205 13:34:23.619249 29936 generic.go:334] "Generic (PLEG): container finished" podID="8b9c335c-671c-4875-8e16-bb7c4db198d5" containerID="66548a4429b05aa8fc439ad1d17439ad4afb0080f1dc169f9fbc71588618c42d" exitCode=0 Dec 05 13:34:23.619317 master-0 kubenswrapper[29936]: I1205 13:34:23.619320 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmfqd" event={"ID":"8b9c335c-671c-4875-8e16-bb7c4db198d5","Type":"ContainerDied","Data":"66548a4429b05aa8fc439ad1d17439ad4afb0080f1dc169f9fbc71588618c42d"} Dec 05 13:34:23.619568 master-0 kubenswrapper[29936]: I1205 13:34:23.619356 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmfqd" event={"ID":"8b9c335c-671c-4875-8e16-bb7c4db198d5","Type":"ContainerStarted","Data":"3fc3c66c5f6df06946c4d3dd3f03f59e48b3d80791699f8562e51d40910f1c8b"} Dec 05 13:34:23.758641 master-0 kubenswrapper[29936]: I1205 13:34:23.758504 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s9s7h"] Dec 05 13:34:23.759287 master-0 kubenswrapper[29936]: W1205 13:34:23.759247 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod259e8b75_6590_4765_b974_2f736623be89.slice/crio-67a726907f00f9f1fcd63e4daeaa074d52a54dd1d9a066c6647f071f54a2c430 WatchSource:0}: Error finding container 67a726907f00f9f1fcd63e4daeaa074d52a54dd1d9a066c6647f071f54a2c430: Status 404 returned error can't find the container with id 67a726907f00f9f1fcd63e4daeaa074d52a54dd1d9a066c6647f071f54a2c430 Dec 05 13:34:24.636774 master-0 kubenswrapper[29936]: I1205 13:34:24.636644 29936 generic.go:334] "Generic (PLEG): container finished" podID="259e8b75-6590-4765-b974-2f736623be89" containerID="e7d25e7a1242457efe8b5094875f8134551b3a2d87c0f8df03021365972fe5bf" 
exitCode=0 Dec 05 13:34:24.636774 master-0 kubenswrapper[29936]: I1205 13:34:24.636721 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9s7h" event={"ID":"259e8b75-6590-4765-b974-2f736623be89","Type":"ContainerDied","Data":"e7d25e7a1242457efe8b5094875f8134551b3a2d87c0f8df03021365972fe5bf"} Dec 05 13:34:24.636774 master-0 kubenswrapper[29936]: I1205 13:34:24.636774 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9s7h" event={"ID":"259e8b75-6590-4765-b974-2f736623be89","Type":"ContainerStarted","Data":"67a726907f00f9f1fcd63e4daeaa074d52a54dd1d9a066c6647f071f54a2c430"} Dec 05 13:34:36.103146 master-0 kubenswrapper[29936]: I1205 13:34:36.103027 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" podUID="ee9e12a6-a899-4e44-b4e1-d975493b6b9c" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.154:8081/readyz\": dial tcp 10.128.0.154:8081: connect: connection refused" Dec 05 13:34:36.104062 master-0 kubenswrapper[29936]: I1205 13:34:36.103956 29936 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" podUID="ee9e12a6-a899-4e44-b4e1-d975493b6b9c" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.154:8081/healthz\": dial tcp 10.128.0.154:8081: connect: connection refused" Dec 05 13:34:36.735100 master-0 kubenswrapper[29936]: I1205 13:34:36.734279 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" podUID="49bd6523-c715-46b2-8112-070019badeed" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.160:8081/readyz\": dial tcp 10.128.0.160:8081: connect: connection refused" Dec 05 13:34:36.735437 master-0 kubenswrapper[29936]: I1205 13:34:36.735371 29936 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" podUID="49bd6523-c715-46b2-8112-070019badeed" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.160:8081/healthz\": dial tcp 10.128.0.160:8081: connect: connection refused" Dec 05 13:34:36.795998 master-0 kubenswrapper[29936]: I1205 13:34:36.795928 29936 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" podUID="6e8afa75-0149-45e2-8015-1c519267961c" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.166:8081/healthz\": dial tcp 10.128.0.166:8081: connect: connection refused" Dec 05 13:34:36.796247 master-0 kubenswrapper[29936]: I1205 13:34:36.796063 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" podUID="6e8afa75-0149-45e2-8015-1c519267961c" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.166:8081/readyz\": dial tcp 10.128.0.166:8081: connect: connection refused" Dec 05 13:34:36.802198 master-0 kubenswrapper[29936]: I1205 13:34:36.802121 29936 generic.go:334] "Generic (PLEG): container finished" podID="49bd6523-c715-46b2-8112-070019badeed" containerID="ed07333b823e7ca2042f0e70dca1b26b97185ed863a7ae192cee1e6f24ade680" exitCode=1 Dec 05 13:34:36.802722 master-0 kubenswrapper[29936]: I1205 13:34:36.802222 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" event={"ID":"49bd6523-c715-46b2-8112-070019badeed","Type":"ContainerDied","Data":"ed07333b823e7ca2042f0e70dca1b26b97185ed863a7ae192cee1e6f24ade680"} Dec 05 13:34:36.802722 master-0 kubenswrapper[29936]: I1205 13:34:36.802317 29936 scope.go:117] "RemoveContainer" containerID="45e5fcf585d5814ad5b0cbe7f317814e0dfe349afc792a92ca00c4f55bebf638" Dec 05 13:34:36.805217 master-0 kubenswrapper[29936]: I1205 13:34:36.805165 29936 scope.go:117] "RemoveContainer" containerID="ed07333b823e7ca2042f0e70dca1b26b97185ed863a7ae192cee1e6f24ade680" Dec 05 13:34:36.806678 master-0 kubenswrapper[29936]: I1205 13:34:36.806647 29936 generic.go:334] "Generic (PLEG): container finished" podID="ee9e12a6-a899-4e44-b4e1-d975493b6b9c" containerID="6478ca3e4a608a849d9b81d8c96477d5c26519d2b05077c7dffed7ca76e18042" exitCode=1 Dec 05 13:34:36.806802 master-0 kubenswrapper[29936]: I1205 13:34:36.806744 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" event={"ID":"ee9e12a6-a899-4e44-b4e1-d975493b6b9c","Type":"ContainerDied","Data":"6478ca3e4a608a849d9b81d8c96477d5c26519d2b05077c7dffed7ca76e18042"} Dec 05 13:34:36.807769 master-0 kubenswrapper[29936]: I1205 13:34:36.807726 29936 scope.go:117] "RemoveContainer" containerID="6478ca3e4a608a849d9b81d8c96477d5c26519d2b05077c7dffed7ca76e18042" Dec 05 13:34:36.809301 master-0 kubenswrapper[29936]: I1205 13:34:36.809244 29936 generic.go:334] "Generic (PLEG): container finished" podID="81030cc2-a049-482c-be0c-a707851abf43" containerID="a45e474480d07b92ae1f8e0945a41c71244b323b213924f229f570cea47cebeb" exitCode=0 Dec 05 13:34:36.809301 master-0 kubenswrapper[29936]: I1205 13:34:36.809279 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6cw9" event={"ID":"81030cc2-a049-482c-be0c-a707851abf43","Type":"ContainerDied","Data":"a45e474480d07b92ae1f8e0945a41c71244b323b213924f229f570cea47cebeb"} Dec 05 13:34:36.813662 master-0 kubenswrapper[29936]: I1205 13:34:36.813588 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmfqd" event={"ID":"8b9c335c-671c-4875-8e16-bb7c4db198d5","Type":"ContainerStarted","Data":"b7d4f3bc4ab5029d068a567c8e884688bba06d8f6c61acf0f850f7a4150ae55d"} Dec 05 13:34:36.819007 master-0 kubenswrapper[29936]: I1205 13:34:36.818613 29936 generic.go:334] "Generic (PLEG): container finished" podID="f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9" containerID="321161d05cd0c512e03cc9c7c5780c21ae2336cde18ecdbcb95e96b72bc3ead8" exitCode=1 Dec 05 13:34:36.819007 master-0 kubenswrapper[29936]: I1205 13:34:36.818676 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" event={"ID":"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9","Type":"ContainerDied","Data":"321161d05cd0c512e03cc9c7c5780c21ae2336cde18ecdbcb95e96b72bc3ead8"} Dec 05 13:34:36.819646 master-0 kubenswrapper[29936]: I1205 13:34:36.819608 29936 scope.go:117] "RemoveContainer" containerID="321161d05cd0c512e03cc9c7c5780c21ae2336cde18ecdbcb95e96b72bc3ead8" Dec 05 13:34:36.825475 master-0 kubenswrapper[29936]: I1205 13:34:36.824207 29936 generic.go:334] "Generic (PLEG): container finished" podID="6e8afa75-0149-45e2-8015-1c519267961c" containerID="3fd38880aae6161003b43e52213fa9ca64e765c2f83ed746ca6892842d567d56" exitCode=1 Dec 05 13:34:36.825475 master-0 kubenswrapper[29936]: I1205 13:34:36.824298 
29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" event={"ID":"6e8afa75-0149-45e2-8015-1c519267961c","Type":"ContainerDied","Data":"3fd38880aae6161003b43e52213fa9ca64e765c2f83ed746ca6892842d567d56"} Dec 05 13:34:36.825475 master-0 kubenswrapper[29936]: I1205 13:34:36.825193 29936 scope.go:117] "RemoveContainer" containerID="3fd38880aae6161003b43e52213fa9ca64e765c2f83ed746ca6892842d567d56" Dec 05 13:34:36.832099 master-0 kubenswrapper[29936]: I1205 13:34:36.832051 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9s7h" event={"ID":"259e8b75-6590-4765-b974-2f736623be89","Type":"ContainerStarted","Data":"6ced79883e4a07e78108314f5ea5d0de5229fecb2d0423d50d490ec32029f3be"} Dec 05 13:34:37.113711 master-0 kubenswrapper[29936]: I1205 13:34:37.113653 29936 scope.go:117] "RemoveContainer" containerID="96b6da81f888d72f7ddf48309b989a8dd4260e98e2589be7969922301c257b92" Dec 05 13:34:37.263565 master-0 kubenswrapper[29936]: I1205 13:34:37.261820 29936 scope.go:117] "RemoveContainer" containerID="1bf2e35c67c3218c13bc82b693bd3640510210e6eaed8961cfe29daa9ba8cb73" Dec 05 13:34:37.443231 master-0 kubenswrapper[29936]: I1205 13:34:37.440580 29936 scope.go:117] "RemoveContainer" containerID="15762cebb5368bb9e3aef28eeef55563c81d2c20b15cc1b9953470093d4e003d" Dec 05 13:34:37.853281 master-0 kubenswrapper[29936]: I1205 13:34:37.853211 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" event={"ID":"49bd6523-c715-46b2-8112-070019badeed","Type":"ContainerStarted","Data":"be6b51c8a9c25ea472591d695e0703c3461da1ed55a5bf80bc71675606ed753b"} Dec 05 13:34:37.853540 master-0 kubenswrapper[29936]: I1205 13:34:37.853469 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" Dec 05 13:34:37.855126 master-0 kubenswrapper[29936]: I1205 13:34:37.855083 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" event={"ID":"ee9e12a6-a899-4e44-b4e1-d975493b6b9c","Type":"ContainerStarted","Data":"92946e79e42ccee6e7edee1342f606f779f54a62c7411a6c4f32a2fad4b94334"} Dec 05 13:34:37.855341 master-0 kubenswrapper[29936]: I1205 13:34:37.855315 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" Dec 05 13:34:37.856830 master-0 kubenswrapper[29936]: I1205 13:34:37.856794 29936 generic.go:334] "Generic (PLEG): container finished" podID="8b9c335c-671c-4875-8e16-bb7c4db198d5" containerID="b7d4f3bc4ab5029d068a567c8e884688bba06d8f6c61acf0f850f7a4150ae55d" exitCode=0 Dec 05 13:34:37.856890 master-0 kubenswrapper[29936]: I1205 13:34:37.856848 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmfqd" event={"ID":"8b9c335c-671c-4875-8e16-bb7c4db198d5","Type":"ContainerDied","Data":"b7d4f3bc4ab5029d068a567c8e884688bba06d8f6c61acf0f850f7a4150ae55d"} Dec 05 13:34:37.862317 master-0 kubenswrapper[29936]: I1205 13:34:37.862229 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" event={"ID":"f8b5103d-0778-42b9-a7ce-9b99a2c4a1a9","Type":"ContainerStarted","Data":"1a83829b97579ef9950c105c61d10baeb601cf930fded2ea02b668d4c45f662b"} Dec 05 
Dec 05 13:34:37.863827 master-0 kubenswrapper[29936]: I1205 13:34:37.863163 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9"
Dec 05 13:34:37.868521 master-0 kubenswrapper[29936]: I1205 13:34:37.868461 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" event={"ID":"6e8afa75-0149-45e2-8015-1c519267961c","Type":"ContainerStarted","Data":"e674b3edd75b1ccf0508a968cf090d8f1334f0a2a9ca10e99303335e4056bd80"}
Dec 05 13:34:37.870040 master-0 kubenswrapper[29936]: I1205 13:34:37.869667 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2"
Dec 05 13:34:37.880473 master-0 kubenswrapper[29936]: I1205 13:34:37.880405 29936 generic.go:334] "Generic (PLEG): container finished" podID="259e8b75-6590-4765-b974-2f736623be89" containerID="6ced79883e4a07e78108314f5ea5d0de5229fecb2d0423d50d490ec32029f3be" exitCode=0
Dec 05 13:34:37.880554 master-0 kubenswrapper[29936]: I1205 13:34:37.880494 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9s7h" event={"ID":"259e8b75-6590-4765-b974-2f736623be89","Type":"ContainerDied","Data":"6ced79883e4a07e78108314f5ea5d0de5229fecb2d0423d50d490ec32029f3be"}
Dec 05 13:34:38.895355 master-0 kubenswrapper[29936]: I1205 13:34:38.895111 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9s7h" event={"ID":"259e8b75-6590-4765-b974-2f736623be89","Type":"ContainerStarted","Data":"615de7585bd4c7373e362d8fd7597934348d5ade949cb7438c48e2a131e736f8"}
Dec 05 13:34:38.899673 master-0 kubenswrapper[29936]: I1205 13:34:38.899600 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6cw9" event={"ID":"81030cc2-a049-482c-be0c-a707851abf43","Type":"ContainerStarted","Data":"60246767692a8ff19eb3bbb7330e32710e5fe544e2885b177c6f68b1a253067f"}
Dec 05 13:34:38.903418 master-0 kubenswrapper[29936]: I1205 13:34:38.903248 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmfqd" event={"ID":"8b9c335c-671c-4875-8e16-bb7c4db198d5","Type":"ContainerStarted","Data":"ce2e5f67dbdc801f8f409b34e1f7ba5429d6d7eafb9b8c78b70e3f7e66bf8839"}
Dec 05 13:34:38.966951 master-0 kubenswrapper[29936]: I1205 13:34:38.962117 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s9s7h" podStartSLOduration=3.990599703 podStartE2EDuration="16.962093113s" podCreationTimestamp="2025-12-05 13:34:22 +0000 UTC" firstStartedPulling="2025-12-05 13:34:24.683283582 +0000 UTC m=+2661.815363283" lastFinishedPulling="2025-12-05 13:34:37.654777012 +0000 UTC m=+2674.786856693" observedRunningTime="2025-12-05 13:34:38.934355907 +0000 UTC m=+2676.066435598" watchObservedRunningTime="2025-12-05 13:34:38.962093113 +0000 UTC m=+2676.094172804"
m=+2675.408449465" observedRunningTime="2025-12-05 13:34:38.978920821 +0000 UTC m=+2676.111000502" watchObservedRunningTime="2025-12-05 13:34:38.980373751 +0000 UTC m=+2676.112453432" Dec 05 13:34:39.042361 master-0 kubenswrapper[29936]: I1205 13:34:39.042242 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h6cw9" podStartSLOduration=5.98241802 podStartE2EDuration="20.042208315s" podCreationTimestamp="2025-12-05 13:34:19 +0000 UTC" firstStartedPulling="2025-12-05 13:34:23.612643548 +0000 UTC m=+2660.744723229" lastFinishedPulling="2025-12-05 13:34:37.672433833 +0000 UTC m=+2674.804513524" observedRunningTime="2025-12-05 13:34:39.029605472 +0000 UTC m=+2676.161685173" watchObservedRunningTime="2025-12-05 13:34:39.042208315 +0000 UTC m=+2676.174288006" Dec 05 13:34:41.441546 master-0 kubenswrapper[29936]: I1205 13:34:41.441457 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:41.441546 master-0 kubenswrapper[29936]: I1205 13:34:41.441533 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:41.486809 master-0 kubenswrapper[29936]: I1205 13:34:41.486727 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:41.486809 master-0 kubenswrapper[29936]: I1205 13:34:41.486818 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:41.494886 master-0 kubenswrapper[29936]: I1205 13:34:41.494833 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:42.286833 master-0 kubenswrapper[29936]: I1205 13:34:42.286713 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f998f5746gr6c9" Dec 05 13:34:42.545381 master-0 kubenswrapper[29936]: I1205 13:34:42.545042 29936 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fmfqd" podUID="8b9c335c-671c-4875-8e16-bb7c4db198d5" containerName="registry-server" probeResult="failure" output=< Dec 05 13:34:42.545381 master-0 kubenswrapper[29936]: timeout: failed to connect service ":50051" within 1s Dec 05 13:34:42.545381 master-0 kubenswrapper[29936]: > Dec 05 13:34:43.100811 master-0 kubenswrapper[29936]: I1205 13:34:43.100746 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:43.101062 master-0 kubenswrapper[29936]: I1205 13:34:43.100828 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:43.277396 master-0 kubenswrapper[29936]: I1205 13:34:43.277344 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:44.028871 master-0 kubenswrapper[29936]: I1205 13:34:44.028751 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:44.906857 master-0 kubenswrapper[29936]: I1205 13:34:44.906763 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s9s7h"] Dec 05 
13:34:45.992557 master-0 kubenswrapper[29936]: I1205 13:34:45.992443 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s9s7h" podUID="259e8b75-6590-4765-b974-2f736623be89" containerName="registry-server" containerID="cri-o://615de7585bd4c7373e362d8fd7597934348d5ade949cb7438c48e2a131e736f8" gracePeriod=2 Dec 05 13:34:46.116427 master-0 kubenswrapper[29936]: I1205 13:34:46.116343 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-c6t95" Dec 05 13:34:46.678632 master-0 kubenswrapper[29936]: I1205 13:34:46.678592 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:46.735329 master-0 kubenswrapper[29936]: I1205 13:34:46.735258 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-pt7jr" Dec 05 13:34:46.779483 master-0 kubenswrapper[29936]: I1205 13:34:46.779420 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259e8b75-6590-4765-b974-2f736623be89-catalog-content\") pod \"259e8b75-6590-4765-b974-2f736623be89\" (UID: \"259e8b75-6590-4765-b974-2f736623be89\") " Dec 05 13:34:46.779706 master-0 kubenswrapper[29936]: I1205 13:34:46.779575 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259e8b75-6590-4765-b974-2f736623be89-utilities\") pod \"259e8b75-6590-4765-b974-2f736623be89\" (UID: \"259e8b75-6590-4765-b974-2f736623be89\") " Dec 05 13:34:46.780023 master-0 kubenswrapper[29936]: I1205 13:34:46.779995 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfr54\" (UniqueName: \"kubernetes.io/projected/259e8b75-6590-4765-b974-2f736623be89-kube-api-access-bfr54\") pod \"259e8b75-6590-4765-b974-2f736623be89\" (UID: \"259e8b75-6590-4765-b974-2f736623be89\") " Dec 05 13:34:46.783324 master-0 kubenswrapper[29936]: I1205 13:34:46.783283 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/259e8b75-6590-4765-b974-2f736623be89-utilities" (OuterVolumeSpecName: "utilities") pod "259e8b75-6590-4765-b974-2f736623be89" (UID: "259e8b75-6590-4765-b974-2f736623be89"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:34:46.796421 master-0 kubenswrapper[29936]: I1205 13:34:46.796124 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/259e8b75-6590-4765-b974-2f736623be89-kube-api-access-bfr54" (OuterVolumeSpecName: "kube-api-access-bfr54") pod "259e8b75-6590-4765-b974-2f736623be89" (UID: "259e8b75-6590-4765-b974-2f736623be89"). InnerVolumeSpecName "kube-api-access-bfr54". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:34:46.805249 master-0 kubenswrapper[29936]: I1205 13:34:46.805210 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-t6nq2" Dec 05 13:34:46.882983 master-0 kubenswrapper[29936]: I1205 13:34:46.882453 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/259e8b75-6590-4765-b974-2f736623be89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "259e8b75-6590-4765-b974-2f736623be89" (UID: "259e8b75-6590-4765-b974-2f736623be89"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:34:46.882983 master-0 kubenswrapper[29936]: I1205 13:34:46.882651 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259e8b75-6590-4765-b974-2f736623be89-catalog-content\") pod \"259e8b75-6590-4765-b974-2f736623be89\" (UID: \"259e8b75-6590-4765-b974-2f736623be89\") " Dec 05 13:34:46.882983 master-0 kubenswrapper[29936]: W1205 13:34:46.882796 29936 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/259e8b75-6590-4765-b974-2f736623be89/volumes/kubernetes.io~empty-dir/catalog-content Dec 05 13:34:46.882983 master-0 kubenswrapper[29936]: I1205 13:34:46.882806 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/259e8b75-6590-4765-b974-2f736623be89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "259e8b75-6590-4765-b974-2f736623be89" (UID: "259e8b75-6590-4765-b974-2f736623be89"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:34:46.888290 master-0 kubenswrapper[29936]: I1205 13:34:46.888218 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259e8b75-6590-4765-b974-2f736623be89-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 13:34:46.888588 master-0 kubenswrapper[29936]: I1205 13:34:46.888295 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259e8b75-6590-4765-b974-2f736623be89-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 13:34:46.888588 master-0 kubenswrapper[29936]: I1205 13:34:46.888313 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfr54\" (UniqueName: \"kubernetes.io/projected/259e8b75-6590-4765-b974-2f736623be89-kube-api-access-bfr54\") on node \"master-0\" DevicePath \"\"" Dec 05 13:34:47.006526 master-0 kubenswrapper[29936]: I1205 13:34:47.006449 29936 generic.go:334] "Generic (PLEG): container finished" podID="259e8b75-6590-4765-b974-2f736623be89" containerID="615de7585bd4c7373e362d8fd7597934348d5ade949cb7438c48e2a131e736f8" exitCode=0 Dec 05 13:34:47.006526 master-0 kubenswrapper[29936]: I1205 13:34:47.006509 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9s7h" event={"ID":"259e8b75-6590-4765-b974-2f736623be89","Type":"ContainerDied","Data":"615de7585bd4c7373e362d8fd7597934348d5ade949cb7438c48e2a131e736f8"} Dec 05 13:34:47.007138 master-0 kubenswrapper[29936]: I1205 13:34:47.006581 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s9s7h" 
event={"ID":"259e8b75-6590-4765-b974-2f736623be89","Type":"ContainerDied","Data":"67a726907f00f9f1fcd63e4daeaa074d52a54dd1d9a066c6647f071f54a2c430"} Dec 05 13:34:47.007138 master-0 kubenswrapper[29936]: I1205 13:34:47.006608 29936 scope.go:117] "RemoveContainer" containerID="615de7585bd4c7373e362d8fd7597934348d5ade949cb7438c48e2a131e736f8" Dec 05 13:34:47.007138 master-0 kubenswrapper[29936]: I1205 13:34:47.006627 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s9s7h" Dec 05 13:34:47.030255 master-0 kubenswrapper[29936]: I1205 13:34:47.028067 29936 scope.go:117] "RemoveContainer" containerID="6ced79883e4a07e78108314f5ea5d0de5229fecb2d0423d50d490ec32029f3be" Dec 05 13:34:47.078357 master-0 kubenswrapper[29936]: I1205 13:34:47.074483 29936 scope.go:117] "RemoveContainer" containerID="e7d25e7a1242457efe8b5094875f8134551b3a2d87c0f8df03021365972fe5bf" Dec 05 13:34:47.097381 master-0 kubenswrapper[29936]: I1205 13:34:47.097324 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s9s7h"] Dec 05 13:34:47.108525 master-0 kubenswrapper[29936]: I1205 13:34:47.108444 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s9s7h"] Dec 05 13:34:47.117118 master-0 kubenswrapper[29936]: I1205 13:34:47.117039 29936 scope.go:117] "RemoveContainer" containerID="615de7585bd4c7373e362d8fd7597934348d5ade949cb7438c48e2a131e736f8" Dec 05 13:34:47.117677 master-0 kubenswrapper[29936]: E1205 13:34:47.117622 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"615de7585bd4c7373e362d8fd7597934348d5ade949cb7438c48e2a131e736f8\": container with ID starting with 615de7585bd4c7373e362d8fd7597934348d5ade949cb7438c48e2a131e736f8 not found: ID does not exist" containerID="615de7585bd4c7373e362d8fd7597934348d5ade949cb7438c48e2a131e736f8" Dec 05 13:34:47.117763 master-0 kubenswrapper[29936]: I1205 13:34:47.117665 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"615de7585bd4c7373e362d8fd7597934348d5ade949cb7438c48e2a131e736f8"} err="failed to get container status \"615de7585bd4c7373e362d8fd7597934348d5ade949cb7438c48e2a131e736f8\": rpc error: code = NotFound desc = could not find container \"615de7585bd4c7373e362d8fd7597934348d5ade949cb7438c48e2a131e736f8\": container with ID starting with 615de7585bd4c7373e362d8fd7597934348d5ade949cb7438c48e2a131e736f8 not found: ID does not exist" Dec 05 13:34:47.117763 master-0 kubenswrapper[29936]: I1205 13:34:47.117699 29936 scope.go:117] "RemoveContainer" containerID="6ced79883e4a07e78108314f5ea5d0de5229fecb2d0423d50d490ec32029f3be" Dec 05 13:34:47.118248 master-0 kubenswrapper[29936]: E1205 13:34:47.118199 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ced79883e4a07e78108314f5ea5d0de5229fecb2d0423d50d490ec32029f3be\": container with ID starting with 6ced79883e4a07e78108314f5ea5d0de5229fecb2d0423d50d490ec32029f3be not found: ID does not exist" containerID="6ced79883e4a07e78108314f5ea5d0de5229fecb2d0423d50d490ec32029f3be" Dec 05 13:34:47.118322 master-0 kubenswrapper[29936]: I1205 13:34:47.118257 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ced79883e4a07e78108314f5ea5d0de5229fecb2d0423d50d490ec32029f3be"} err="failed to get container status 
\"6ced79883e4a07e78108314f5ea5d0de5229fecb2d0423d50d490ec32029f3be\": rpc error: code = NotFound desc = could not find container \"6ced79883e4a07e78108314f5ea5d0de5229fecb2d0423d50d490ec32029f3be\": container with ID starting with 6ced79883e4a07e78108314f5ea5d0de5229fecb2d0423d50d490ec32029f3be not found: ID does not exist" Dec 05 13:34:47.118322 master-0 kubenswrapper[29936]: I1205 13:34:47.118275 29936 scope.go:117] "RemoveContainer" containerID="e7d25e7a1242457efe8b5094875f8134551b3a2d87c0f8df03021365972fe5bf" Dec 05 13:34:47.118735 master-0 kubenswrapper[29936]: E1205 13:34:47.118682 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7d25e7a1242457efe8b5094875f8134551b3a2d87c0f8df03021365972fe5bf\": container with ID starting with e7d25e7a1242457efe8b5094875f8134551b3a2d87c0f8df03021365972fe5bf not found: ID does not exist" containerID="e7d25e7a1242457efe8b5094875f8134551b3a2d87c0f8df03021365972fe5bf" Dec 05 13:34:47.118807 master-0 kubenswrapper[29936]: I1205 13:34:47.118737 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7d25e7a1242457efe8b5094875f8134551b3a2d87c0f8df03021365972fe5bf"} err="failed to get container status \"e7d25e7a1242457efe8b5094875f8134551b3a2d87c0f8df03021365972fe5bf\": rpc error: code = NotFound desc = could not find container \"e7d25e7a1242457efe8b5094875f8134551b3a2d87c0f8df03021365972fe5bf\": container with ID starting with e7d25e7a1242457efe8b5094875f8134551b3a2d87c0f8df03021365972fe5bf not found: ID does not exist" Dec 05 13:34:47.199907 master-0 kubenswrapper[29936]: I1205 13:34:47.199831 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="259e8b75-6590-4765-b974-2f736623be89" path="/var/lib/kubelet/pods/259e8b75-6590-4765-b974-2f736623be89/volumes" Dec 05 13:34:51.498555 master-0 kubenswrapper[29936]: I1205 13:34:51.498504 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:51.544123 master-0 kubenswrapper[29936]: I1205 13:34:51.544063 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:51.599810 master-0 kubenswrapper[29936]: I1205 13:34:51.599756 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:52.853292 master-0 kubenswrapper[29936]: I1205 13:34:52.853175 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6cw9"] Dec 05 13:34:52.854000 master-0 kubenswrapper[29936]: I1205 13:34:52.853479 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h6cw9" podUID="81030cc2-a049-482c-be0c-a707851abf43" containerName="registry-server" containerID="cri-o://60246767692a8ff19eb3bbb7330e32710e5fe544e2885b177c6f68b1a253067f" gracePeriod=2 Dec 05 13:34:53.848670 master-0 kubenswrapper[29936]: I1205 13:34:53.848620 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:53.850481 master-0 kubenswrapper[29936]: I1205 13:34:53.850427 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fmfqd"] Dec 05 13:34:53.850691 master-0 kubenswrapper[29936]: I1205 13:34:53.850658 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fmfqd" podUID="8b9c335c-671c-4875-8e16-bb7c4db198d5" containerName="registry-server" containerID="cri-o://ce2e5f67dbdc801f8f409b34e1f7ba5429d6d7eafb9b8c78b70e3f7e66bf8839" gracePeriod=2 Dec 05 13:34:53.981058 master-0 kubenswrapper[29936]: I1205 13:34:53.981001 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81030cc2-a049-482c-be0c-a707851abf43-utilities\") pod \"81030cc2-a049-482c-be0c-a707851abf43\" (UID: \"81030cc2-a049-482c-be0c-a707851abf43\") " Dec 05 13:34:53.984446 master-0 kubenswrapper[29936]: I1205 13:34:53.981202 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81030cc2-a049-482c-be0c-a707851abf43-catalog-content\") pod \"81030cc2-a049-482c-be0c-a707851abf43\" (UID: \"81030cc2-a049-482c-be0c-a707851abf43\") " Dec 05 13:34:53.984446 master-0 kubenswrapper[29936]: I1205 13:34:53.981455 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4mtn\" (UniqueName: \"kubernetes.io/projected/81030cc2-a049-482c-be0c-a707851abf43-kube-api-access-n4mtn\") pod \"81030cc2-a049-482c-be0c-a707851abf43\" (UID: \"81030cc2-a049-482c-be0c-a707851abf43\") " Dec 05 13:34:53.984446 master-0 kubenswrapper[29936]: I1205 13:34:53.983871 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81030cc2-a049-482c-be0c-a707851abf43-utilities" (OuterVolumeSpecName: "utilities") pod "81030cc2-a049-482c-be0c-a707851abf43" (UID: "81030cc2-a049-482c-be0c-a707851abf43"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:34:53.984446 master-0 kubenswrapper[29936]: I1205 13:34:53.984322 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81030cc2-a049-482c-be0c-a707851abf43-kube-api-access-n4mtn" (OuterVolumeSpecName: "kube-api-access-n4mtn") pod "81030cc2-a049-482c-be0c-a707851abf43" (UID: "81030cc2-a049-482c-be0c-a707851abf43"). InnerVolumeSpecName "kube-api-access-n4mtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:34:54.000560 master-0 kubenswrapper[29936]: I1205 13:34:54.000482 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81030cc2-a049-482c-be0c-a707851abf43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81030cc2-a049-482c-be0c-a707851abf43" (UID: "81030cc2-a049-482c-be0c-a707851abf43"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:34:54.084517 master-0 kubenswrapper[29936]: I1205 13:34:54.084423 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4mtn\" (UniqueName: \"kubernetes.io/projected/81030cc2-a049-482c-be0c-a707851abf43-kube-api-access-n4mtn\") on node \"master-0\" DevicePath \"\"" Dec 05 13:34:54.084517 master-0 kubenswrapper[29936]: I1205 13:34:54.084483 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81030cc2-a049-482c-be0c-a707851abf43-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 13:34:54.084517 master-0 kubenswrapper[29936]: I1205 13:34:54.084514 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81030cc2-a049-482c-be0c-a707851abf43-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 13:34:54.097366 master-0 kubenswrapper[29936]: I1205 13:34:54.095352 29936 generic.go:334] "Generic (PLEG): container finished" podID="81030cc2-a049-482c-be0c-a707851abf43" containerID="60246767692a8ff19eb3bbb7330e32710e5fe544e2885b177c6f68b1a253067f" exitCode=0 Dec 05 13:34:54.097366 master-0 kubenswrapper[29936]: I1205 13:34:54.095424 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6cw9" Dec 05 13:34:54.097366 master-0 kubenswrapper[29936]: I1205 13:34:54.095446 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6cw9" event={"ID":"81030cc2-a049-482c-be0c-a707851abf43","Type":"ContainerDied","Data":"60246767692a8ff19eb3bbb7330e32710e5fe544e2885b177c6f68b1a253067f"} Dec 05 13:34:54.097366 master-0 kubenswrapper[29936]: I1205 13:34:54.095482 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6cw9" event={"ID":"81030cc2-a049-482c-be0c-a707851abf43","Type":"ContainerDied","Data":"5dd1f6455e15407c23115ed8f7f2b4b07ce960a18e4236a08c85842dddabbdfc"} Dec 05 13:34:54.097366 master-0 kubenswrapper[29936]: I1205 13:34:54.095503 29936 scope.go:117] "RemoveContainer" containerID="60246767692a8ff19eb3bbb7330e32710e5fe544e2885b177c6f68b1a253067f" Dec 05 13:34:54.099771 master-0 kubenswrapper[29936]: I1205 13:34:54.099724 29936 generic.go:334] "Generic (PLEG): container finished" podID="8b9c335c-671c-4875-8e16-bb7c4db198d5" containerID="ce2e5f67dbdc801f8f409b34e1f7ba5429d6d7eafb9b8c78b70e3f7e66bf8839" exitCode=0 Dec 05 13:34:54.099842 master-0 kubenswrapper[29936]: I1205 13:34:54.099777 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmfqd" event={"ID":"8b9c335c-671c-4875-8e16-bb7c4db198d5","Type":"ContainerDied","Data":"ce2e5f67dbdc801f8f409b34e1f7ba5429d6d7eafb9b8c78b70e3f7e66bf8839"} Dec 05 13:34:54.120086 master-0 kubenswrapper[29936]: I1205 13:34:54.120020 29936 scope.go:117] "RemoveContainer" containerID="a45e474480d07b92ae1f8e0945a41c71244b323b213924f229f570cea47cebeb" Dec 05 13:34:54.159811 master-0 kubenswrapper[29936]: I1205 13:34:54.151195 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6cw9"] Dec 05 13:34:54.161793 master-0 kubenswrapper[29936]: I1205 13:34:54.161683 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6cw9"] Dec 05 13:34:54.192402 master-0 kubenswrapper[29936]: I1205 13:34:54.192367 29936 scope.go:117] "RemoveContainer" 
containerID="24551a347446e1a07d83557b7d0794eb677ed35a09756517c3549b8596becc03" Dec 05 13:34:54.217571 master-0 kubenswrapper[29936]: I1205 13:34:54.217525 29936 scope.go:117] "RemoveContainer" containerID="60246767692a8ff19eb3bbb7330e32710e5fe544e2885b177c6f68b1a253067f" Dec 05 13:34:54.218034 master-0 kubenswrapper[29936]: E1205 13:34:54.217993 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60246767692a8ff19eb3bbb7330e32710e5fe544e2885b177c6f68b1a253067f\": container with ID starting with 60246767692a8ff19eb3bbb7330e32710e5fe544e2885b177c6f68b1a253067f not found: ID does not exist" containerID="60246767692a8ff19eb3bbb7330e32710e5fe544e2885b177c6f68b1a253067f" Dec 05 13:34:54.218109 master-0 kubenswrapper[29936]: I1205 13:34:54.218042 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60246767692a8ff19eb3bbb7330e32710e5fe544e2885b177c6f68b1a253067f"} err="failed to get container status \"60246767692a8ff19eb3bbb7330e32710e5fe544e2885b177c6f68b1a253067f\": rpc error: code = NotFound desc = could not find container \"60246767692a8ff19eb3bbb7330e32710e5fe544e2885b177c6f68b1a253067f\": container with ID starting with 60246767692a8ff19eb3bbb7330e32710e5fe544e2885b177c6f68b1a253067f not found: ID does not exist" Dec 05 13:34:54.218109 master-0 kubenswrapper[29936]: I1205 13:34:54.218073 29936 scope.go:117] "RemoveContainer" containerID="a45e474480d07b92ae1f8e0945a41c71244b323b213924f229f570cea47cebeb" Dec 05 13:34:54.218520 master-0 kubenswrapper[29936]: E1205 13:34:54.218495 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a45e474480d07b92ae1f8e0945a41c71244b323b213924f229f570cea47cebeb\": container with ID starting with a45e474480d07b92ae1f8e0945a41c71244b323b213924f229f570cea47cebeb not found: ID does not exist" containerID="a45e474480d07b92ae1f8e0945a41c71244b323b213924f229f570cea47cebeb" Dec 05 13:34:54.218614 master-0 kubenswrapper[29936]: I1205 13:34:54.218526 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45e474480d07b92ae1f8e0945a41c71244b323b213924f229f570cea47cebeb"} err="failed to get container status \"a45e474480d07b92ae1f8e0945a41c71244b323b213924f229f570cea47cebeb\": rpc error: code = NotFound desc = could not find container \"a45e474480d07b92ae1f8e0945a41c71244b323b213924f229f570cea47cebeb\": container with ID starting with a45e474480d07b92ae1f8e0945a41c71244b323b213924f229f570cea47cebeb not found: ID does not exist" Dec 05 13:34:54.218614 master-0 kubenswrapper[29936]: I1205 13:34:54.218592 29936 scope.go:117] "RemoveContainer" containerID="24551a347446e1a07d83557b7d0794eb677ed35a09756517c3549b8596becc03" Dec 05 13:34:54.218946 master-0 kubenswrapper[29936]: E1205 13:34:54.218906 29936 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24551a347446e1a07d83557b7d0794eb677ed35a09756517c3549b8596becc03\": container with ID starting with 24551a347446e1a07d83557b7d0794eb677ed35a09756517c3549b8596becc03 not found: ID does not exist" containerID="24551a347446e1a07d83557b7d0794eb677ed35a09756517c3549b8596becc03" Dec 05 13:34:54.219056 master-0 kubenswrapper[29936]: I1205 13:34:54.219028 29936 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24551a347446e1a07d83557b7d0794eb677ed35a09756517c3549b8596becc03"} 
err="failed to get container status \"24551a347446e1a07d83557b7d0794eb677ed35a09756517c3549b8596becc03\": rpc error: code = NotFound desc = could not find container \"24551a347446e1a07d83557b7d0794eb677ed35a09756517c3549b8596becc03\": container with ID starting with 24551a347446e1a07d83557b7d0794eb677ed35a09756517c3549b8596becc03 not found: ID does not exist" Dec 05 13:34:54.383910 master-0 kubenswrapper[29936]: I1205 13:34:54.383848 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:54.502039 master-0 kubenswrapper[29936]: I1205 13:34:54.501984 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9c335c-671c-4875-8e16-bb7c4db198d5-utilities\") pod \"8b9c335c-671c-4875-8e16-bb7c4db198d5\" (UID: \"8b9c335c-671c-4875-8e16-bb7c4db198d5\") " Dec 05 13:34:54.502334 master-0 kubenswrapper[29936]: I1205 13:34:54.502079 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9c335c-671c-4875-8e16-bb7c4db198d5-catalog-content\") pod \"8b9c335c-671c-4875-8e16-bb7c4db198d5\" (UID: \"8b9c335c-671c-4875-8e16-bb7c4db198d5\") " Dec 05 13:34:54.502556 master-0 kubenswrapper[29936]: I1205 13:34:54.502530 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmc87\" (UniqueName: \"kubernetes.io/projected/8b9c335c-671c-4875-8e16-bb7c4db198d5-kube-api-access-rmc87\") pod \"8b9c335c-671c-4875-8e16-bb7c4db198d5\" (UID: \"8b9c335c-671c-4875-8e16-bb7c4db198d5\") " Dec 05 13:34:54.511283 master-0 kubenswrapper[29936]: I1205 13:34:54.507949 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9c335c-671c-4875-8e16-bb7c4db198d5-kube-api-access-rmc87" (OuterVolumeSpecName: "kube-api-access-rmc87") pod "8b9c335c-671c-4875-8e16-bb7c4db198d5" (UID: "8b9c335c-671c-4875-8e16-bb7c4db198d5"). InnerVolumeSpecName "kube-api-access-rmc87". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:34:54.511283 master-0 kubenswrapper[29936]: I1205 13:34:54.508676 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b9c335c-671c-4875-8e16-bb7c4db198d5-utilities" (OuterVolumeSpecName: "utilities") pod "8b9c335c-671c-4875-8e16-bb7c4db198d5" (UID: "8b9c335c-671c-4875-8e16-bb7c4db198d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:34:54.605675 master-0 kubenswrapper[29936]: I1205 13:34:54.605557 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9c335c-671c-4875-8e16-bb7c4db198d5-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 13:34:54.605675 master-0 kubenswrapper[29936]: I1205 13:34:54.605602 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmc87\" (UniqueName: \"kubernetes.io/projected/8b9c335c-671c-4875-8e16-bb7c4db198d5-kube-api-access-rmc87\") on node \"master-0\" DevicePath \"\"" Dec 05 13:34:54.628708 master-0 kubenswrapper[29936]: I1205 13:34:54.628638 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b9c335c-671c-4875-8e16-bb7c4db198d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b9c335c-671c-4875-8e16-bb7c4db198d5" (UID: "8b9c335c-671c-4875-8e16-bb7c4db198d5"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:34:54.708770 master-0 kubenswrapper[29936]: I1205 13:34:54.708692 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9c335c-671c-4875-8e16-bb7c4db198d5-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 13:34:55.117797 master-0 kubenswrapper[29936]: I1205 13:34:55.117736 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmfqd" event={"ID":"8b9c335c-671c-4875-8e16-bb7c4db198d5","Type":"ContainerDied","Data":"3fc3c66c5f6df06946c4d3dd3f03f59e48b3d80791699f8562e51d40910f1c8b"} Dec 05 13:34:55.118393 master-0 kubenswrapper[29936]: I1205 13:34:55.117811 29936 scope.go:117] "RemoveContainer" containerID="ce2e5f67dbdc801f8f409b34e1f7ba5429d6d7eafb9b8c78b70e3f7e66bf8839" Dec 05 13:34:55.118393 master-0 kubenswrapper[29936]: I1205 13:34:55.117852 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fmfqd" Dec 05 13:34:55.142013 master-0 kubenswrapper[29936]: I1205 13:34:55.141961 29936 scope.go:117] "RemoveContainer" containerID="b7d4f3bc4ab5029d068a567c8e884688bba06d8f6c61acf0f850f7a4150ae55d" Dec 05 13:34:55.183489 master-0 kubenswrapper[29936]: I1205 13:34:55.183424 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fmfqd"] Dec 05 13:34:55.200071 master-0 kubenswrapper[29936]: I1205 13:34:55.200010 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81030cc2-a049-482c-be0c-a707851abf43" path="/var/lib/kubelet/pods/81030cc2-a049-482c-be0c-a707851abf43/volumes" Dec 05 13:34:55.200774 master-0 kubenswrapper[29936]: I1205 13:34:55.200751 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fmfqd"] Dec 05 13:34:55.364446 master-0 kubenswrapper[29936]: I1205 13:34:55.364372 29936 scope.go:117] "RemoveContainer" containerID="66548a4429b05aa8fc439ad1d17439ad4afb0080f1dc169f9fbc71588618c42d" Dec 05 13:34:57.200442 master-0 kubenswrapper[29936]: I1205 13:34:57.200368 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b9c335c-671c-4875-8e16-bb7c4db198d5" path="/var/lib/kubelet/pods/8b9c335c-671c-4875-8e16-bb7c4db198d5/volumes" Dec 05 13:38:57.391286 master-0 kubenswrapper[29936]: I1205 13:38:57.391152 29936 trace.go:236] Trace[1177465454]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-0" (05-Dec-2025 13:38:56.176) (total time: 1214ms): Dec 05 13:38:57.391286 master-0 kubenswrapper[29936]: Trace[1177465454]: [1.214315837s] [1.214315837s] END Dec 05 13:40:06.552770 master-0 kubenswrapper[29936]: I1205 13:40:06.552706 29936 trace.go:236] Trace[1585900955]: "Calculate volume metrics of glance for pod openstack/glance-b46d8-default-external-api-0" (05-Dec-2025 13:40:05.500) (total time: 1052ms): Dec 05 13:40:06.552770 master-0 kubenswrapper[29936]: Trace[1585900955]: [1.052668062s] [1.052668062s] END Dec 05 13:42:23.432063 master-0 kubenswrapper[29936]: I1205 13:42:23.431857 29936 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e05f116e-3a9f-481a-8ca7-e9f4715f5d7f" containerName="galera" probeResult="failure" output="command timed out" Dec 05 13:42:23.436539 master-0 kubenswrapper[29936]: I1205 13:42:23.436068 29936 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack/openstack-galera-0" podUID="e05f116e-3a9f-481a-8ca7-e9f4715f5d7f" containerName="galera" probeResult="failure" output="command timed out" Dec 05 13:42:57.475551 master-0 kubenswrapper[29936]: I1205 13:42:57.475487 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zwvmc"] Dec 05 13:42:57.476299 master-0 kubenswrapper[29936]: E1205 13:42:57.476042 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81030cc2-a049-482c-be0c-a707851abf43" containerName="extract-content" Dec 05 13:42:57.476299 master-0 kubenswrapper[29936]: I1205 13:42:57.476062 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="81030cc2-a049-482c-be0c-a707851abf43" containerName="extract-content" Dec 05 13:42:57.476299 master-0 kubenswrapper[29936]: E1205 13:42:57.476075 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81030cc2-a049-482c-be0c-a707851abf43" containerName="extract-utilities" Dec 05 13:42:57.476299 master-0 kubenswrapper[29936]: I1205 13:42:57.476082 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="81030cc2-a049-482c-be0c-a707851abf43" containerName="extract-utilities" Dec 05 13:42:57.476299 master-0 kubenswrapper[29936]: E1205 13:42:57.476124 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9c335c-671c-4875-8e16-bb7c4db198d5" containerName="extract-utilities" Dec 05 13:42:57.476299 master-0 kubenswrapper[29936]: I1205 13:42:57.476132 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9c335c-671c-4875-8e16-bb7c4db198d5" containerName="extract-utilities" Dec 05 13:42:57.476299 master-0 kubenswrapper[29936]: E1205 13:42:57.476152 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9c335c-671c-4875-8e16-bb7c4db198d5" containerName="extract-content" Dec 05 13:42:57.476299 master-0 kubenswrapper[29936]: I1205 13:42:57.476159 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9c335c-671c-4875-8e16-bb7c4db198d5" containerName="extract-content" Dec 05 13:42:57.476562 master-0 kubenswrapper[29936]: E1205 13:42:57.476328 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259e8b75-6590-4765-b974-2f736623be89" containerName="extract-content" Dec 05 13:42:57.476562 master-0 kubenswrapper[29936]: I1205 13:42:57.476343 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="259e8b75-6590-4765-b974-2f736623be89" containerName="extract-content" Dec 05 13:42:57.476562 master-0 kubenswrapper[29936]: E1205 13:42:57.476367 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259e8b75-6590-4765-b974-2f736623be89" containerName="registry-server" Dec 05 13:42:57.476562 master-0 kubenswrapper[29936]: I1205 13:42:57.476375 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="259e8b75-6590-4765-b974-2f736623be89" containerName="registry-server" Dec 05 13:42:57.476562 master-0 kubenswrapper[29936]: E1205 13:42:57.476396 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81030cc2-a049-482c-be0c-a707851abf43" containerName="registry-server" Dec 05 13:42:57.476562 master-0 kubenswrapper[29936]: I1205 13:42:57.476404 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="81030cc2-a049-482c-be0c-a707851abf43" containerName="registry-server" Dec 05 13:42:57.476562 master-0 kubenswrapper[29936]: E1205 13:42:57.476418 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259e8b75-6590-4765-b974-2f736623be89" containerName="extract-utilities" Dec 05 
13:42:57.476562 master-0 kubenswrapper[29936]: I1205 13:42:57.476425 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="259e8b75-6590-4765-b974-2f736623be89" containerName="extract-utilities" Dec 05 13:42:57.476562 master-0 kubenswrapper[29936]: E1205 13:42:57.476447 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9c335c-671c-4875-8e16-bb7c4db198d5" containerName="registry-server" Dec 05 13:42:57.476562 master-0 kubenswrapper[29936]: I1205 13:42:57.476455 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9c335c-671c-4875-8e16-bb7c4db198d5" containerName="registry-server" Dec 05 13:42:57.477040 master-0 kubenswrapper[29936]: I1205 13:42:57.476703 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9c335c-671c-4875-8e16-bb7c4db198d5" containerName="registry-server" Dec 05 13:42:57.477040 master-0 kubenswrapper[29936]: I1205 13:42:57.476742 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="259e8b75-6590-4765-b974-2f736623be89" containerName="registry-server" Dec 05 13:42:57.477040 master-0 kubenswrapper[29936]: I1205 13:42:57.476761 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="81030cc2-a049-482c-be0c-a707851abf43" containerName="registry-server" Dec 05 13:42:57.478762 master-0 kubenswrapper[29936]: I1205 13:42:57.478721 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:42:57.517077 master-0 kubenswrapper[29936]: I1205 13:42:57.517000 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwvmc"] Dec 05 13:42:57.522927 master-0 kubenswrapper[29936]: I1205 13:42:57.520839 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-catalog-content\") pod \"community-operators-zwvmc\" (UID: \"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7\") " pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:42:57.522927 master-0 kubenswrapper[29936]: I1205 13:42:57.521017 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzzsc\" (UniqueName: \"kubernetes.io/projected/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-kube-api-access-jzzsc\") pod \"community-operators-zwvmc\" (UID: \"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7\") " pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:42:57.522927 master-0 kubenswrapper[29936]: I1205 13:42:57.521061 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-utilities\") pod \"community-operators-zwvmc\" (UID: \"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7\") " pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:42:57.623105 master-0 kubenswrapper[29936]: I1205 13:42:57.623033 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzzsc\" (UniqueName: \"kubernetes.io/projected/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-kube-api-access-jzzsc\") pod \"community-operators-zwvmc\" (UID: \"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7\") " pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:42:57.623105 master-0 kubenswrapper[29936]: I1205 13:42:57.623091 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-utilities\") pod \"community-operators-zwvmc\" (UID: \"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7\") " pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:42:57.623422 master-0 kubenswrapper[29936]: I1205 13:42:57.623269 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-catalog-content\") pod \"community-operators-zwvmc\" (UID: \"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7\") " pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:42:57.623849 master-0 kubenswrapper[29936]: I1205 13:42:57.623819 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-catalog-content\") pod \"community-operators-zwvmc\" (UID: \"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7\") " pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:42:57.625173 master-0 kubenswrapper[29936]: I1205 13:42:57.625138 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-utilities\") pod \"community-operators-zwvmc\" (UID: \"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7\") " pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:42:57.640926 master-0 kubenswrapper[29936]: I1205 13:42:57.640824 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzzsc\" (UniqueName: \"kubernetes.io/projected/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-kube-api-access-jzzsc\") pod \"community-operators-zwvmc\" (UID: \"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7\") " pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:42:57.802353 master-0 kubenswrapper[29936]: I1205 13:42:57.802215 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:42:58.371105 master-0 kubenswrapper[29936]: I1205 13:42:58.371014 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwvmc"] Dec 05 13:42:58.588266 master-0 kubenswrapper[29936]: I1205 13:42:58.587861 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwvmc" event={"ID":"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7","Type":"ContainerStarted","Data":"dd0499aff476a87133bd8be070170467e76962646630666f1909d44b78aee16b"} Dec 05 13:42:58.588266 master-0 kubenswrapper[29936]: I1205 13:42:58.587915 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwvmc" event={"ID":"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7","Type":"ContainerStarted","Data":"0ffd9d2cfd2128409def905d7c2e7c6ac0fd77380defedfea85c53883f040126"} Dec 05 13:42:59.599740 master-0 kubenswrapper[29936]: I1205 13:42:59.599660 29936 generic.go:334] "Generic (PLEG): container finished" podID="5a621ac7-bf70-4293-88ce-1ddc7dfd62d7" containerID="dd0499aff476a87133bd8be070170467e76962646630666f1909d44b78aee16b" exitCode=0 Dec 05 13:42:59.599740 master-0 kubenswrapper[29936]: I1205 13:42:59.599725 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwvmc" event={"ID":"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7","Type":"ContainerDied","Data":"dd0499aff476a87133bd8be070170467e76962646630666f1909d44b78aee16b"} Dec 05 13:42:59.601975 master-0 kubenswrapper[29936]: I1205 13:42:59.601927 29936 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 13:43:00.613607 master-0 kubenswrapper[29936]: I1205 13:43:00.613521 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwvmc" event={"ID":"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7","Type":"ContainerStarted","Data":"9eb755fa267007760f89ddb5cd45cdf11499c8db28fa2714dab145261e774ccc"} Dec 05 13:43:01.653985 master-0 kubenswrapper[29936]: I1205 13:43:01.637248 29936 generic.go:334] "Generic (PLEG): container finished" podID="5a621ac7-bf70-4293-88ce-1ddc7dfd62d7" containerID="9eb755fa267007760f89ddb5cd45cdf11499c8db28fa2714dab145261e774ccc" exitCode=0 Dec 05 13:43:01.653985 master-0 kubenswrapper[29936]: I1205 13:43:01.637334 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwvmc" event={"ID":"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7","Type":"ContainerDied","Data":"9eb755fa267007760f89ddb5cd45cdf11499c8db28fa2714dab145261e774ccc"} Dec 05 13:43:02.650476 master-0 kubenswrapper[29936]: I1205 13:43:02.650410 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwvmc" event={"ID":"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7","Type":"ContainerStarted","Data":"33ff765feb25d864816b15e1d89dff89392b9a125f0e526248baaafd55ab2e37"} Dec 05 13:43:02.669864 master-0 kubenswrapper[29936]: I1205 13:43:02.669772 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zwvmc" podStartSLOduration=3.7427050680000002 podStartE2EDuration="5.669751443s" podCreationTimestamp="2025-12-05 13:42:57 +0000 UTC" firstStartedPulling="2025-12-05 13:42:59.601878498 +0000 UTC m=+3176.733958179" lastFinishedPulling="2025-12-05 13:43:01.528924873 +0000 UTC m=+3178.661004554" observedRunningTime="2025-12-05 13:43:02.669205249 +0000 
UTC m=+3179.801284940" watchObservedRunningTime="2025-12-05 13:43:02.669751443 +0000 UTC m=+3179.801831124" Dec 05 13:43:07.802820 master-0 kubenswrapper[29936]: I1205 13:43:07.802611 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:43:07.802820 master-0 kubenswrapper[29936]: I1205 13:43:07.802689 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:43:07.849895 master-0 kubenswrapper[29936]: I1205 13:43:07.849803 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:43:08.762348 master-0 kubenswrapper[29936]: I1205 13:43:08.762264 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:43:08.976840 master-0 kubenswrapper[29936]: I1205 13:43:08.976750 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwvmc"] Dec 05 13:43:10.735697 master-0 kubenswrapper[29936]: I1205 13:43:10.735598 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zwvmc" podUID="5a621ac7-bf70-4293-88ce-1ddc7dfd62d7" containerName="registry-server" containerID="cri-o://33ff765feb25d864816b15e1d89dff89392b9a125f0e526248baaafd55ab2e37" gracePeriod=2 Dec 05 13:43:11.758654 master-0 kubenswrapper[29936]: I1205 13:43:11.758574 29936 generic.go:334] "Generic (PLEG): container finished" podID="5a621ac7-bf70-4293-88ce-1ddc7dfd62d7" containerID="33ff765feb25d864816b15e1d89dff89392b9a125f0e526248baaafd55ab2e37" exitCode=0 Dec 05 13:43:11.758654 master-0 kubenswrapper[29936]: I1205 13:43:11.758634 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwvmc" event={"ID":"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7","Type":"ContainerDied","Data":"33ff765feb25d864816b15e1d89dff89392b9a125f0e526248baaafd55ab2e37"} Dec 05 13:43:11.758654 master-0 kubenswrapper[29936]: I1205 13:43:11.758664 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwvmc" event={"ID":"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7","Type":"ContainerDied","Data":"0ffd9d2cfd2128409def905d7c2e7c6ac0fd77380defedfea85c53883f040126"} Dec 05 13:43:11.759328 master-0 kubenswrapper[29936]: I1205 13:43:11.758677 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ffd9d2cfd2128409def905d7c2e7c6ac0fd77380defedfea85c53883f040126" Dec 05 13:43:11.838745 master-0 kubenswrapper[29936]: I1205 13:43:11.838684 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:43:12.002457 master-0 kubenswrapper[29936]: I1205 13:43:11.983950 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-catalog-content\") pod \"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7\" (UID: \"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7\") " Dec 05 13:43:12.002457 master-0 kubenswrapper[29936]: I1205 13:43:11.984050 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-utilities\") pod \"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7\" (UID: \"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7\") " Dec 05 13:43:12.002457 master-0 kubenswrapper[29936]: I1205 13:43:11.984386 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzzsc\" (UniqueName: \"kubernetes.io/projected/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-kube-api-access-jzzsc\") pod \"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7\" (UID: \"5a621ac7-bf70-4293-88ce-1ddc7dfd62d7\") " Dec 05 13:43:12.002457 master-0 kubenswrapper[29936]: I1205 13:43:11.990347 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-kube-api-access-jzzsc" (OuterVolumeSpecName: "kube-api-access-jzzsc") pod "5a621ac7-bf70-4293-88ce-1ddc7dfd62d7" (UID: "5a621ac7-bf70-4293-88ce-1ddc7dfd62d7"). InnerVolumeSpecName "kube-api-access-jzzsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:43:12.002457 master-0 kubenswrapper[29936]: I1205 13:43:12.000301 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-utilities" (OuterVolumeSpecName: "utilities") pod "5a621ac7-bf70-4293-88ce-1ddc7dfd62d7" (UID: "5a621ac7-bf70-4293-88ce-1ddc7dfd62d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:43:12.046207 master-0 kubenswrapper[29936]: I1205 13:43:12.045952 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a621ac7-bf70-4293-88ce-1ddc7dfd62d7" (UID: "5a621ac7-bf70-4293-88ce-1ddc7dfd62d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:43:12.087619 master-0 kubenswrapper[29936]: I1205 13:43:12.087528 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzzsc\" (UniqueName: \"kubernetes.io/projected/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-kube-api-access-jzzsc\") on node \"master-0\" DevicePath \"\"" Dec 05 13:43:12.087619 master-0 kubenswrapper[29936]: I1205 13:43:12.087595 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 13:43:12.087619 master-0 kubenswrapper[29936]: I1205 13:43:12.087610 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 13:43:12.768891 master-0 kubenswrapper[29936]: I1205 13:43:12.768817 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwvmc" Dec 05 13:43:12.966296 master-0 kubenswrapper[29936]: I1205 13:43:12.964562 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwvmc"] Dec 05 13:43:12.980699 master-0 kubenswrapper[29936]: I1205 13:43:12.980579 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zwvmc"] Dec 05 13:43:13.204102 master-0 kubenswrapper[29936]: I1205 13:43:13.204003 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a621ac7-bf70-4293-88ce-1ddc7dfd62d7" path="/var/lib/kubelet/pods/5a621ac7-bf70-4293-88ce-1ddc7dfd62d7/volumes" Dec 05 13:44:27.159408 master-0 kubenswrapper[29936]: I1205 13:44:27.159324 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ht4bl"] Dec 05 13:44:27.160449 master-0 kubenswrapper[29936]: E1205 13:44:27.160105 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a621ac7-bf70-4293-88ce-1ddc7dfd62d7" containerName="extract-content" Dec 05 13:44:27.160449 master-0 kubenswrapper[29936]: I1205 13:44:27.160125 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a621ac7-bf70-4293-88ce-1ddc7dfd62d7" containerName="extract-content" Dec 05 13:44:27.160449 master-0 kubenswrapper[29936]: E1205 13:44:27.160151 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a621ac7-bf70-4293-88ce-1ddc7dfd62d7" containerName="registry-server" Dec 05 13:44:27.160449 master-0 kubenswrapper[29936]: I1205 13:44:27.160158 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a621ac7-bf70-4293-88ce-1ddc7dfd62d7" containerName="registry-server" Dec 05 13:44:27.160449 master-0 kubenswrapper[29936]: E1205 13:44:27.160216 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a621ac7-bf70-4293-88ce-1ddc7dfd62d7" containerName="extract-utilities" Dec 05 13:44:27.160449 master-0 kubenswrapper[29936]: I1205 13:44:27.160226 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a621ac7-bf70-4293-88ce-1ddc7dfd62d7" containerName="extract-utilities" Dec 05 13:44:27.160665 master-0 kubenswrapper[29936]: I1205 13:44:27.160474 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a621ac7-bf70-4293-88ce-1ddc7dfd62d7" containerName="registry-server" Dec 05 13:44:27.162633 master-0 kubenswrapper[29936]: I1205 13:44:27.162592 29936 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:27.175209 master-0 kubenswrapper[29936]: I1205 13:44:27.175112 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ht4bl"] Dec 05 13:44:27.257804 master-0 kubenswrapper[29936]: I1205 13:44:27.257713 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zsvx\" (UniqueName: \"kubernetes.io/projected/3ccf4371-8e48-4661-bdb3-df46e70e371c-kube-api-access-6zsvx\") pod \"redhat-marketplace-ht4bl\" (UID: \"3ccf4371-8e48-4661-bdb3-df46e70e371c\") " pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:27.258149 master-0 kubenswrapper[29936]: I1205 13:44:27.257907 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ccf4371-8e48-4661-bdb3-df46e70e371c-utilities\") pod \"redhat-marketplace-ht4bl\" (UID: \"3ccf4371-8e48-4661-bdb3-df46e70e371c\") " pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:27.258433 master-0 kubenswrapper[29936]: I1205 13:44:27.258372 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ccf4371-8e48-4661-bdb3-df46e70e371c-catalog-content\") pod \"redhat-marketplace-ht4bl\" (UID: \"3ccf4371-8e48-4661-bdb3-df46e70e371c\") " pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:27.361400 master-0 kubenswrapper[29936]: I1205 13:44:27.361321 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ccf4371-8e48-4661-bdb3-df46e70e371c-catalog-content\") pod \"redhat-marketplace-ht4bl\" (UID: \"3ccf4371-8e48-4661-bdb3-df46e70e371c\") " pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:27.361696 master-0 kubenswrapper[29936]: I1205 13:44:27.361542 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zsvx\" (UniqueName: \"kubernetes.io/projected/3ccf4371-8e48-4661-bdb3-df46e70e371c-kube-api-access-6zsvx\") pod \"redhat-marketplace-ht4bl\" (UID: \"3ccf4371-8e48-4661-bdb3-df46e70e371c\") " pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:27.361905 master-0 kubenswrapper[29936]: I1205 13:44:27.361874 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ccf4371-8e48-4661-bdb3-df46e70e371c-catalog-content\") pod \"redhat-marketplace-ht4bl\" (UID: \"3ccf4371-8e48-4661-bdb3-df46e70e371c\") " pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:27.362012 master-0 kubenswrapper[29936]: I1205 13:44:27.361891 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ccf4371-8e48-4661-bdb3-df46e70e371c-utilities\") pod \"redhat-marketplace-ht4bl\" (UID: \"3ccf4371-8e48-4661-bdb3-df46e70e371c\") " pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:27.362414 master-0 kubenswrapper[29936]: I1205 13:44:27.362388 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ccf4371-8e48-4661-bdb3-df46e70e371c-utilities\") pod \"redhat-marketplace-ht4bl\" (UID: \"3ccf4371-8e48-4661-bdb3-df46e70e371c\") " 
pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:27.389780 master-0 kubenswrapper[29936]: I1205 13:44:27.389719 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zsvx\" (UniqueName: \"kubernetes.io/projected/3ccf4371-8e48-4661-bdb3-df46e70e371c-kube-api-access-6zsvx\") pod \"redhat-marketplace-ht4bl\" (UID: \"3ccf4371-8e48-4661-bdb3-df46e70e371c\") " pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:27.484274 master-0 kubenswrapper[29936]: I1205 13:44:27.484153 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:28.076416 master-0 kubenswrapper[29936]: I1205 13:44:28.076349 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ht4bl"] Dec 05 13:44:28.082277 master-0 kubenswrapper[29936]: W1205 13:44:28.082208 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ccf4371_8e48_4661_bdb3_df46e70e371c.slice/crio-597de4726306f60026e5cf3fc9eb5e74ba411a394adeecc459f25cc02a39c1c4 WatchSource:0}: Error finding container 597de4726306f60026e5cf3fc9eb5e74ba411a394adeecc459f25cc02a39c1c4: Status 404 returned error can't find the container with id 597de4726306f60026e5cf3fc9eb5e74ba411a394adeecc459f25cc02a39c1c4 Dec 05 13:44:28.710543 master-0 kubenswrapper[29936]: I1205 13:44:28.710465 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4bl" event={"ID":"3ccf4371-8e48-4661-bdb3-df46e70e371c","Type":"ContainerStarted","Data":"597de4726306f60026e5cf3fc9eb5e74ba411a394adeecc459f25cc02a39c1c4"} Dec 05 13:44:29.722678 master-0 kubenswrapper[29936]: I1205 13:44:29.722592 29936 generic.go:334] "Generic (PLEG): container finished" podID="3ccf4371-8e48-4661-bdb3-df46e70e371c" containerID="5209316a3aa54d39258c633915e79883bc926201eb6fa2f5680df023a46af179" exitCode=0 Dec 05 13:44:29.722678 master-0 kubenswrapper[29936]: I1205 13:44:29.722680 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4bl" event={"ID":"3ccf4371-8e48-4661-bdb3-df46e70e371c","Type":"ContainerDied","Data":"5209316a3aa54d39258c633915e79883bc926201eb6fa2f5680df023a46af179"} Dec 05 13:44:31.778672 master-0 kubenswrapper[29936]: I1205 13:44:31.778601 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4bl" event={"ID":"3ccf4371-8e48-4661-bdb3-df46e70e371c","Type":"ContainerStarted","Data":"9733ab475c4311e5b26aec8e304589967e7ab1794651250f9275416b35a8fcba"} Dec 05 13:44:32.791969 master-0 kubenswrapper[29936]: I1205 13:44:32.791873 29936 generic.go:334] "Generic (PLEG): container finished" podID="3ccf4371-8e48-4661-bdb3-df46e70e371c" containerID="9733ab475c4311e5b26aec8e304589967e7ab1794651250f9275416b35a8fcba" exitCode=0 Dec 05 13:44:32.791969 master-0 kubenswrapper[29936]: I1205 13:44:32.791971 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4bl" event={"ID":"3ccf4371-8e48-4661-bdb3-df46e70e371c","Type":"ContainerDied","Data":"9733ab475c4311e5b26aec8e304589967e7ab1794651250f9275416b35a8fcba"} Dec 05 13:44:33.809088 master-0 kubenswrapper[29936]: I1205 13:44:33.809014 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4bl" 
event={"ID":"3ccf4371-8e48-4661-bdb3-df46e70e371c","Type":"ContainerStarted","Data":"832bb2edf080d6b0951e19e5ec175dca9240df31ec38fbe095d0d129affae330"} Dec 05 13:44:33.842285 master-0 kubenswrapper[29936]: I1205 13:44:33.842193 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ht4bl" podStartSLOduration=3.30664835 podStartE2EDuration="6.842148463s" podCreationTimestamp="2025-12-05 13:44:27 +0000 UTC" firstStartedPulling="2025-12-05 13:44:29.724451377 +0000 UTC m=+3266.856531058" lastFinishedPulling="2025-12-05 13:44:33.25995149 +0000 UTC m=+3270.392031171" observedRunningTime="2025-12-05 13:44:33.834931881 +0000 UTC m=+3270.967011562" watchObservedRunningTime="2025-12-05 13:44:33.842148463 +0000 UTC m=+3270.974228144" Dec 05 13:44:37.485562 master-0 kubenswrapper[29936]: I1205 13:44:37.485484 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:37.486728 master-0 kubenswrapper[29936]: I1205 13:44:37.486658 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:37.536065 master-0 kubenswrapper[29936]: I1205 13:44:37.535640 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:38.914051 master-0 kubenswrapper[29936]: I1205 13:44:38.913939 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:38.978395 master-0 kubenswrapper[29936]: I1205 13:44:38.976078 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ht4bl"] Dec 05 13:44:40.884081 master-0 kubenswrapper[29936]: I1205 13:44:40.883987 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ht4bl" podUID="3ccf4371-8e48-4661-bdb3-df46e70e371c" containerName="registry-server" containerID="cri-o://832bb2edf080d6b0951e19e5ec175dca9240df31ec38fbe095d0d129affae330" gracePeriod=2 Dec 05 13:44:41.897874 master-0 kubenswrapper[29936]: I1205 13:44:41.897794 29936 generic.go:334] "Generic (PLEG): container finished" podID="3ccf4371-8e48-4661-bdb3-df46e70e371c" containerID="832bb2edf080d6b0951e19e5ec175dca9240df31ec38fbe095d0d129affae330" exitCode=0 Dec 05 13:44:41.897874 master-0 kubenswrapper[29936]: I1205 13:44:41.897849 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4bl" event={"ID":"3ccf4371-8e48-4661-bdb3-df46e70e371c","Type":"ContainerDied","Data":"832bb2edf080d6b0951e19e5ec175dca9240df31ec38fbe095d0d129affae330"} Dec 05 13:44:41.897874 master-0 kubenswrapper[29936]: I1205 13:44:41.897875 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ht4bl" event={"ID":"3ccf4371-8e48-4661-bdb3-df46e70e371c","Type":"ContainerDied","Data":"597de4726306f60026e5cf3fc9eb5e74ba411a394adeecc459f25cc02a39c1c4"} Dec 05 13:44:41.897874 master-0 kubenswrapper[29936]: I1205 13:44:41.897887 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="597de4726306f60026e5cf3fc9eb5e74ba411a394adeecc459f25cc02a39c1c4" Dec 05 13:44:41.975116 master-0 kubenswrapper[29936]: I1205 13:44:41.975063 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:42.053324 master-0 kubenswrapper[29936]: I1205 13:44:42.053245 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zsvx\" (UniqueName: \"kubernetes.io/projected/3ccf4371-8e48-4661-bdb3-df46e70e371c-kube-api-access-6zsvx\") pod \"3ccf4371-8e48-4661-bdb3-df46e70e371c\" (UID: \"3ccf4371-8e48-4661-bdb3-df46e70e371c\") " Dec 05 13:44:42.053552 master-0 kubenswrapper[29936]: I1205 13:44:42.053525 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ccf4371-8e48-4661-bdb3-df46e70e371c-utilities\") pod \"3ccf4371-8e48-4661-bdb3-df46e70e371c\" (UID: \"3ccf4371-8e48-4661-bdb3-df46e70e371c\") " Dec 05 13:44:42.053624 master-0 kubenswrapper[29936]: I1205 13:44:42.053561 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ccf4371-8e48-4661-bdb3-df46e70e371c-catalog-content\") pod \"3ccf4371-8e48-4661-bdb3-df46e70e371c\" (UID: \"3ccf4371-8e48-4661-bdb3-df46e70e371c\") " Dec 05 13:44:42.055302 master-0 kubenswrapper[29936]: I1205 13:44:42.055249 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ccf4371-8e48-4661-bdb3-df46e70e371c-utilities" (OuterVolumeSpecName: "utilities") pod "3ccf4371-8e48-4661-bdb3-df46e70e371c" (UID: "3ccf4371-8e48-4661-bdb3-df46e70e371c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:44:42.071341 master-0 kubenswrapper[29936]: I1205 13:44:42.067557 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ccf4371-8e48-4661-bdb3-df46e70e371c-kube-api-access-6zsvx" (OuterVolumeSpecName: "kube-api-access-6zsvx") pod "3ccf4371-8e48-4661-bdb3-df46e70e371c" (UID: "3ccf4371-8e48-4661-bdb3-df46e70e371c"). InnerVolumeSpecName "kube-api-access-6zsvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:44:42.090695 master-0 kubenswrapper[29936]: I1205 13:44:42.090619 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ccf4371-8e48-4661-bdb3-df46e70e371c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ccf4371-8e48-4661-bdb3-df46e70e371c" (UID: "3ccf4371-8e48-4661-bdb3-df46e70e371c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:44:42.156559 master-0 kubenswrapper[29936]: I1205 13:44:42.156485 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ccf4371-8e48-4661-bdb3-df46e70e371c-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 13:44:42.156559 master-0 kubenswrapper[29936]: I1205 13:44:42.156544 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ccf4371-8e48-4661-bdb3-df46e70e371c-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 13:44:42.156559 master-0 kubenswrapper[29936]: I1205 13:44:42.156558 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zsvx\" (UniqueName: \"kubernetes.io/projected/3ccf4371-8e48-4661-bdb3-df46e70e371c-kube-api-access-6zsvx\") on node \"master-0\" DevicePath \"\"" Dec 05 13:44:42.915622 master-0 kubenswrapper[29936]: I1205 13:44:42.915541 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ht4bl" Dec 05 13:44:43.203958 master-0 kubenswrapper[29936]: I1205 13:44:43.203894 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ht4bl"] Dec 05 13:44:43.227712 master-0 kubenswrapper[29936]: I1205 13:44:43.227625 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ht4bl"] Dec 05 13:44:45.202659 master-0 kubenswrapper[29936]: I1205 13:44:45.202590 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ccf4371-8e48-4661-bdb3-df46e70e371c" path="/var/lib/kubelet/pods/3ccf4371-8e48-4661-bdb3-df46e70e371c/volumes" Dec 05 13:44:57.739270 master-0 kubenswrapper[29936]: I1205 13:44:57.737085 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qsn5z"] Dec 05 13:44:57.739270 master-0 kubenswrapper[29936]: E1205 13:44:57.737701 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ccf4371-8e48-4661-bdb3-df46e70e371c" containerName="extract-content" Dec 05 13:44:57.739270 master-0 kubenswrapper[29936]: I1205 13:44:57.737715 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ccf4371-8e48-4661-bdb3-df46e70e371c" containerName="extract-content" Dec 05 13:44:57.739270 master-0 kubenswrapper[29936]: E1205 13:44:57.737748 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ccf4371-8e48-4661-bdb3-df46e70e371c" containerName="registry-server" Dec 05 13:44:57.739270 master-0 kubenswrapper[29936]: I1205 13:44:57.737760 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ccf4371-8e48-4661-bdb3-df46e70e371c" containerName="registry-server" Dec 05 13:44:57.739270 master-0 kubenswrapper[29936]: E1205 13:44:57.737802 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ccf4371-8e48-4661-bdb3-df46e70e371c" containerName="extract-utilities" Dec 05 13:44:57.739270 master-0 kubenswrapper[29936]: I1205 13:44:57.737809 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ccf4371-8e48-4661-bdb3-df46e70e371c" containerName="extract-utilities" Dec 05 13:44:57.739270 master-0 kubenswrapper[29936]: I1205 13:44:57.738024 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ccf4371-8e48-4661-bdb3-df46e70e371c" containerName="registry-server" Dec 05 13:44:57.740246 master-0 kubenswrapper[29936]: I1205 13:44:57.739729 29936 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qsn5z" Dec 05 13:44:57.754745 master-0 kubenswrapper[29936]: I1205 13:44:57.754611 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qsn5z"] Dec 05 13:44:57.887276 master-0 kubenswrapper[29936]: I1205 13:44:57.887196 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/948c5cb6-f8c8-47dc-a593-5c13012969e9-catalog-content\") pod \"redhat-operators-qsn5z\" (UID: \"948c5cb6-f8c8-47dc-a593-5c13012969e9\") " pod="openshift-marketplace/redhat-operators-qsn5z" Dec 05 13:44:57.887276 master-0 kubenswrapper[29936]: I1205 13:44:57.887267 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw5ps\" (UniqueName: \"kubernetes.io/projected/948c5cb6-f8c8-47dc-a593-5c13012969e9-kube-api-access-vw5ps\") pod \"redhat-operators-qsn5z\" (UID: \"948c5cb6-f8c8-47dc-a593-5c13012969e9\") " pod="openshift-marketplace/redhat-operators-qsn5z" Dec 05 13:44:57.887566 master-0 kubenswrapper[29936]: I1205 13:44:57.887468 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/948c5cb6-f8c8-47dc-a593-5c13012969e9-utilities\") pod \"redhat-operators-qsn5z\" (UID: \"948c5cb6-f8c8-47dc-a593-5c13012969e9\") " pod="openshift-marketplace/redhat-operators-qsn5z" Dec 05 13:44:57.989703 master-0 kubenswrapper[29936]: I1205 13:44:57.989511 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/948c5cb6-f8c8-47dc-a593-5c13012969e9-utilities\") pod \"redhat-operators-qsn5z\" (UID: \"948c5cb6-f8c8-47dc-a593-5c13012969e9\") " pod="openshift-marketplace/redhat-operators-qsn5z" Dec 05 13:44:57.989703 master-0 kubenswrapper[29936]: I1205 13:44:57.989677 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/948c5cb6-f8c8-47dc-a593-5c13012969e9-catalog-content\") pod \"redhat-operators-qsn5z\" (UID: \"948c5cb6-f8c8-47dc-a593-5c13012969e9\") " pod="openshift-marketplace/redhat-operators-qsn5z" Dec 05 13:44:57.989982 master-0 kubenswrapper[29936]: I1205 13:44:57.989710 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw5ps\" (UniqueName: \"kubernetes.io/projected/948c5cb6-f8c8-47dc-a593-5c13012969e9-kube-api-access-vw5ps\") pod \"redhat-operators-qsn5z\" (UID: \"948c5cb6-f8c8-47dc-a593-5c13012969e9\") " pod="openshift-marketplace/redhat-operators-qsn5z" Dec 05 13:44:57.990677 master-0 kubenswrapper[29936]: I1205 13:44:57.990643 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/948c5cb6-f8c8-47dc-a593-5c13012969e9-catalog-content\") pod \"redhat-operators-qsn5z\" (UID: \"948c5cb6-f8c8-47dc-a593-5c13012969e9\") " pod="openshift-marketplace/redhat-operators-qsn5z" Dec 05 13:44:57.990817 master-0 kubenswrapper[29936]: I1205 13:44:57.990649 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/948c5cb6-f8c8-47dc-a593-5c13012969e9-utilities\") pod \"redhat-operators-qsn5z\" (UID: \"948c5cb6-f8c8-47dc-a593-5c13012969e9\") " pod="openshift-marketplace/redhat-operators-qsn5z" Dec 
05 13:44:58.012119 master-0 kubenswrapper[29936]: I1205 13:44:58.011998 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw5ps\" (UniqueName: \"kubernetes.io/projected/948c5cb6-f8c8-47dc-a593-5c13012969e9-kube-api-access-vw5ps\") pod \"redhat-operators-qsn5z\" (UID: \"948c5cb6-f8c8-47dc-a593-5c13012969e9\") " pod="openshift-marketplace/redhat-operators-qsn5z" Dec 05 13:44:58.071773 master-0 kubenswrapper[29936]: I1205 13:44:58.071228 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qsn5z" Dec 05 13:44:58.563550 master-0 kubenswrapper[29936]: I1205 13:44:58.562758 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qsn5z"] Dec 05 13:44:59.127985 master-0 kubenswrapper[29936]: I1205 13:44:59.127779 29936 generic.go:334] "Generic (PLEG): container finished" podID="948c5cb6-f8c8-47dc-a593-5c13012969e9" containerID="16dea82cb68218c0e5068aab2f2ff3b19f9322ccb6d936c47fa7d4919d09b34c" exitCode=0 Dec 05 13:44:59.127985 master-0 kubenswrapper[29936]: I1205 13:44:59.127842 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qsn5z" event={"ID":"948c5cb6-f8c8-47dc-a593-5c13012969e9","Type":"ContainerDied","Data":"16dea82cb68218c0e5068aab2f2ff3b19f9322ccb6d936c47fa7d4919d09b34c"} Dec 05 13:44:59.127985 master-0 kubenswrapper[29936]: I1205 13:44:59.127870 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qsn5z" event={"ID":"948c5cb6-f8c8-47dc-a593-5c13012969e9","Type":"ContainerStarted","Data":"dac0e14523b2dd42d729ad372681bd0c74b25d7f02c531a1713527a9606e2414"} Dec 05 13:45:00.185785 master-0 kubenswrapper[29936]: I1205 13:45:00.185725 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp"] Dec 05 13:45:00.187606 master-0 kubenswrapper[29936]: I1205 13:45:00.187569 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp" Dec 05 13:45:00.191800 master-0 kubenswrapper[29936]: I1205 13:45:00.191696 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 13:45:00.194172 master-0 kubenswrapper[29936]: I1205 13:45:00.194116 29936 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-rdxkm" Dec 05 13:45:00.215135 master-0 kubenswrapper[29936]: I1205 13:45:00.215061 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp"] Dec 05 13:45:00.369220 master-0 kubenswrapper[29936]: I1205 13:45:00.369005 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8k6g\" (UniqueName: \"kubernetes.io/projected/d170b02e-991d-4b19-9c33-d73ae8974b37-kube-api-access-b8k6g\") pod \"collect-profiles-29415705-pjkgp\" (UID: \"d170b02e-991d-4b19-9c33-d73ae8974b37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp" Dec 05 13:45:00.369548 master-0 kubenswrapper[29936]: I1205 13:45:00.369172 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d170b02e-991d-4b19-9c33-d73ae8974b37-config-volume\") pod \"collect-profiles-29415705-pjkgp\" (UID: \"d170b02e-991d-4b19-9c33-d73ae8974b37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp" Dec 05 13:45:00.369548 master-0 kubenswrapper[29936]: I1205 13:45:00.369495 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d170b02e-991d-4b19-9c33-d73ae8974b37-secret-volume\") pod \"collect-profiles-29415705-pjkgp\" (UID: \"d170b02e-991d-4b19-9c33-d73ae8974b37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp" Dec 05 13:45:00.472899 master-0 kubenswrapper[29936]: I1205 13:45:00.472788 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d170b02e-991d-4b19-9c33-d73ae8974b37-secret-volume\") pod \"collect-profiles-29415705-pjkgp\" (UID: \"d170b02e-991d-4b19-9c33-d73ae8974b37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp" Dec 05 13:45:00.473273 master-0 kubenswrapper[29936]: I1205 13:45:00.473006 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8k6g\" (UniqueName: \"kubernetes.io/projected/d170b02e-991d-4b19-9c33-d73ae8974b37-kube-api-access-b8k6g\") pod \"collect-profiles-29415705-pjkgp\" (UID: \"d170b02e-991d-4b19-9c33-d73ae8974b37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp" Dec 05 13:45:00.473273 master-0 kubenswrapper[29936]: I1205 13:45:00.473075 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d170b02e-991d-4b19-9c33-d73ae8974b37-config-volume\") pod \"collect-profiles-29415705-pjkgp\" (UID: \"d170b02e-991d-4b19-9c33-d73ae8974b37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp" Dec 05 13:45:00.474762 master-0 kubenswrapper[29936]: I1205 13:45:00.474686 29936 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d170b02e-991d-4b19-9c33-d73ae8974b37-config-volume\") pod \"collect-profiles-29415705-pjkgp\" (UID: \"d170b02e-991d-4b19-9c33-d73ae8974b37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp" Dec 05 13:45:00.478213 master-0 kubenswrapper[29936]: I1205 13:45:00.478100 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d170b02e-991d-4b19-9c33-d73ae8974b37-secret-volume\") pod \"collect-profiles-29415705-pjkgp\" (UID: \"d170b02e-991d-4b19-9c33-d73ae8974b37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp" Dec 05 13:45:00.496336 master-0 kubenswrapper[29936]: I1205 13:45:00.496271 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8k6g\" (UniqueName: \"kubernetes.io/projected/d170b02e-991d-4b19-9c33-d73ae8974b37-kube-api-access-b8k6g\") pod \"collect-profiles-29415705-pjkgp\" (UID: \"d170b02e-991d-4b19-9c33-d73ae8974b37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp" Dec 05 13:45:00.530156 master-0 kubenswrapper[29936]: I1205 13:45:00.530085 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp" Dec 05 13:45:01.053326 master-0 kubenswrapper[29936]: W1205 13:45:01.053158 29936 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd170b02e_991d_4b19_9c33_d73ae8974b37.slice/crio-3d7aedf70ae406a7f562f9883c2c8e942f9cfe10c99970aab01521fd92071d9b WatchSource:0}: Error finding container 3d7aedf70ae406a7f562f9883c2c8e942f9cfe10c99970aab01521fd92071d9b: Status 404 returned error can't find the container with id 3d7aedf70ae406a7f562f9883c2c8e942f9cfe10c99970aab01521fd92071d9b Dec 05 13:45:01.065275 master-0 kubenswrapper[29936]: I1205 13:45:01.065168 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp"] Dec 05 13:45:01.155372 master-0 kubenswrapper[29936]: I1205 13:45:01.155285 29936 generic.go:334] "Generic (PLEG): container finished" podID="948c5cb6-f8c8-47dc-a593-5c13012969e9" containerID="713a89a190c1b8b17a8a6480a9d7cce33f89bae63bcba622f59c3a31d7de4ec1" exitCode=0 Dec 05 13:45:01.155372 master-0 kubenswrapper[29936]: I1205 13:45:01.155374 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qsn5z" event={"ID":"948c5cb6-f8c8-47dc-a593-5c13012969e9","Type":"ContainerDied","Data":"713a89a190c1b8b17a8a6480a9d7cce33f89bae63bcba622f59c3a31d7de4ec1"} Dec 05 13:45:01.158950 master-0 kubenswrapper[29936]: I1205 13:45:01.158859 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp" event={"ID":"d170b02e-991d-4b19-9c33-d73ae8974b37","Type":"ContainerStarted","Data":"3d7aedf70ae406a7f562f9883c2c8e942f9cfe10c99970aab01521fd92071d9b"} Dec 05 13:45:02.171538 master-0 kubenswrapper[29936]: I1205 13:45:02.171476 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qsn5z" event={"ID":"948c5cb6-f8c8-47dc-a593-5c13012969e9","Type":"ContainerStarted","Data":"1963b84f7dda11d130951c4807712824b992e48d06d0bfdde7a97865bd19049d"} Dec 05 13:45:02.175108 master-0 kubenswrapper[29936]: I1205 13:45:02.174986 29936 generic.go:334] "Generic 
(PLEG): container finished" podID="d170b02e-991d-4b19-9c33-d73ae8974b37" containerID="daf1ade680aa4b97212420aebe9db21e659467673b4914ea368d919ca6ed7939" exitCode=0 Dec 05 13:45:02.175108 master-0 kubenswrapper[29936]: I1205 13:45:02.175005 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp" event={"ID":"d170b02e-991d-4b19-9c33-d73ae8974b37","Type":"ContainerDied","Data":"daf1ade680aa4b97212420aebe9db21e659467673b4914ea368d919ca6ed7939"} Dec 05 13:45:02.208287 master-0 kubenswrapper[29936]: I1205 13:45:02.205624 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qsn5z" podStartSLOduration=2.742490616 podStartE2EDuration="5.205602668s" podCreationTimestamp="2025-12-05 13:44:57 +0000 UTC" firstStartedPulling="2025-12-05 13:44:59.130125767 +0000 UTC m=+3296.262205448" lastFinishedPulling="2025-12-05 13:45:01.593237819 +0000 UTC m=+3298.725317500" observedRunningTime="2025-12-05 13:45:02.193486355 +0000 UTC m=+3299.325566046" watchObservedRunningTime="2025-12-05 13:45:02.205602668 +0000 UTC m=+3299.337682349" Dec 05 13:45:03.647195 master-0 kubenswrapper[29936]: I1205 13:45:03.647120 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp" Dec 05 13:45:03.769985 master-0 kubenswrapper[29936]: I1205 13:45:03.769902 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d170b02e-991d-4b19-9c33-d73ae8974b37-config-volume\") pod \"d170b02e-991d-4b19-9c33-d73ae8974b37\" (UID: \"d170b02e-991d-4b19-9c33-d73ae8974b37\") " Dec 05 13:45:03.770279 master-0 kubenswrapper[29936]: I1205 13:45:03.770169 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8k6g\" (UniqueName: \"kubernetes.io/projected/d170b02e-991d-4b19-9c33-d73ae8974b37-kube-api-access-b8k6g\") pod \"d170b02e-991d-4b19-9c33-d73ae8974b37\" (UID: \"d170b02e-991d-4b19-9c33-d73ae8974b37\") " Dec 05 13:45:03.770343 master-0 kubenswrapper[29936]: I1205 13:45:03.770329 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d170b02e-991d-4b19-9c33-d73ae8974b37-secret-volume\") pod \"d170b02e-991d-4b19-9c33-d73ae8974b37\" (UID: \"d170b02e-991d-4b19-9c33-d73ae8974b37\") " Dec 05 13:45:03.771388 master-0 kubenswrapper[29936]: I1205 13:45:03.771355 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d170b02e-991d-4b19-9c33-d73ae8974b37-config-volume" (OuterVolumeSpecName: "config-volume") pod "d170b02e-991d-4b19-9c33-d73ae8974b37" (UID: "d170b02e-991d-4b19-9c33-d73ae8974b37"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 13:45:03.775056 master-0 kubenswrapper[29936]: I1205 13:45:03.775018 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d170b02e-991d-4b19-9c33-d73ae8974b37-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d170b02e-991d-4b19-9c33-d73ae8974b37" (UID: "d170b02e-991d-4b19-9c33-d73ae8974b37"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 13:45:03.775406 master-0 kubenswrapper[29936]: I1205 13:45:03.775291 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d170b02e-991d-4b19-9c33-d73ae8974b37-kube-api-access-b8k6g" (OuterVolumeSpecName: "kube-api-access-b8k6g") pod "d170b02e-991d-4b19-9c33-d73ae8974b37" (UID: "d170b02e-991d-4b19-9c33-d73ae8974b37"). InnerVolumeSpecName "kube-api-access-b8k6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:45:03.873413 master-0 kubenswrapper[29936]: I1205 13:45:03.873345 29936 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d170b02e-991d-4b19-9c33-d73ae8974b37-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 05 13:45:03.873413 master-0 kubenswrapper[29936]: I1205 13:45:03.873400 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8k6g\" (UniqueName: \"kubernetes.io/projected/d170b02e-991d-4b19-9c33-d73ae8974b37-kube-api-access-b8k6g\") on node \"master-0\" DevicePath \"\"" Dec 05 13:45:03.873413 master-0 kubenswrapper[29936]: I1205 13:45:03.873413 29936 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d170b02e-991d-4b19-9c33-d73ae8974b37-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 05 13:45:04.206094 master-0 kubenswrapper[29936]: I1205 13:45:04.206021 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp" event={"ID":"d170b02e-991d-4b19-9c33-d73ae8974b37","Type":"ContainerDied","Data":"3d7aedf70ae406a7f562f9883c2c8e942f9cfe10c99970aab01521fd92071d9b"} Dec 05 13:45:04.206094 master-0 kubenswrapper[29936]: I1205 13:45:04.206096 29936 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d7aedf70ae406a7f562f9883c2c8e942f9cfe10c99970aab01521fd92071d9b" Dec 05 13:45:04.206467 master-0 kubenswrapper[29936]: I1205 13:45:04.206109 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415705-pjkgp" Dec 05 13:45:04.800355 master-0 kubenswrapper[29936]: I1205 13:45:04.800256 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz"] Dec 05 13:45:04.810376 master-0 kubenswrapper[29936]: I1205 13:45:04.810299 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415660-dfpmz"] Dec 05 13:45:05.219934 master-0 kubenswrapper[29936]: I1205 13:45:05.219876 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1043a2e5-6dbc-42aa-96ed-1b46a6b5484f" path="/var/lib/kubelet/pods/1043a2e5-6dbc-42aa-96ed-1b46a6b5484f/volumes" Dec 05 13:45:08.071835 master-0 kubenswrapper[29936]: I1205 13:45:08.071753 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qsn5z" Dec 05 13:45:08.073346 master-0 kubenswrapper[29936]: I1205 13:45:08.073293 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qsn5z" Dec 05 13:45:08.128275 master-0 kubenswrapper[29936]: I1205 13:45:08.128200 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qsn5z" Dec 05 13:45:08.308505 master-0 kubenswrapper[29936]: I1205 13:45:08.308438 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qsn5z" Dec 05 13:45:08.382545 master-0 kubenswrapper[29936]: I1205 13:45:08.382411 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qsn5z"] Dec 05 13:45:10.285729 master-0 kubenswrapper[29936]: I1205 13:45:10.285648 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qsn5z" podUID="948c5cb6-f8c8-47dc-a593-5c13012969e9" containerName="registry-server" containerID="cri-o://1963b84f7dda11d130951c4807712824b992e48d06d0bfdde7a97865bd19049d" gracePeriod=2 Dec 05 13:45:11.299832 master-0 kubenswrapper[29936]: I1205 13:45:11.299760 29936 generic.go:334] "Generic (PLEG): container finished" podID="948c5cb6-f8c8-47dc-a593-5c13012969e9" containerID="1963b84f7dda11d130951c4807712824b992e48d06d0bfdde7a97865bd19049d" exitCode=0 Dec 05 13:45:11.299832 master-0 kubenswrapper[29936]: I1205 13:45:11.299821 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qsn5z" event={"ID":"948c5cb6-f8c8-47dc-a593-5c13012969e9","Type":"ContainerDied","Data":"1963b84f7dda11d130951c4807712824b992e48d06d0bfdde7a97865bd19049d"} Dec 05 13:45:12.040117 master-0 kubenswrapper[29936]: I1205 13:45:12.040064 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qsn5z" Dec 05 13:45:12.189611 master-0 kubenswrapper[29936]: I1205 13:45:12.189488 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw5ps\" (UniqueName: \"kubernetes.io/projected/948c5cb6-f8c8-47dc-a593-5c13012969e9-kube-api-access-vw5ps\") pod \"948c5cb6-f8c8-47dc-a593-5c13012969e9\" (UID: \"948c5cb6-f8c8-47dc-a593-5c13012969e9\") " Dec 05 13:45:12.189926 master-0 kubenswrapper[29936]: I1205 13:45:12.189893 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/948c5cb6-f8c8-47dc-a593-5c13012969e9-utilities\") pod \"948c5cb6-f8c8-47dc-a593-5c13012969e9\" (UID: \"948c5cb6-f8c8-47dc-a593-5c13012969e9\") " Dec 05 13:45:12.190041 master-0 kubenswrapper[29936]: I1205 13:45:12.190014 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/948c5cb6-f8c8-47dc-a593-5c13012969e9-catalog-content\") pod \"948c5cb6-f8c8-47dc-a593-5c13012969e9\" (UID: \"948c5cb6-f8c8-47dc-a593-5c13012969e9\") " Dec 05 13:45:12.190869 master-0 kubenswrapper[29936]: I1205 13:45:12.190812 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/948c5cb6-f8c8-47dc-a593-5c13012969e9-utilities" (OuterVolumeSpecName: "utilities") pod "948c5cb6-f8c8-47dc-a593-5c13012969e9" (UID: "948c5cb6-f8c8-47dc-a593-5c13012969e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:45:12.194204 master-0 kubenswrapper[29936]: I1205 13:45:12.194139 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/948c5cb6-f8c8-47dc-a593-5c13012969e9-kube-api-access-vw5ps" (OuterVolumeSpecName: "kube-api-access-vw5ps") pod "948c5cb6-f8c8-47dc-a593-5c13012969e9" (UID: "948c5cb6-f8c8-47dc-a593-5c13012969e9"). InnerVolumeSpecName "kube-api-access-vw5ps". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:45:12.292481 master-0 kubenswrapper[29936]: I1205 13:45:12.292410 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/948c5cb6-f8c8-47dc-a593-5c13012969e9-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 13:45:12.292481 master-0 kubenswrapper[29936]: I1205 13:45:12.292453 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw5ps\" (UniqueName: \"kubernetes.io/projected/948c5cb6-f8c8-47dc-a593-5c13012969e9-kube-api-access-vw5ps\") on node \"master-0\" DevicePath \"\"" Dec 05 13:45:12.319316 master-0 kubenswrapper[29936]: I1205 13:45:12.319250 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qsn5z" event={"ID":"948c5cb6-f8c8-47dc-a593-5c13012969e9","Type":"ContainerDied","Data":"dac0e14523b2dd42d729ad372681bd0c74b25d7f02c531a1713527a9606e2414"} Dec 05 13:45:12.319316 master-0 kubenswrapper[29936]: I1205 13:45:12.319306 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qsn5z" Dec 05 13:45:12.319886 master-0 kubenswrapper[29936]: I1205 13:45:12.319338 29936 scope.go:117] "RemoveContainer" containerID="1963b84f7dda11d130951c4807712824b992e48d06d0bfdde7a97865bd19049d" Dec 05 13:45:12.340363 master-0 kubenswrapper[29936]: I1205 13:45:12.340318 29936 scope.go:117] "RemoveContainer" containerID="713a89a190c1b8b17a8a6480a9d7cce33f89bae63bcba622f59c3a31d7de4ec1" Dec 05 13:45:12.367020 master-0 kubenswrapper[29936]: I1205 13:45:12.366969 29936 scope.go:117] "RemoveContainer" containerID="16dea82cb68218c0e5068aab2f2ff3b19f9322ccb6d936c47fa7d4919d09b34c" Dec 05 13:45:13.087482 master-0 kubenswrapper[29936]: I1205 13:45:13.087391 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/948c5cb6-f8c8-47dc-a593-5c13012969e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "948c5cb6-f8c8-47dc-a593-5c13012969e9" (UID: "948c5cb6-f8c8-47dc-a593-5c13012969e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:45:13.115016 master-0 kubenswrapper[29936]: I1205 13:45:13.114940 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/948c5cb6-f8c8-47dc-a593-5c13012969e9-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 13:45:13.381230 master-0 kubenswrapper[29936]: I1205 13:45:13.381049 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qsn5z"] Dec 05 13:45:13.396112 master-0 kubenswrapper[29936]: I1205 13:45:13.396038 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qsn5z"] Dec 05 13:45:15.199565 master-0 kubenswrapper[29936]: I1205 13:45:15.199505 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="948c5cb6-f8c8-47dc-a593-5c13012969e9" path="/var/lib/kubelet/pods/948c5cb6-f8c8-47dc-a593-5c13012969e9/volumes" Dec 05 13:45:20.307452 master-0 kubenswrapper[29936]: I1205 13:45:20.307332 29936 scope.go:117] "RemoveContainer" containerID="dc0fca244ceb67d41e3f8bc84f55bf3d37a32dd711c9991168b5d3972cd3da3d" Dec 05 13:45:26.861575 master-0 kubenswrapper[29936]: I1205 13:45:26.861503 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tbrhv"] Dec 05 13:45:26.862314 master-0 kubenswrapper[29936]: E1205 13:45:26.862037 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948c5cb6-f8c8-47dc-a593-5c13012969e9" containerName="extract-content" Dec 05 13:45:26.862314 master-0 kubenswrapper[29936]: I1205 13:45:26.862052 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="948c5cb6-f8c8-47dc-a593-5c13012969e9" containerName="extract-content" Dec 05 13:45:26.862314 master-0 kubenswrapper[29936]: E1205 13:45:26.862072 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948c5cb6-f8c8-47dc-a593-5c13012969e9" containerName="registry-server" Dec 05 13:45:26.862314 master-0 kubenswrapper[29936]: I1205 13:45:26.862080 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="948c5cb6-f8c8-47dc-a593-5c13012969e9" containerName="registry-server" Dec 05 13:45:26.862314 master-0 kubenswrapper[29936]: E1205 13:45:26.862116 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d170b02e-991d-4b19-9c33-d73ae8974b37" containerName="collect-profiles" Dec 05 13:45:26.862314 master-0 kubenswrapper[29936]: I1205 
13:45:26.862125 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="d170b02e-991d-4b19-9c33-d73ae8974b37" containerName="collect-profiles" Dec 05 13:45:26.862314 master-0 kubenswrapper[29936]: E1205 13:45:26.862139 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948c5cb6-f8c8-47dc-a593-5c13012969e9" containerName="extract-utilities" Dec 05 13:45:26.862314 master-0 kubenswrapper[29936]: I1205 13:45:26.862145 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="948c5cb6-f8c8-47dc-a593-5c13012969e9" containerName="extract-utilities" Dec 05 13:45:26.862664 master-0 kubenswrapper[29936]: I1205 13:45:26.862609 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="948c5cb6-f8c8-47dc-a593-5c13012969e9" containerName="registry-server" Dec 05 13:45:26.862708 master-0 kubenswrapper[29936]: I1205 13:45:26.862674 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="d170b02e-991d-4b19-9c33-d73ae8974b37" containerName="collect-profiles" Dec 05 13:45:26.865083 master-0 kubenswrapper[29936]: I1205 13:45:26.865053 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:26.937590 master-0 kubenswrapper[29936]: I1205 13:45:26.937521 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tbrhv"] Dec 05 13:45:26.965954 master-0 kubenswrapper[29936]: I1205 13:45:26.965869 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzmjm\" (UniqueName: \"kubernetes.io/projected/6d128665-a885-48e8-8e58-ea6b19cd3d37-kube-api-access-gzmjm\") pod \"certified-operators-tbrhv\" (UID: \"6d128665-a885-48e8-8e58-ea6b19cd3d37\") " pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:26.966213 master-0 kubenswrapper[29936]: I1205 13:45:26.966062 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d128665-a885-48e8-8e58-ea6b19cd3d37-catalog-content\") pod \"certified-operators-tbrhv\" (UID: \"6d128665-a885-48e8-8e58-ea6b19cd3d37\") " pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:26.966213 master-0 kubenswrapper[29936]: I1205 13:45:26.966097 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d128665-a885-48e8-8e58-ea6b19cd3d37-utilities\") pod \"certified-operators-tbrhv\" (UID: \"6d128665-a885-48e8-8e58-ea6b19cd3d37\") " pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:27.069576 master-0 kubenswrapper[29936]: I1205 13:45:27.069514 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzmjm\" (UniqueName: \"kubernetes.io/projected/6d128665-a885-48e8-8e58-ea6b19cd3d37-kube-api-access-gzmjm\") pod \"certified-operators-tbrhv\" (UID: \"6d128665-a885-48e8-8e58-ea6b19cd3d37\") " pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:27.070049 master-0 kubenswrapper[29936]: I1205 13:45:27.070030 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d128665-a885-48e8-8e58-ea6b19cd3d37-catalog-content\") pod \"certified-operators-tbrhv\" (UID: \"6d128665-a885-48e8-8e58-ea6b19cd3d37\") " pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:27.070153 
master-0 kubenswrapper[29936]: I1205 13:45:27.070138 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d128665-a885-48e8-8e58-ea6b19cd3d37-utilities\") pod \"certified-operators-tbrhv\" (UID: \"6d128665-a885-48e8-8e58-ea6b19cd3d37\") " pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:27.070646 master-0 kubenswrapper[29936]: I1205 13:45:27.070590 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d128665-a885-48e8-8e58-ea6b19cd3d37-catalog-content\") pod \"certified-operators-tbrhv\" (UID: \"6d128665-a885-48e8-8e58-ea6b19cd3d37\") " pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:27.070732 master-0 kubenswrapper[29936]: I1205 13:45:27.070668 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d128665-a885-48e8-8e58-ea6b19cd3d37-utilities\") pod \"certified-operators-tbrhv\" (UID: \"6d128665-a885-48e8-8e58-ea6b19cd3d37\") " pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:27.113989 master-0 kubenswrapper[29936]: I1205 13:45:27.113829 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzmjm\" (UniqueName: \"kubernetes.io/projected/6d128665-a885-48e8-8e58-ea6b19cd3d37-kube-api-access-gzmjm\") pod \"certified-operators-tbrhv\" (UID: \"6d128665-a885-48e8-8e58-ea6b19cd3d37\") " pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:27.185459 master-0 kubenswrapper[29936]: I1205 13:45:27.184643 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:27.722309 master-0 kubenswrapper[29936]: I1205 13:45:27.722236 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tbrhv"] Dec 05 13:45:28.217123 master-0 kubenswrapper[29936]: E1205 13:45:28.217052 29936 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d128665_a885_48e8_8e58_ea6b19cd3d37.slice/crio-conmon-3425e44944d50121260f78f43fa29f5343dd79afb89aa44375c11eafaf27468f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d128665_a885_48e8_8e58_ea6b19cd3d37.slice/crio-3425e44944d50121260f78f43fa29f5343dd79afb89aa44375c11eafaf27468f.scope\": RecentStats: unable to find data in memory cache]" Dec 05 13:45:28.567038 master-0 kubenswrapper[29936]: I1205 13:45:28.566914 29936 generic.go:334] "Generic (PLEG): container finished" podID="6d128665-a885-48e8-8e58-ea6b19cd3d37" containerID="3425e44944d50121260f78f43fa29f5343dd79afb89aa44375c11eafaf27468f" exitCode=0 Dec 05 13:45:28.567282 master-0 kubenswrapper[29936]: I1205 13:45:28.567014 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbrhv" event={"ID":"6d128665-a885-48e8-8e58-ea6b19cd3d37","Type":"ContainerDied","Data":"3425e44944d50121260f78f43fa29f5343dd79afb89aa44375c11eafaf27468f"} Dec 05 13:45:28.567407 master-0 kubenswrapper[29936]: I1205 13:45:28.567385 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbrhv" 
event={"ID":"6d128665-a885-48e8-8e58-ea6b19cd3d37","Type":"ContainerStarted","Data":"1125b24983327d80da3d25b448a178e8c946ef6ba6d86546ea4c251fb64a3458"} Dec 05 13:45:29.580275 master-0 kubenswrapper[29936]: I1205 13:45:29.580211 29936 generic.go:334] "Generic (PLEG): container finished" podID="6d128665-a885-48e8-8e58-ea6b19cd3d37" containerID="16e03a600d75966110840e57d9de7fbbb59c8d4c59029fa8608896207b9ed6a7" exitCode=0 Dec 05 13:45:29.580275 master-0 kubenswrapper[29936]: I1205 13:45:29.580269 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbrhv" event={"ID":"6d128665-a885-48e8-8e58-ea6b19cd3d37","Type":"ContainerDied","Data":"16e03a600d75966110840e57d9de7fbbb59c8d4c59029fa8608896207b9ed6a7"} Dec 05 13:45:30.593854 master-0 kubenswrapper[29936]: I1205 13:45:30.593792 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbrhv" event={"ID":"6d128665-a885-48e8-8e58-ea6b19cd3d37","Type":"ContainerStarted","Data":"ad46e67f860fd5b8fd4801d5a225eb66657be24d1e1c13a374c583f856de1382"} Dec 05 13:45:30.634866 master-0 kubenswrapper[29936]: I1205 13:45:30.634737 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tbrhv" podStartSLOduration=3.198225376 podStartE2EDuration="4.634709445s" podCreationTimestamp="2025-12-05 13:45:26 +0000 UTC" firstStartedPulling="2025-12-05 13:45:28.569652653 +0000 UTC m=+3325.701732334" lastFinishedPulling="2025-12-05 13:45:30.006136722 +0000 UTC m=+3327.138216403" observedRunningTime="2025-12-05 13:45:30.629889596 +0000 UTC m=+3327.761969287" watchObservedRunningTime="2025-12-05 13:45:30.634709445 +0000 UTC m=+3327.766789146" Dec 05 13:45:33.797353 master-0 kubenswrapper[29936]: I1205 13:45:33.797246 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k65dw/must-gather-2j7nc"] Dec 05 13:45:33.800795 master-0 kubenswrapper[29936]: I1205 13:45:33.800224 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k65dw/must-gather-2j7nc" Dec 05 13:45:33.803852 master-0 kubenswrapper[29936]: I1205 13:45:33.803265 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-k65dw"/"kube-root-ca.crt" Dec 05 13:45:33.806369 master-0 kubenswrapper[29936]: I1205 13:45:33.806067 29936 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-k65dw"/"openshift-service-ca.crt" Dec 05 13:45:33.808370 master-0 kubenswrapper[29936]: I1205 13:45:33.807562 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k65dw/must-gather-r5sc9"] Dec 05 13:45:33.809780 master-0 kubenswrapper[29936]: I1205 13:45:33.809749 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k65dw/must-gather-r5sc9" Dec 05 13:45:33.817043 master-0 kubenswrapper[29936]: I1205 13:45:33.816956 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k65dw/must-gather-2j7nc"] Dec 05 13:45:33.885923 master-0 kubenswrapper[29936]: I1205 13:45:33.885835 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f7741ec5-bd8c-4e7b-9987-77dde81d8f4d-must-gather-output\") pod \"must-gather-r5sc9\" (UID: \"f7741ec5-bd8c-4e7b-9987-77dde81d8f4d\") " pod="openshift-must-gather-k65dw/must-gather-r5sc9" Dec 05 13:45:33.886209 master-0 kubenswrapper[29936]: I1205 13:45:33.885983 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5c00167c-afc3-4c63-81bc-a077fe0fb7ae-must-gather-output\") pod \"must-gather-2j7nc\" (UID: \"5c00167c-afc3-4c63-81bc-a077fe0fb7ae\") " pod="openshift-must-gather-k65dw/must-gather-2j7nc" Dec 05 13:45:33.886784 master-0 kubenswrapper[29936]: I1205 13:45:33.886715 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf9lw\" (UniqueName: \"kubernetes.io/projected/f7741ec5-bd8c-4e7b-9987-77dde81d8f4d-kube-api-access-kf9lw\") pod \"must-gather-r5sc9\" (UID: \"f7741ec5-bd8c-4e7b-9987-77dde81d8f4d\") " pod="openshift-must-gather-k65dw/must-gather-r5sc9" Dec 05 13:45:33.886857 master-0 kubenswrapper[29936]: I1205 13:45:33.886824 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ffwm\" (UniqueName: \"kubernetes.io/projected/5c00167c-afc3-4c63-81bc-a077fe0fb7ae-kube-api-access-6ffwm\") pod \"must-gather-2j7nc\" (UID: \"5c00167c-afc3-4c63-81bc-a077fe0fb7ae\") " pod="openshift-must-gather-k65dw/must-gather-2j7nc" Dec 05 13:45:33.988445 master-0 kubenswrapper[29936]: I1205 13:45:33.988364 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf9lw\" (UniqueName: \"kubernetes.io/projected/f7741ec5-bd8c-4e7b-9987-77dde81d8f4d-kube-api-access-kf9lw\") pod \"must-gather-r5sc9\" (UID: \"f7741ec5-bd8c-4e7b-9987-77dde81d8f4d\") " pod="openshift-must-gather-k65dw/must-gather-r5sc9" Dec 05 13:45:33.988445 master-0 kubenswrapper[29936]: I1205 13:45:33.988421 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ffwm\" (UniqueName: \"kubernetes.io/projected/5c00167c-afc3-4c63-81bc-a077fe0fb7ae-kube-api-access-6ffwm\") pod \"must-gather-2j7nc\" (UID: \"5c00167c-afc3-4c63-81bc-a077fe0fb7ae\") " pod="openshift-must-gather-k65dw/must-gather-2j7nc" Dec 05 13:45:33.988758 master-0 kubenswrapper[29936]: I1205 13:45:33.988624 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f7741ec5-bd8c-4e7b-9987-77dde81d8f4d-must-gather-output\") pod \"must-gather-r5sc9\" (UID: \"f7741ec5-bd8c-4e7b-9987-77dde81d8f4d\") " pod="openshift-must-gather-k65dw/must-gather-r5sc9" Dec 05 13:45:33.988888 master-0 kubenswrapper[29936]: I1205 13:45:33.988862 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5c00167c-afc3-4c63-81bc-a077fe0fb7ae-must-gather-output\") pod \"must-gather-2j7nc\" (UID: 
\"5c00167c-afc3-4c63-81bc-a077fe0fb7ae\") " pod="openshift-must-gather-k65dw/must-gather-2j7nc" Dec 05 13:45:33.989157 master-0 kubenswrapper[29936]: I1205 13:45:33.989112 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f7741ec5-bd8c-4e7b-9987-77dde81d8f4d-must-gather-output\") pod \"must-gather-r5sc9\" (UID: \"f7741ec5-bd8c-4e7b-9987-77dde81d8f4d\") " pod="openshift-must-gather-k65dw/must-gather-r5sc9" Dec 05 13:45:33.989262 master-0 kubenswrapper[29936]: I1205 13:45:33.989239 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5c00167c-afc3-4c63-81bc-a077fe0fb7ae-must-gather-output\") pod \"must-gather-2j7nc\" (UID: \"5c00167c-afc3-4c63-81bc-a077fe0fb7ae\") " pod="openshift-must-gather-k65dw/must-gather-2j7nc" Dec 05 13:45:34.046415 master-0 kubenswrapper[29936]: I1205 13:45:34.046297 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k65dw/must-gather-r5sc9"] Dec 05 13:45:34.096385 master-0 kubenswrapper[29936]: I1205 13:45:34.096008 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf9lw\" (UniqueName: \"kubernetes.io/projected/f7741ec5-bd8c-4e7b-9987-77dde81d8f4d-kube-api-access-kf9lw\") pod \"must-gather-r5sc9\" (UID: \"f7741ec5-bd8c-4e7b-9987-77dde81d8f4d\") " pod="openshift-must-gather-k65dw/must-gather-r5sc9" Dec 05 13:45:34.112658 master-0 kubenswrapper[29936]: I1205 13:45:34.112592 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ffwm\" (UniqueName: \"kubernetes.io/projected/5c00167c-afc3-4c63-81bc-a077fe0fb7ae-kube-api-access-6ffwm\") pod \"must-gather-2j7nc\" (UID: \"5c00167c-afc3-4c63-81bc-a077fe0fb7ae\") " pod="openshift-must-gather-k65dw/must-gather-2j7nc" Dec 05 13:45:34.127894 master-0 kubenswrapper[29936]: I1205 13:45:34.127832 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k65dw/must-gather-2j7nc" Dec 05 13:45:34.137571 master-0 kubenswrapper[29936]: I1205 13:45:34.137512 29936 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k65dw/must-gather-r5sc9" Dec 05 13:45:34.935789 master-0 kubenswrapper[29936]: I1205 13:45:34.935580 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k65dw/must-gather-r5sc9"] Dec 05 13:45:34.968136 master-0 kubenswrapper[29936]: I1205 13:45:34.968053 29936 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k65dw/must-gather-2j7nc"] Dec 05 13:45:35.653850 master-0 kubenswrapper[29936]: I1205 13:45:35.653733 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k65dw/must-gather-2j7nc" event={"ID":"5c00167c-afc3-4c63-81bc-a077fe0fb7ae","Type":"ContainerStarted","Data":"d02aac92dc08c90ea5df6ea560487683e1c4631acacc551d78014dd77b30868b"} Dec 05 13:45:35.654842 master-0 kubenswrapper[29936]: I1205 13:45:35.654806 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k65dw/must-gather-r5sc9" event={"ID":"f7741ec5-bd8c-4e7b-9987-77dde81d8f4d","Type":"ContainerStarted","Data":"67c9825225e7cabe965dc29da8bd648f231d8f8c8f5a1170b8e4b7f4b466b8e8"} Dec 05 13:45:37.201040 master-0 kubenswrapper[29936]: I1205 13:45:37.200981 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:37.201040 master-0 kubenswrapper[29936]: I1205 13:45:37.201036 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:37.248262 master-0 kubenswrapper[29936]: I1205 13:45:37.248140 29936 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:37.749473 master-0 kubenswrapper[29936]: I1205 13:45:37.749420 29936 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:41.410109 master-0 kubenswrapper[29936]: I1205 13:45:41.409974 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tbrhv"] Dec 05 13:45:41.411087 master-0 kubenswrapper[29936]: I1205 13:45:41.410371 29936 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tbrhv" podUID="6d128665-a885-48e8-8e58-ea6b19cd3d37" containerName="registry-server" containerID="cri-o://ad46e67f860fd5b8fd4801d5a225eb66657be24d1e1c13a374c583f856de1382" gracePeriod=2 Dec 05 13:45:41.745495 master-0 kubenswrapper[29936]: I1205 13:45:41.745449 29936 generic.go:334] "Generic (PLEG): container finished" podID="6d128665-a885-48e8-8e58-ea6b19cd3d37" containerID="ad46e67f860fd5b8fd4801d5a225eb66657be24d1e1c13a374c583f856de1382" exitCode=0 Dec 05 13:45:41.745762 master-0 kubenswrapper[29936]: I1205 13:45:41.745500 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbrhv" event={"ID":"6d128665-a885-48e8-8e58-ea6b19cd3d37","Type":"ContainerDied","Data":"ad46e67f860fd5b8fd4801d5a225eb66657be24d1e1c13a374c583f856de1382"} Dec 05 13:45:42.225776 master-0 kubenswrapper[29936]: I1205 13:45:42.225593 29936 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:42.370007 master-0 kubenswrapper[29936]: I1205 13:45:42.369854 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d128665-a885-48e8-8e58-ea6b19cd3d37-utilities\") pod \"6d128665-a885-48e8-8e58-ea6b19cd3d37\" (UID: \"6d128665-a885-48e8-8e58-ea6b19cd3d37\") " Dec 05 13:45:42.370007 master-0 kubenswrapper[29936]: I1205 13:45:42.369966 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d128665-a885-48e8-8e58-ea6b19cd3d37-catalog-content\") pod \"6d128665-a885-48e8-8e58-ea6b19cd3d37\" (UID: \"6d128665-a885-48e8-8e58-ea6b19cd3d37\") " Dec 05 13:45:42.370603 master-0 kubenswrapper[29936]: I1205 13:45:42.370572 29936 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzmjm\" (UniqueName: \"kubernetes.io/projected/6d128665-a885-48e8-8e58-ea6b19cd3d37-kube-api-access-gzmjm\") pod \"6d128665-a885-48e8-8e58-ea6b19cd3d37\" (UID: \"6d128665-a885-48e8-8e58-ea6b19cd3d37\") " Dec 05 13:45:42.371715 master-0 kubenswrapper[29936]: I1205 13:45:42.371657 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d128665-a885-48e8-8e58-ea6b19cd3d37-utilities" (OuterVolumeSpecName: "utilities") pod "6d128665-a885-48e8-8e58-ea6b19cd3d37" (UID: "6d128665-a885-48e8-8e58-ea6b19cd3d37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:45:42.374940 master-0 kubenswrapper[29936]: I1205 13:45:42.374899 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d128665-a885-48e8-8e58-ea6b19cd3d37-kube-api-access-gzmjm" (OuterVolumeSpecName: "kube-api-access-gzmjm") pod "6d128665-a885-48e8-8e58-ea6b19cd3d37" (UID: "6d128665-a885-48e8-8e58-ea6b19cd3d37"). InnerVolumeSpecName "kube-api-access-gzmjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 13:45:42.415845 master-0 kubenswrapper[29936]: I1205 13:45:42.415766 29936 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d128665-a885-48e8-8e58-ea6b19cd3d37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d128665-a885-48e8-8e58-ea6b19cd3d37" (UID: "6d128665-a885-48e8-8e58-ea6b19cd3d37"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 13:45:42.474261 master-0 kubenswrapper[29936]: I1205 13:45:42.473800 29936 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzmjm\" (UniqueName: \"kubernetes.io/projected/6d128665-a885-48e8-8e58-ea6b19cd3d37-kube-api-access-gzmjm\") on node \"master-0\" DevicePath \"\"" Dec 05 13:45:42.474261 master-0 kubenswrapper[29936]: I1205 13:45:42.473896 29936 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d128665-a885-48e8-8e58-ea6b19cd3d37-utilities\") on node \"master-0\" DevicePath \"\"" Dec 05 13:45:42.474261 master-0 kubenswrapper[29936]: I1205 13:45:42.473913 29936 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d128665-a885-48e8-8e58-ea6b19cd3d37-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 05 13:45:42.761734 master-0 kubenswrapper[29936]: I1205 13:45:42.761601 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k65dw/must-gather-2j7nc" event={"ID":"5c00167c-afc3-4c63-81bc-a077fe0fb7ae","Type":"ContainerStarted","Data":"8ba8e7c8025f89a1cd880864c646ec3662abcf2f930e77e01af183f246946c73"} Dec 05 13:45:42.765098 master-0 kubenswrapper[29936]: I1205 13:45:42.764987 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tbrhv" event={"ID":"6d128665-a885-48e8-8e58-ea6b19cd3d37","Type":"ContainerDied","Data":"1125b24983327d80da3d25b448a178e8c946ef6ba6d86546ea4c251fb64a3458"} Dec 05 13:45:42.765098 master-0 kubenswrapper[29936]: I1205 13:45:42.765057 29936 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tbrhv" Dec 05 13:45:42.765238 master-0 kubenswrapper[29936]: I1205 13:45:42.765094 29936 scope.go:117] "RemoveContainer" containerID="ad46e67f860fd5b8fd4801d5a225eb66657be24d1e1c13a374c583f856de1382" Dec 05 13:45:42.790862 master-0 kubenswrapper[29936]: I1205 13:45:42.790809 29936 scope.go:117] "RemoveContainer" containerID="16e03a600d75966110840e57d9de7fbbb59c8d4c59029fa8608896207b9ed6a7" Dec 05 13:45:42.827456 master-0 kubenswrapper[29936]: I1205 13:45:42.827407 29936 scope.go:117] "RemoveContainer" containerID="3425e44944d50121260f78f43fa29f5343dd79afb89aa44375c11eafaf27468f" Dec 05 13:45:45.645011 master-0 kubenswrapper[29936]: I1205 13:45:45.644242 29936 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tbrhv"] Dec 05 13:45:45.805365 master-0 kubenswrapper[29936]: I1205 13:45:45.805293 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k65dw/must-gather-r5sc9" event={"ID":"f7741ec5-bd8c-4e7b-9987-77dde81d8f4d","Type":"ContainerStarted","Data":"f9045d6f5b1538e05e5d1fdb89bc1e375fee4c281e5aafe4c1609758fb93e707"} Dec 05 13:45:45.808301 master-0 kubenswrapper[29936]: I1205 13:45:45.808258 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k65dw/must-gather-2j7nc" event={"ID":"5c00167c-afc3-4c63-81bc-a077fe0fb7ae","Type":"ContainerStarted","Data":"5fc00dd1ae706cdbec3a6409c0d1afae9ec59305d60b28604a7482466e9a5929"} Dec 05 13:45:45.936278 master-0 kubenswrapper[29936]: I1205 13:45:45.936200 29936 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tbrhv"] Dec 05 13:45:46.575802 master-0 kubenswrapper[29936]: I1205 13:45:46.573886 29936 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-must-gather-k65dw/must-gather-2j7nc" podStartSLOduration=6.712360962 podStartE2EDuration="13.573867169s" podCreationTimestamp="2025-12-05 13:45:33 +0000 UTC" firstStartedPulling="2025-12-05 13:45:34.938740527 +0000 UTC m=+3332.070820208" lastFinishedPulling="2025-12-05 13:45:41.800246734 +0000 UTC m=+3338.932326415" observedRunningTime="2025-12-05 13:45:46.572884783 +0000 UTC m=+3343.704964474" watchObservedRunningTime="2025-12-05 13:45:46.573867169 +0000 UTC m=+3343.705946870" Dec 05 13:45:46.827101 master-0 kubenswrapper[29936]: I1205 13:45:46.826796 29936 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k65dw/must-gather-r5sc9" event={"ID":"f7741ec5-bd8c-4e7b-9987-77dde81d8f4d","Type":"ContainerStarted","Data":"32e21f67b1ecc88f4fb8598d8f8f218feea2deee5133cc579769e75a5f62668c"} Dec 05 13:45:47.203196 master-0 kubenswrapper[29936]: I1205 13:45:47.203107 29936 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d128665-a885-48e8-8e58-ea6b19cd3d37" path="/var/lib/kubelet/pods/6d128665-a885-48e8-8e58-ea6b19cd3d37/volumes" Dec 05 13:45:48.846713 master-0 kubenswrapper[29936]: I1205 13:45:48.846658 29936 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-6d5d5dcc89-gktn5_7d0792bf-e2da-4ee7-91fe-032299cea42f/cluster-version-operator/0.log" Dec 05 13:45:52.630289 master-0 kubenswrapper[29936]: I1205 13:45:52.620366 29936 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k65dw/must-gather-r5sc9" podStartSLOduration=12.779306879 podStartE2EDuration="19.620346289s" podCreationTimestamp="2025-12-05 13:45:33 +0000 UTC" firstStartedPulling="2025-12-05 13:45:34.919354329 +0000 UTC m=+3332.051434010" lastFinishedPulling="2025-12-05 13:45:41.760393739 +0000 UTC m=+3338.892473420" observedRunningTime="2025-12-05 13:45:46.854679125 +0000 UTC m=+3343.986758816" watchObservedRunningTime="2025-12-05 13:45:52.620346289 +0000 UTC m=+3349.752425970" Dec 05 13:45:52.637449 master-0 kubenswrapper[29936]: I1205 13:45:52.636537 29936 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k65dw/master-0-debug-nmnt6"] Dec 05 13:45:52.637449 master-0 kubenswrapper[29936]: E1205 13:45:52.637108 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d128665-a885-48e8-8e58-ea6b19cd3d37" containerName="registry-server" Dec 05 13:45:52.637449 master-0 kubenswrapper[29936]: I1205 13:45:52.637121 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d128665-a885-48e8-8e58-ea6b19cd3d37" containerName="registry-server" Dec 05 13:45:52.637449 master-0 kubenswrapper[29936]: E1205 13:45:52.637192 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d128665-a885-48e8-8e58-ea6b19cd3d37" containerName="extract-utilities" Dec 05 13:45:52.637449 master-0 kubenswrapper[29936]: I1205 13:45:52.637200 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d128665-a885-48e8-8e58-ea6b19cd3d37" containerName="extract-utilities" Dec 05 13:45:52.637449 master-0 kubenswrapper[29936]: E1205 13:45:52.637216 29936 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d128665-a885-48e8-8e58-ea6b19cd3d37" containerName="extract-content" Dec 05 13:45:52.637449 master-0 kubenswrapper[29936]: I1205 13:45:52.637223 29936 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d128665-a885-48e8-8e58-ea6b19cd3d37" containerName="extract-content" Dec 05 13:45:52.637449 master-0 
kubenswrapper[29936]: I1205 13:45:52.637447 29936 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d128665-a885-48e8-8e58-ea6b19cd3d37" containerName="registry-server" Dec 05 13:45:52.638534 master-0 kubenswrapper[29936]: I1205 13:45:52.638335 29936 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k65dw/master-0-debug-nmnt6" Dec 05 13:45:52.787988 master-0 kubenswrapper[29936]: I1205 13:45:52.787885 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28s99\" (UniqueName: \"kubernetes.io/projected/8c2cf36b-cf67-4e00-838a-a25c63fcfdfd-kube-api-access-28s99\") pod \"master-0-debug-nmnt6\" (UID: \"8c2cf36b-cf67-4e00-838a-a25c63fcfdfd\") " pod="openshift-must-gather-k65dw/master-0-debug-nmnt6" Dec 05 13:45:52.788402 master-0 kubenswrapper[29936]: I1205 13:45:52.787997 29936 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c2cf36b-cf67-4e00-838a-a25c63fcfdfd-host\") pod \"master-0-debug-nmnt6\" (UID: \"8c2cf36b-cf67-4e00-838a-a25c63fcfdfd\") " pod="openshift-must-gather-k65dw/master-0-debug-nmnt6" Dec 05 13:45:52.890896 master-0 kubenswrapper[29936]: I1205 13:45:52.890101 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28s99\" (UniqueName: \"kubernetes.io/projected/8c2cf36b-cf67-4e00-838a-a25c63fcfdfd-kube-api-access-28s99\") pod \"master-0-debug-nmnt6\" (UID: \"8c2cf36b-cf67-4e00-838a-a25c63fcfdfd\") " pod="openshift-must-gather-k65dw/master-0-debug-nmnt6" Dec 05 13:45:52.890896 master-0 kubenswrapper[29936]: I1205 13:45:52.890169 29936 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c2cf36b-cf67-4e00-838a-a25c63fcfdfd-host\") pod \"master-0-debug-nmnt6\" (UID: \"8c2cf36b-cf67-4e00-838a-a25c63fcfdfd\") " pod="openshift-must-gather-k65dw/master-0-debug-nmnt6" Dec 05 13:45:52.890896 master-0 kubenswrapper[29936]: I1205 13:45:52.890380 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c2cf36b-cf67-4e00-838a-a25c63fcfdfd-host\") pod \"master-0-debug-nmnt6\" (UID: \"8c2cf36b-cf67-4e00-838a-a25c63fcfdfd\") " pod="openshift-must-gather-k65dw/master-0-debug-nmnt6" Dec 05 13:45:53.042235 master-0 kubenswrapper[29936]: I1205 13:45:53.034750 29936 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28s99\" (UniqueName: \"kubernetes.io/projected/8c2cf36b-cf67-4e00-838a-a25c63fcfdfd-kube-api-access-28s99\") pod \"master-0-debug-nmnt6\" (UID: \"8c2cf36b-cf67-4e00-838a-a25c63fcfdfd\") " pod="openshift-must-gather-k65dw/master-0-debug-nmnt6"